Category Archives: Project Management

Reengineered Processes Need Business-Defined Data

“Business process reengineering is the act of recreating a core business process with the goal of improving product output, quality, or reducing costs.”* Recently I’ve perused articles on business process reengineering and have been surprised to find that they share a lack of emphasis on data definition.

By establishing a shared business vocabulary, identifying and describing business-critical entities and events, and applying the defined entities and events in process and system design, BPR teams can ensure that the redesigned process is efficient and works smoothly from end to end.
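
To make “business-defined data” concrete, here’s a minimal sketch in Python of what one shared vocabulary entry might look like; the entity name, attributes, and steward role are invented for illustration, not drawn from any particular BPR engagement:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class GlossaryEntry:
    """One business-defined entity or event in the shared vocabulary."""
    name: str                   # the agreed business term
    definition: str             # plain-language meaning, owned by the business
    steward: str                # business role accountable for the definition
    key_attributes: List[str] = field(default_factory=list)

# Hypothetical entry a BPR team might agree on before redesigning an order-to-cash process.
purchase_order = GlossaryEntry(
    name="Purchase Order",
    definition="A buyer's documented commitment to purchase goods or services at agreed terms.",
    steward="Procurement Operations",
    key_attributes=["order_id", "supplier", "order_date", "total_amount"],
)

print(purchase_order.name, "-", purchase_order.definition)
```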

In spite of data concerns making up two of the seven key BPR principles (“Merging data collection and processing units” and “Shared databases to interconnect dispersed departments”), articles on the topic tend to lump these concerns into general Information Technology topics, without acknowledging the need for business-driven data definition and management. For example, this post stresses the need for “more sources of data and enhanced connectedness to information”. This one recounts a famous Ford BPR example where a new database was central to the solution. Many, like this one, cite “shared databases” as a core principle. However, none details the business leadership and participation necessary to define a common data foundation across a reengineered business process. Continue reading

How to be a good client

I recently listened to Brian O’Neill’s excellent interview with Tom Davenport, headlined “Why on a scale of 1-10, the field of analytics has only gone from a one to about a two in ten years time.”

The conversation covered a lot of ground as Mr O’Neill and Mr Davenport explored the reasons why. Highlights included a general lack of technical literacy and the lack of an organizational data-driven culture. But to their credit, they took responsibility on behalf of analytics professionals, emphasizing how we in the field could change in order to make more analytics efforts successful. Rather than focusing on providing technology-centered solutions, they recommended that data and AI professionals seek first to understand and empathize with their clients or internal customers, so that they can develop more effective analytics capabilities in light of that understanding.

I agree that analytics professionals can improve their game. However, as a former consultant who’s switched over to the client side, I think there’s room for improvement all around. To me, clients who work proactively to prepare for an analytics project position themselves for better outcomes. Continue reading

Toward a Values-Based Approach to Auditing Agile Projects

By now Agile has overtaken waterfall as the dominant app dev project pattern*. In many large organizations, the traditional waterfall PMO also owns Agile projects. One aspect of PMO oversight that can work against Agile culture is the project audit. If the goal of an audit is to confirm that the project reflects Agile values, it can help ensure working software and a satisfied customer. If not, an Agile project audit can reinforce process, documentation, and other values that don’t directly promote project success.

In this post I’ll briefly review the Agile Manifesto, recount some prominent advice for auditors of Agile projects, and offer suggestions for auditors who want to reinforce rather than suppress Agile values. Continue reading

Leadership Must Prioritize Data Quality

Data quality improvements follow specific, clear leadership from the top. Project leaders count data quality among project goals when senior management encourages them to do so with unequivocal incentives, a common business vocabulary, shared understanding of data quality principles, and general agreement on the objects of interest to the business and their key characteristics.

Poor data quality costs businesses about “$15 million per year in losses,” according to Gartner. As Tendü Yoğurtçu puts it, “artificial intelligence (AI) and machine learning algorithms are only as effective as the data they use.” Data scientists understand the difficulties well, as they spend over 70% of their time in data prep.

Recent studies report that data entry typos are the largest source of poor data quality (here and here). My experience says otherwise. From what I’ve seen, operational data is generally good, and data errors only appear when data changes context. In this post I’ll detail why data quality is management’s responsibility, and why data quality will remain poor until leadership makes it a priority. Continue reading
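
To illustrate the claim, here’s a minimal, hypothetical sketch in Python of data that is perfectly valid in each operational system but becomes wrong the moment it changes context; the systems and status codes are invented:

```python
# Hypothetical source systems: each status code is perfectly clear in its own context.
crm_accounts = [
    {"id": 1, "status": "A"},  # in the CRM, "A" means Active
    {"id": 2, "status": "C"},  # in the CRM, "C" means Closed
]
billing_accounts = [
    {"id": 3, "status": "A"},  # in billing, "A" means Arrears
    {"id": 4, "status": "C"},  # in billing, "C" means Current
]

# Combined for reporting without a business-agreed mapping, the data "goes bad"
# only because it changed context: one of the two "active" accounts is in arrears.
combined = crm_accounts + billing_accounts
active_count = sum(1 for account in combined if account["status"] == "A")
print(active_count)  # 2 -- an overstatement caused by context, not by data entry
```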

Anonymize Data for Better Executive Analytics

Reading articles about data anonymization makes it clear that it is not an entirely effective security measure (here and here), but it is still part of a robust security capability, and it is required if your organization is affected by GDPR. (I use “anonymization” as a general term encompassing techniques that de-identify personal data within a given data set.)
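
As a rough illustration, here’s a minimal Python sketch of one common de-identification approach: dropping direct identifiers and replacing the one needed for joins with a salted hash token. The field names are hypothetical, and as the articles above suggest, a salted hash by itself falls short of true anonymization:

```python
import hashlib

SALT = "replace-with-a-managed-secret"  # hypothetical; real salts/keys need secure handling

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with an irreversible (but join-consistent) token."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:12]

def de_identify(record: dict) -> dict:
    """Drop personal fields outright; tokenize the one needed to join across data sets."""
    return {
        "customer_token": pseudonymize(record["email"]),  # usable for joins, not readable
        "region": record["region"],
        "order_total": record["order_total"],
        # name, email, and street address are simply not carried forward
    }

source = {"email": "pat@example.com", "name": "Pat Example",
          "street": "1 Main St", "region": "Northeast", "order_total": 125.40}
print(de_identify(source))
```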

But there’s a positive side of anonymized data that hasn’t received much press. Providing anonymous data to senior managers who don’t need access to personal data can encourage them to take a broader perspective, and thereby bring new energy to fact-based senior planning and analysis. Continue reading

Toward an Analytics Code of Ethics

In data management and analytics, we often focus on correcting an apparent inability and unwillingness on the part of business leaders to effectively gather and capitalize on data resources. With that perspective, we tend to see ethics as a side issue that is difficult to prioritize given the scale and persistence of our other challenges.

At least that was my perspective, and my initial response when confronted recently by a family member on this topic. Her view from outside the field was that ethics should be a primary concern. As I’ve reflected on this conversation, I’ve come around to her point.

In recent years we’ve seen many examples of data misuse due to ethical lapses. Here’s a post that gives five examples, including police officers looking up data on individuals unrelated to any police business, an employee passing personal data (including SSNs) to a text-sharing site, and Uber’s “god view”, available at the corporate level, which an employee used in 2014 to track a journalist’s location. Continue reading

Meaningful Requirements Start Successful Data Projects

To me, development projects fail or succeed in the first few weeks. Once a project starts off in the wrong direction, momentum and expectations tend to prevent a return to the proper path. With today’s wealth of database options, each addressing exciting new possibilities, the right choice of data foundation for the application plays a large part in steering a project to success.

At this year’s Enterprise Data World conference, William Brooks showed the relationships among different data modeling approaches, in effect detailing how to derive nine different model types from a detailed conceptual entity-relationship model. Mr Brooks’ presentation hinted at a way to frame your data direction correctly early in a project, setting the stage for success.

According to his presentation, called “Symmetry in Modeling Approaches”, the different model types — relational, graph, dimensional, JSON, XML, and so on — all represent different perspectives on the same data relationships. Each suits a different application: dimensional for reporting applications, data vault for data warehouses, graph databases for multi-layered search, and so on. However, if properly constructed, they all map back in predictable and specific ways to a normalized entity-relationship model.
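
This isn’t Mr Brooks’ notation, but as a rough sketch of the idea, here’s a small Python example (with invented entities) showing how the same normalized facts can be projected into a document shape for a JSON store or flattened into a dimensional-style fact row, each traceable back to the underlying entity-relationship structure:

```python
# Normalized, entity-relationship-style representation: each fact lives in one place.
customers = {101: {"name": "Acme Corp", "segment": "Manufacturing"}}
orders = [{"order_id": 5001, "customer_id": 101, "amount": 2500.00}]

# Document (JSON-style) projection: nest the related entity for a document store.
order_documents = [{**o, "customer": customers[o["customer_id"]]} for o in orders]

# Dimensional-style projection: flatten into a fact row with denormalized attributes.
fact_rows = [
    {
        "order_id": o["order_id"],
        "customer_name": customers[o["customer_id"]]["name"],
        "customer_segment": customers[o["customer_id"]]["segment"],
        "amount": o["amount"],
    }
    for o in orders
]

print(order_documents[0])  # nested document, derived from the normalized entities
print(fact_rows[0])        # flat fact row, derived from the same entities
```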

I and others have written that ER modeling should be integral to requirements definition, but Mr Brooks’ presentation implies that it can serve as the basis for application architecture as well. Continue reading

Values and Behaviors of the Successful Agilist

Of course, any discussion of Agile values starts with the Agile Manifesto. The first sentence declares that Agile development is about seeking better ways and helping others. Then, as if espousing self-evident truths, the founders present four relative value statements. Finally, they emphasize appropriate balance, saying that the relatively less valued items aren’t worthless, implying that they are to be maintained inasmuch as they support the relatively more valued items.

While there is value in the four relative value statements, I believe most successful Agilists value the first and last statements more. So to me, the core Agile values are continuous improvement, helping others, and balance.

There’s a lot written about Agile behaviors, but most of what I’ve read is geared toward scrum masters or managers, and most of it is about transitioning from the waterfall world. Starting from the premise that Agile methods are established, focusing on participants rather than managers, and assuming that behaviors are grounded in values, this post details the values and behaviors I’ve observed in those who succeed as Agile team members. Continue reading

Tableau Rollout Across Five Dimensions

Standing up any new analytics tool in an organization is complex, and early on, new adopters of Tableau often struggle to account for all of that complexity in their plans. This post proposes a mental model, in the form of a story about how Tableau might roll out in one hypothetical installation, to uncover common challenges for new adopters.

Tableau’s marketing leads one to imagine that introducing Tableau is easy: “Fast Analytics”, “Ease of Use”, “Big Data, Any Data” and so on (here, 3/31/2017). Tableau’s position in Gartner’s Magic Quadrant (referenced on the same page) attests to the huge upside for organizations that successfully deploy Tableau, an upside I’ve been lucky enough to witness firsthand. Continue reading

Analytics Requirements: Avoid a Y2.xK Crisis

Even though the year rolls over annually, teams building new visualizations often forget to think about the effects of that turnover.

In today’s fast-paced, Agile world, requirements for even the most critical dashboards and visualizations tend to evolve, and development often proceeds iteratively from a scratchpad sketch through successively more detailed versions to the release of a “1.0” production version. Organized analytics teams evolve dashboards within a process framework that includes checkpoints ensuring standards are met for security, reliability, usability, and so on.

A reporting team can build a revolutionary analytics capability enabling unprecedented visibility into operations, and then, if year turnover isn’t included in requirements, experience embarrassing errors and usability challenges in the January after initial deployment. In effect, the system experiences its own Y2.xK crisis, not too different from the expected Y2K crisis 16 years ago. Continue reading
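
As a tiny, hypothetical illustration of the failure mode: a year filter hard-coded during development reports correctly until January 1, while a reporting year derived from the data (or a date dimension) survives the turnover:

```python
from datetime import date

orders = [
    {"order_id": 1, "order_date": date(2015, 12, 30), "amount": 100.0},
    {"order_id": 2, "order_date": date(2016, 1, 2), "amount": 250.0},
]

# Fragile: the reporting year was hard-coded while the dashboard was built in 2015.
HARDCODED_YEAR = 2015
ytd_fragile = sum(o["amount"] for o in orders
                  if o["order_date"].year == HARDCODED_YEAR)

# More robust: derive the reporting year from the data (or the clock, or a date dimension).
reporting_year = max(o["order_date"] for o in orders).year
ytd_robust = sum(o["amount"] for o in orders
                 if o["order_date"].year == reporting_year)

print(ytd_fragile)  # 100.0 -- still reporting last year after the turnover
print(ytd_robust)   # 250.0 -- follows the data into the new year
```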