Tag Archives: Strategy

Secrets of Successful Projects

I’ve had the good fortune to be involved in many successful application development and analytics efforts (here and here), and a few that were less so (here and here). Recently, I’ve been thinking about what separates the successful from the unsuccessful.

As I see it, there are five general characteristics that the successful endeavors share. They: 

  • Closely Align with the Business
  • Share an End State Vision
  • Are Reality Grounded
  • Have a Technically Competent Team
  • Feature Frequent and Open Communications

To level set, this article focuses on business-facing analytics, data acquisition, application development, and related efforts that build internal capabilities for larger organizations. Continue reading

Guidelines for Successful Tableau Analytics Rollout

I’ve written previously about developing Tableau analytics capability from a single user to multiple teams across an organization. This article is intended for those who have just installed Tableau Server to enable folks outside their own sphere to interact with their Tableau creations. For the way ahead, it presents a few guidelines for successful development and deployment that data analysts should internalize as their analytics product grows.

The theme is, from the very start, to develop dashboards as if they serve hundreds of users and access millions of data records. If you do that, then as your analytical tools grow in usefulness and popularity, you’ll avoid difficult conversions and retooling later. Continue reading

The Myth of Agile Sign-Off

Although Agile writers and thinkers agree that “there is no sign-off” in Agile methodology, the practice of requiring product owners and business customers to sign off on requirements and delivered work products persists in Agile settings. I’ve seen it most when an Agile team faces delivery challenges and leaders perceive the problem as scope creep or a failure of the UAT process before delivery. In those situations, adding a formal sign-off provides the illusion of a stronger process but does nothing to resolve the underlying issue.

Sure, sometimes sign-off is necessary, especially when two or more separate organizations work together on a project. For example, consulting contracts often require sign-off on interim and final work products. Within a single organization, however, a sign-off step is often added in hopes of remedying delivery or quality challenges.

A commenter named David on this post says that “the purpose of a sign-off (or whatever you wanna call it) is a confirmation from a product owner that artifact A is fine for the time being, and can be used as basis for work on artifact B.” That’s all well and good, but in a well-run Agile context sign-off is an unnecessary formality that can simply be dispensed with.

How could that be? Others have written, often emphatically, on why sign-off is unnecessary in an Agile context, including here and here. This quick video explains how “definition of done” and a fully committed, reliable team work together to make sign-off irrelevant. Continue reading

Reengineered Processes Need Business-Defined Data

“Business process reengineering is the act of recreating a core business process with the goal of improving product output, quality, or reducing costs.”* Recently I’ve perused articles on business process reengineering and have been surprised to find that they consistently underemphasize data definition.

By establishing a shared business vocabulary, identifying and describing business-critical entities and events, and applying those definitions in process and system design, BPR teams can ensure that a redesigned process runs smoothly and efficiently from end to end.

In spite of data concerns making up two of the seven key BPR principles (“Merging data collection and processing units” and “Shared databases to interconnect dispersed departments”), articles on the topic tend to lump these concerns into general Information Technology topics, without acknowledging the need for business-driven data definition and management. For example, this post stresses the need for “more sources of data and enhanced connectedness to information”. This one recounts a famous Ford BPR example in which a new database was central to the solution. Many, like this one, cite “shared databases” as a core principle. However, none details the business leadership and participation necessary to define a common data foundation across a reengineered business process. Continue reading

Two Design Principles for Tableau Data Sources

It’s not unusual for talented teams of business analysts to find themselves maintaining significant inventories of Tableau dashboards. In addition to sound development practices, following two key principles of data source design helps these teams spend less time on maintenance and more time building new visualizations: publishing Tableau data sources separately from workbooks, and waiting until the last opportunity to join dimension and fact data.
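To make the first principle concrete, here’s a minimal sketch using the Tableau Server Client Python library (tableauserverclient); the server URL, site, token, project name, and file names are all hypothetical. It publishes an extract once as a standalone data source, so workbooks connect to it by reference instead of embedding private copies.

```python
import tableauserverclient as TSC

# Hypothetical server, site, and personal access token
auth = TSC.PersonalAccessTokenAuth("publisher-token", "TOKEN_VALUE", site_id="marketing")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Locate the shared project that holds team-wide data sources (name is hypothetical)
    projects, _ = server.projects.get()
    shared = next(p for p in projects if p.name == "Shared Data Sources")

    # Publish the extract on its own; workbooks then connect to this one published copy
    item = TSC.DatasourceItem(shared.id, name="Customer Orders")
    item = server.datasources.publish(
        item, "customer_orders.hyper", mode=TSC.Server.PublishMode.Overwrite
    )
    print(f"Published data source {item.name} (id={item.id})")
```

Every workbook that references the published data source inherits the same field names, calculations, and refresh schedule, which is what keeps a growing inventory of workbooks consistent.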

Imagine a business team (let’s call it Marketing Analytics) with read-only access to a Hadoop store or an enterprise data warehouse. The team gains approval for Tableau licenses and Tableau Server publication rights for five tech-savvy data analysts. After a few early successes with impactful visualizations, it gathers steam, and before long it finds itself supporting scores of published workbooks serving a few hundred managers and executives. In spite of generally sound practices, Marketing Analytics struggles to maintain consistency from one Tableau workbook to another.

Continue reading

How to be a good client

I recently listened to Brian O’Neill’s excellent interview with Tom Davenport, headlined “Why on a scale of 1-10, the field of analytics has only gone from a one to about a two in ten years time.”

The conversation covered a lot of ground as Mr O’Neill and Mr Davenport explored the reasons why. Highlights included a general lack of technical literacy and the absence of an organizational data-driven culture. To their credit, though, they took responsibility on behalf of analytics professionals, emphasizing how we in the field could change to make more analytics efforts successful. Rather than focusing on technology-centered solutions, they recommended that data and AI professionals first seek to understand and empathize with their clients or internal customers, and then develop analytics capabilities in light of that understanding.

I agree that analytics professionals can improve their game. However, as a former consultant who’s switched over to the client side, I think there’s room for improvement all around. To me, clients who work proactively to prepare for an analytics project position themselves for better outcomes. Continue reading

Prioritize data initiatives with the new Data Management Maturity Index

In my experience, data management is both a mission-critical and an undervalued capability. Recent customer data losses and regulatory initiatives like GDPR may have raised the stock of data maturity efforts, but data management remains undervalued. For example, any Fortune 1000 firm building end-to-end processes finds that much of the cost goes to translating data among the different systems the process integrates.

Today we have stage models available, like CMMI’s Data Management Maturity (DMM) model, which, as I’ve written, help organizations assess their data management maturity. However, the DMM model aims to assess data maturity at a single agency. It lacks mechanisms for comparing multiple agencies or business functions, and so can be difficult to translate into prioritized plans for improvement.

Recently I co-authored, with Manoj Thomas, Joseph Cipolla, and Lemuria Carter, a study introducing techniques for assessing relative data management maturity of different organizations, and different data management capabilities, within a larger enterprise. Continue reading

Data Governance Meets Procurement

Why pay good money for bad data?

Of course no one would do that on purpose, but as a consultant over many years I’ve often seen it happen. A vendor fulfills a contract to the letter, which unfortunately allows them to deliver required reports in varying, sometimes changing, formats with suspect data quality. The customer company absorbs these costs, leaning on the data analyst to update PowerPoint decks on schedule before the next monthly management meeting in spite of the extra programming work.

These contracts have been for various goods and services, but almost every business contract today is also a contract for data. If a regional gas company hires a vendor to inspect residential lines, then I suspect it wants reports showing inspections conducted and results; a healthcare firm that sends nurses on house calls needs data detailing call schedules and results; and so on.

Companies that supply goods or provide services often don’t count data management among their core competencies, and the quality of their reporting often doesn’t match the quality of their goods or services. Someone in the customer organization has to code around every addition or omission of an expected Excel column, every “N/A” in a numeric field, and every unexpected change from imperial to metric units, as the sketch below illustrates.
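Here’s a minimal defensive-loading sketch in Python with pandas, built around the gas-line inspection example above; the file layout, column names, and unit column are all hypothetical.

```python
import pandas as pd

# Hypothetical layout for a monthly gas-line inspection report
EXPECTED_COLUMNS = ["site_id", "inspection_date", "length_m", "result"]
FEET_TO_METERS = 0.3048

def load_vendor_report(path: str) -> pd.DataFrame:
    """Defensively load one vendor spreadsheet; all names here are illustrative."""
    df = pd.read_excel(path)

    if "length_m" in df.columns:
        # Strings like "N/A" appear in numeric fields; coerce them to NaN
        df["length_m"] = pd.to_numeric(df["length_m"], errors="coerce")

        # Some deliveries arrive in feet; normalize when a unit column is present
        if "unit" in df.columns:
            feet = df["unit"].astype(str).str.lower().eq("ft")
            df.loc[feet, "length_m"] *= FEET_TO_METERS

    # Columns come and go between deliveries; add missing ones empty, drop extras
    for col in EXPECTED_COLUMNS:
        if col not in df.columns:
            df[col] = pd.NA
    return df[EXPECTED_COLUMNS]
```

A loader like this puts each month’s quirks in one place instead of scattering workarounds across the scripts that feed the PowerPoint decks. Continue reading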

Leadership Must Prioritize Data Quality

Data quality improvements follow specific, clear leadership from the top. Project leaders count data quality among project goals when senior management encourages them to do so with unequivocal incentives, a common business vocabulary, shared understanding of data quality principles, and general agreement on the objects of interest to the business and their key characteristics.

Poor data quality costs businesses about “$15 million per year in losses,” according to Gartner. As Tendü Yoğurtçu puts it, “artificial intelligence (AI) and machine learning algorithms are only as effective as the data they use.” Data scientists understand the difficulties well, as they spend over 70% of their time in data prep.

Recent studies report that data entry typos are the largest source of poor data quality (here and here). My experience says otherwise. From what I’ve seen, operational data is generally good, and data errors tend to appear only when data changes context. In this post I’ll detail why data quality is management’s responsibility, and why it will remain poor until leadership makes it a priority. Continue reading

Leader’s Data Manifesto at #EDW19: Building a Foundation for Data Science

It’s a truism that data is a resource, and to prove it you just have to follow the money. As the illustration shows, the vast majority of corporate market value derives from intangible assets. Just as money is an abstraction that represents wealth, data is an abstraction that represents these intangible assets.

It’s year three after initial rollout of the Leader’s Data Manifesto (LDM). Since then, many widely publicized events have highlighted the value of data and metadata, and the importance of sound data management (here, here, and here). Recently at Enterprise Data World, John Ladley, Danette McGilvray, James Price, and Tom Redman presented this year’s LDM update. They reintroduced the Manifesto, recounted events of the past year, discussed strategy for the coming year, and issued a call to action for data professionals. Continue reading