Tag Archives: Project Management

Toward a Values-Based Approach to Auditing Agile Projects

By now Agile has overtaken waterfall as the dominant app dev project pattern*. In many large organizations, the traditional waterfall PMO also owns Agile projects. One aspect of PMO oversight that can work against Agile culture is the project audit. If the goal of an audit is to ensure the project reflects Agile values, it can help the team deliver working software and a satisfied customer. If not, an Agile project audit can reinforce process, documentation, and other values that don’t directly promote project success.

In this post I’ll briefly review the Agile Manifesto, recount some prominent advice for auditors of Agile projects, and offer suggestions for auditors who want to reinforce rather than suppress Agile values. Continue reading

Leadership Must Prioritize Data Quality

Data quality improvements follow specific, clear leadership from the top. Project leaders count data quality among project goals when senior management encourages them to do so with unequivocal incentives, a common business vocabulary, shared understanding of data quality principles, and general agreement on the objects of interest to the business and their key characteristics.

Poor data quality costs businesses about “$15 million per year in losses,” according to Gartner. As Tendü Yoğurtçu puts it, “artificial intelligence (AI) and machine learning algorithms are only as effective as the data they use.” Data scientists understand the difficulties well, as they spend over 70% of their time in data prep.

Recent studies report that data entry typos are the largest source of poor data quality (here and here). My experience says otherwise. From what I’ve seen, operational data is generally good, and data errors only appear when data changes context. In this post I’ll detail why data quality is management’s responsibility, and why data quality will remain poor until leadership makes it a priority. Continue reading
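
To illustrate what I mean by a change of context, here is a minimal sketch (the systems, status codes, and mapping are invented for this example, not drawn from any study): two order systems whose codes are each perfectly good in place, but that collide when merged for enterprise reporting.

```python
# Hypothetical example: each source system is internally consistent.
# System A uses "S" for shipped and "O" for open; System B uses "S" for suspended.
orders_a = [{"id": 1, "status": "S"}, {"id": 2, "status": "O"}]
orders_b = [{"id": 7, "status": "S"}]  # "S" means suspended here

# A naive merge assumes one vocabulary and silently misclassifies System B's order.
naive_shipped = [o for o in orders_a + orders_b if o["status"] == "S"]  # 2 rows, 1 wrong

# A context-aware merge maps each source's codes to a shared business vocabulary first.
status_map = {"A": {"S": "shipped", "O": "open"}, "B": {"S": "suspended"}}
merged = [
    {**o, "status": status_map[src][o["status"]]}
    for src, rows in (("A", orders_a), ("B", orders_b))
    for o in rows
]
shipped = [o for o in merged if o["status"] == "shipped"]  # only System A's order 1
```

Neither system has a data entry problem; the error is created when the data changes context, which is exactly the kind of issue that only leadership-level agreement on vocabulary and definitions can prevent.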

Meaningful Requirements Start Successful Data Projects

To me, development projects fail or succeed in the first few weeks. Once a project starts off in the wrong direction, momentum and expectations tend to prevent a return to the proper path. With today’s wealth of database options each addressing exciting new possibilities, the right choice for the application’s data foundation plays a large part in steering a project to success.

At this year’s Enterprise Data World conference, William Brooks showed the relationships among different data modeling approaches, in effect detailing how to derive nine different model types from a detailed conceptual entity-relationship model. Mr. Brooks’ presentation hinted at a way to correctly frame your data direction early in a project, setting the stage for success.

According to his presentation, called “Symmetry in Modeling Approaches”, the different model types — relational, graph, dimensional, JSON, XML, and so on — all represent different perspectives on the same data relationships. Each suits a different application: dimensional for reporting applications, data vault for data warehouses, graph databases for multi-layered search, and so on. However, if properly constructed they all map back in predictable and specific ways to a normalized entity-relationship model.
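
As a rough sketch of that symmetry (the entities and data here are invented for illustration, not taken from Mr. Brooks’ presentation), the snippet below derives a document view and a graph view from one normalized customer-and-order structure; either view can be regenerated from the ER form, which is the sense in which the derived models map back to it.

```python
# Hypothetical normalized structure: Customer places Order (one-to-many).
customers = {"C1": {"name": "Acme Corp"}}
orders = {"O1": {"customer_id": "C1", "total": 125.00},
          "O2": {"customer_id": "C1", "total": 80.00}}

# Document (JSON-style) perspective: nest each customer's orders inside the customer.
document_view = [
    {"customer": c["name"],
     "orders": [{"order_id": oid, "total": o["total"]}
                for oid, o in orders.items() if o["customer_id"] == cid]}
    for cid, c in customers.items()
]

# Graph perspective: the same foreign-key relationship expressed as explicit edges.
graph_edges = [(o["customer_id"], "PLACED", oid) for oid, o in orders.items()]
```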

I and others have written that ER modeling should be integral to requirements definition, but Mr. Brooks’ presentation implies that ER modeling can also serve as the basis for application architecture. Continue reading

Values and Behaviors of the Successful Agilist

Of course, any discussion of Agile values starts with the Agile Manifesto. The first sentence declares that Agile development is about seeking better ways and helping others. Then, as if espousing self-evident truths, the founders present four relative value statements. Finally, they emphasize appropriate balance, saying that the relatively less valued items aren’t worthless, implying that they are to be maintained inasmuch as they support the relatively more valued items.

While there is value in the four relative value statements, I believe most successful Agilists draw more from the opening sentence and the closing emphasis on balance. So to me, the core Agile values are continuous improvement, helping others, and balance.

There’s a lot written about Agile behaviors, but as I read it, most is geared toward scrum masters or managers, and most addresses the transition from the waterfall world. Starting from the premise that Agile methods are established, focusing on participants rather than managers, and assuming that behaviors are grounded in values, this post details the values and behaviors I’ve observed in those who succeed as Agile team members.

Continue reading

Tableau Rollout Across Five Dimensions

Standing up any new analytics tool in an organization is complex, and early on, new adopters of Tableau often struggle to include all the complexities in their plan. This post proposes a mental model, in the form of a story of how Tableau might roll out in one hypothetical installation, to uncover common challenges for new adopters.

Tableau’s marketing leads one to imagine that introducing Tableau is easy: “Fast Analytics”, “Ease of Use”, “Big Data, Any Data” and so on (here, 3/31/2017). Tableau’s position in Gartner’s Magic Quadrant (referenced on the same page) attests to the huge upside for organizations that successfully deploy Tableau, which I’ve been lucky enough to witness firsthand. Continue reading

Analytics Requirements: Avoid a Y2.xK Crisis

Even though it happens every year, teams building new visualizations often forget to think about the effects of turning over from one year to the next.

In today’s fast-paced, Agile world, requirements for even the most critical dashboards and visualizations tend to evolve, and development often proceeds iteratively from a scratchpad sketch through successively more detailed versions to release of a “1.0” production version. Organized analytics teams evolve dashboards within a process framework that includes checkpoints ensuring standards are met for security, reliability, usability, and so on.

A reporting team can build a revolutionary analytics capability enabling unprecedented visibility into operations, and then, if year turnover isn’t included in requirements, experience embarrassing errors and usability challenges in the January after initial deployment. In effect, the system experiences its own Y2.xK crisis, not too different from the expected Y2K crisis 16 years ago. Continue reading
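
Here is a minimal sketch of the requirement that heads off the surprise (the data and function are invented for illustration): derive the reporting year when the dashboard refreshes, and state the January behavior explicitly, rather than baking the deployment year into a filter.

```python
# Hypothetical example of a year-turnover requirement.
from datetime import date

rows = [{"region": "West", "date": date(2015, 11, 3), "sales": 1200},
        {"region": "West", "date": date(2016, 1, 4), "sales": 900}]

# Fragile: written during development in 2015, this filter silently returns
# nothing in January 2016, and the dashboard appears "broken" at year turnover.
fragile_current_year = [r for r in rows if r["date"].year == 2015]

# Robust: the year is computed at refresh time, and what the dashboard should show
# in early January (an empty year-to-date, prior-year comparisons, or a rolling
# window) is an explicit requirement instead of a production surprise.
def year_to_date(rows, today=None):
    today = today or date.today()
    return [r for r in rows if r["date"].year == today.year and r["date"] <= today]
```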

Protect Your Culture: Screening for authoritarian project leaders

It’s fashionable today to talk about the risks of authoritarianism in the political sphere. I’m not going to speculate on that, but such talk got me thinking about the same tendencies among IT project leaders. What is an authoritarian personality? (Yes, that’s actually a thing.) Is it truly antithetical to a healthy project? If so, how can you screen for it in hiring?

Recently, Ars Technica ran an article that offers a survey of research on authoritarian personalities conducted since the 1940s. The bottom line for us is that those with authoritarian tendencies more often Continue reading

More on the Agile Architect: Process and Knowledge Transfer

I’ve written about groupthink-related quality challenges on Agile projects, and the architect’s role in preventing groupthink from degrading quality. I’ve seen other risks related to the cohesion and potential insularity of successful Agile teams, and the architect is also well positioned to help prevent these: a tendency to neglect setting up and documenting repeatable processes, and a similar tendency not to share knowledge and lessons learned outside the Agile team. Continue reading

No More Enterprise Data Sinks – An Agile Data Warehousing Manifesto

Over the past year I’ve reviewed what seems like countless plans for enterprise data warehouses. The plans address real problems in the organizations involved: the organization needs better data to recognize trends and react faster to opportunities and challenges; business measures and analyses are unavailable because data in source systems is inconsistent, incomplete, erroneous, or contains current values but no history; and so on.

The plans detail source system data and its integration into a central data hub. But the ones I’m referring to don’t say how the data will be delivered, or portray a specific vision of how the data is to drive business value. Instead, their business case rests on what I’ll call the “railroad hypothesis”: just as no one could have predicted how the railroads would enable development of the West, improved data infrastructure will supposedly create order-of-magnitude improvements in the ability to access, share, and utilize data, from which order-of-magnitude business benefits will follow.* All too often these plans just build bridges to nowhere. Continue reading

Assumptions: A Key to Technical Leadership

There’s an unfortunate and rather rude saying about assumptions that I’ve found popular among IT folks I’ve worked with. I say unfortunate because, to me, assumptions that are recognized early and handled the right way are a key to successful projects. Technical players who use assumptions well can help set projects on the right path long before they go astray.

Sometimes on waterfall and hybrid projects, technical players are asked to estimate work early, before requirements are complete. My instinct is to refuse to give an ungrounded estimate, but refusing isn’t helpful. The way to handle this uncomfortable uncertainty is to fill in the unknowns with assumptions: detailed, realistic statements that provide grounding for your estimate. Continue reading