Is perfect Data Quality an attainable goal for an organization? Today I saw a blog post from Henrik Liliendahl Sørensen on “Unmaintainability”. My first reading of this title was “unattainability” which got me thinking about how Data Quality can be seen as “Unattainable.”
When I was first hired by a particular multi-national financial services organization in the late 1980s, my title was “Global System Deployment Specialist.” This did not, however, refer to weapons systems; it meant I was a specialist in the implementation phases of global application systems development and operation. I was a Closer! Interestingly (well, to me), few people are particularly good at this. Organizations tend to focus on development but hesitate, especially with very large systems made up of hundreds or thousands of programs, to finally “go live.” Part of the problem is that those with little experience of large systems may believe, and promote, the idea that a system should be free of issues prior to implementation. Ha!

The key to breaking through that “unattainable” goal is to classify and prioritize issues using strict definitions of priorities. Should a misspelled word on an internal screen stop implementation? Should an enhancement request stop implementation? The costs of delayed implementation can be astronomical and need to be managed firmly. Admittedly, there are many examples of disastrous system implementations that were even more costly.
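To make the idea concrete, here is a minimal sketch of that kind of go/no-go rule. The issue categories and the choice of which ones block a release are purely illustrative, not from any real deployment checklist:

```python
# Hypothetical sketch: only strictly defined priority classes may
# delay a go-live; cosmetic issues and enhancement requests do not.
BLOCKING_CLASSES = {"data_loss", "security", "regulatory"}

def blocks_go_live(issue_class: str) -> bool:
    """Return True only if the issue falls in a go-live-blocking class."""
    return issue_class in BLOCKING_CLASSES

# A misspelled label and an enhancement request should not stop the release.
open_issues = ["misspelled_label", "enhancement_request", "data_loss"]
blockers = [issue for issue in open_issues if blocks_go_live(issue)]
# blockers contains only "data_loss"
```

The point of the sketch is the discipline, not the code: the blocking classes are defined up front, so the go-live debate is about classifying each issue, not relitigating the rule.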
Of similar “unattainability” is the perfectly secure system. Financial services has always been on the bleeding edge of security technology, because financial institutions tend to be targets for security attacks. So, I developed a view of designing and implementing secure systems as the equivalent of trying to achieve nirvana – a goal toward which one constantly strives without ever reaching it. We make our systems secure based on organizational and regulatory standards and best practices, balancing cost and risk as appropriate. Hey, I once implemented a data warehouse in Switzerland to which no one was allowed to have access. Was this perfect security? No, there was no business value in a system no one could access.
The issue of perfect Data Quality has a similar unattainability. We need to classify the types of issues that may be found with data and the importance of particular types of data to the organization. We need to understand the regulatory and organizational rules associated with different types of data. We need to assess the quality of the data and determine realistic and cost-effective goals for improvement. Much data may not even be important enough to an organization to warrant the cost of assessment. Then, we need to balance the cost of fixing data with the risk of not fixing it.
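That cost-versus-risk balance can be sketched as a simple expected-loss comparison. All names and figures below are illustrative assumptions, not a real assessment model:

```python
# Hypothetical sketch: fix a data defect only when the expected cost
# of leaving it in place exceeds the cost of remediation.
def worth_fixing(fix_cost: float,
                 defect_probability: float,
                 impact_cost: float) -> bool:
    """Compare remediation cost to expected loss (probability * impact)."""
    expected_loss = defect_probability * impact_cost
    return expected_loss > fix_cost

# Customer tax IDs: regulatory exposure makes the expected loss high.
worth_fixing(fix_cost=50_000, defect_probability=0.2, impact_cost=1_000_000)
# Free-text internal notes: low impact, not worth the remediation spend.
worth_fixing(fix_cost=10_000, defect_probability=0.5, impact_cost=5_000)
```

Real assessments are messier than one multiplication, of course, but even this crude framing forces the right question: what does this data defect actually cost the organization?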
Achieving perfect Data Quality may be “unattainable”. But the real goal is to understand and manage the risks and costs associated with improving organizational Data Quality.
Perfect data quality is unattainable, but this is a perfect blog post about data quality, April 🙂
Most organizations unfortunately do not, as you rightly suggest, balance the cost of fixing data with the risk of not, but instead fix data for the sake of fixing data.
The endless and unattainable pursuit of Data Nirvana will just leave the organization with a sense of Business Emptiness, which is not emptiness in the positive, though often misunderstood, Buddhist sense.
Emptiness is about what must be emptied: what thoughts and concepts we must let go of, such as the self-defeating strategy of attempting to achieve perfect data quality.