Dan's Take

Naveego DQS and the Importance of Data Quality

The importance of the "1-10-100 Rule" in data management.

Data management company Naveego began a recent discussion by defining the "1-10-100 Rule" and its impact on enterprises today. The rule, when applied to bad enterprise data, says that it costs $1 to verify a record as it's entered, $10 to cleanse it after the fact, and $100 per record if nothing is done, because the ramifications of the mistakes are felt over and over again for as long as the bad data goes unaddressed.
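
To put concrete numbers on the rule, here's a back-of-the-envelope sketch of what the multipliers imply for a hypothetical batch of records. The batch size and error rate are my illustrative assumptions, not Naveego's figures:

```python
# Back-of-the-envelope illustration of the "1-10-100 Rule".
# The per-record dollar figures are the rule's multipliers; the
# batch size and error rate below are hypothetical assumptions.

records = 1_000_000
error_rate = 0.02                      # assume 2% of records are bad
bad_records = int(records * error_rate)

cost_if_verified = bad_records * 1     # caught as each record is entered
cost_if_cleansed = bad_records * 10    # cleansed after the fact
cost_if_ignored = bad_records * 100    # left to propagate unaddressed

print(f"Verified at entry: ${cost_if_verified:,}")   # $20,000
print(f"Cleansed later:    ${cost_if_cleansed:,}")   # $200,000
print(f"Never addressed:   ${cost_if_ignored:,}")    # $2,000,000
```

Whatever the true multipliers are, the shape of the argument is the same: each stage a bad record survives multiplies the cost of dealing with it.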

Naveego then pointed to a 2016 IBM estimate that bad data costs the industry $3.1 trillion a year. Neither the origin of the "1-10-100 Rule" nor the method behind IBM's staggering estimate was provided, but the numbers did get my attention.

Naveego's DQS, according to the company, is designed to help enterprises implement a data quality process with minimal impact on knowledge workers. The product aims to identify and track quality issues; build trust in the data through a process of transparency and verification; and quickly connect to, then integrate, on- and off-premises systems.

DQS then helps align and synchronize data across applications and supports the creation of "golden records" that are known to be good and trustworthy. The company believes that validating data across data stores, regardless of which application silo is responsible for the data, makes it possible for enterprises to find and eliminate errors before they succumb to the GIGO scenario: garbage in, garbage out.
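
Naveego didn't detail the mechanics, but the general idea behind cross-silo validation and golden records can be sketched in a few lines. The field names, record shapes and simple majority-vote rule below are my illustrative assumptions, not DQS's actual behavior:

```python
# Hypothetical sketch of cross-silo reconciliation toward a "golden record".
# This does not reflect Naveego's implementation; the data and the
# majority-vote rule are illustrative assumptions only.

from collections import Counter

# The same customer as seen by three hypothetical application silos.
crm     = {"id": 42, "email": "pat@example.com", "phone": "555-0100"}
billing = {"id": 42, "email": "pat@example.com", "phone": "555-0199"}
support = {"id": 42, "email": "pat@exmaple.com", "phone": "555-0100"}

def golden_record(*versions: dict) -> tuple[dict, list[str]]:
    """Pick the majority value for each field and report the fields
    where the silos disagree (candidates for human verification)."""
    golden, conflicts = {}, []
    for field in versions[0]:
        values = Counter(v[field] for v in versions)
        value, _votes = values.most_common(1)[0]
        golden[field] = value
        if len(values) > 1:
            conflicts.append(field)
    return golden, conflicts

record, conflicts = golden_record(crm, billing, support)
print(record)     # majority values win
print(conflicts)  # ['email', 'phone'] -- flagged for review
```

A real product would use fuzzier matching and per-source trust scoring rather than a bare vote; the point is simply that disagreement across silos is detectable, and fixable, before it propagates.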

DQS supports several types of reporting and helps the enterprise assign ownership of specific data items and create the business processes and priorities that support data quality.

Dan's Take: Sketchy Sources, Sound Thinking

Since I hadn't come across the "1-10-100 Rule" before, I tried to find its origin and discovered that it has been used as an estimate in both construction management and project management for decades, and has only recently been applied to enterprise data. I'm somewhat suspicious of "rules" that are taken out of context and blindly reused in other areas.

But regardless of whether the real-world ratio is 1 to 10 to 100 or 1 to 5 to 37.5, the rule does seem reasonable. Although I don't have research to support Naveego's use of it in this context, the idea that finding errors early, before other plans are built on erroneous understandings, saves money does seem sound.

Naveego, by the way, isn't the only supplier waving the Data Quality banner and suggesting use of its own services to address the issue. IBM, Informatica, Talend, Syncsort and SAS, to name a few, make the same case and offer products and services designed to improve enterprise data.

Do you believe Naveego's key takeaway that an early focus on quality offers significant savings over the life of a project? If so, learning more about Naveego's DQS could be worthwhile.

About the Author

Daniel Kusnetzky, a reformed software engineer and product manager, founded Kusnetzky Group LLC in 2006. He's literally written the book on virtualization and often comments on cloud computing, mobility and systems software. He has been a business unit manager at a hardware company and head of corporate marketing and strategy at a software company.
