Information systems have always been at the mercy of data quality. Each technological advancement brings renewed attention to the imperative for clean, comprehensive and high-quality data. Indeed, the advent of digital transformation and AI has elevated the need to continuously monitor, validate and address data quality, as issues can arise at any stage of the data supply chain: collection, processing, transformation, integration, machine learning and inference, human consumption and analysis.
Ideal data stewards are well-versed in both the business and technical contexts of the data, so they understand what to do when issues arise and do not arbitrarily remediate the data. For example, data quality events can reveal process problems like incomplete or inconsistent data, technology problems like a misconfigured interface or sensor failure, or predictive maintenance needs such as fixing an oil leak in an industrial machine.
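The kinds of events described above, such as incomplete records or readings that suggest a sensor failure, can be surfaced by simple rule-based checks. Below is a minimal sketch in Python; the field names (`sensor_id`, `reading`) and thresholds are illustrative assumptions, not details from the article.

```python
def check_record(record, required_fields, valid_range):
    """Return a list of data quality issues found in a single record."""
    issues = []
    # Completeness: flag missing or empty required fields
    for field in required_fields:
        if record.get(field) in (None, ""):
            issues.append(f"incomplete: missing '{field}'")
    # Validity: flag out-of-range numeric readings, which may signal
    # a sensor failure or maintenance need rather than a data error
    reading = record.get("reading")
    if isinstance(reading, (int, float)):
        low, high = valid_range
        if not (low <= reading <= high):
            issues.append(f"out of range: reading={reading}")
    return issues

records = [
    {"sensor_id": "A1", "reading": 72.5},
    {"sensor_id": "", "reading": 68.0},      # incomplete record
    {"sensor_id": "B2", "reading": 999.0},   # possible sensor failure
]

for rec in records:
    problems = check_record(rec, required_fields=["sensor_id", "reading"],
                            valid_range=(0.0, 150.0))
    if problems:
        print(rec.get("sensor_id") or "<unknown>", problems)
```

The point is not the mechanics of the check but the triage that follows: an empty `sensor_id` points to a process or interface problem, while a wildly out-of-range reading may call for physical maintenance, which is why stewards need both business and technical context.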
“Believe it or not, the number one data quality issue I’ve seen in my career is sourcing the wrong data — that’s not a technology issue, it’s a process issue,” said Mark Clare, chief data officer at Quest Diagnostics, in an interview for CDO Magazine.
The vast scope and cost of the challenge, along with the diverse nature of today's data sources and architectures, have solution providers actively developing new approaches to improve data quality management. Likewise, process and cultural recommendations for data quality strategies continue to evolve.
Financial and operational performance and reputation are at stake
The benefits of data quality management (DQM) far outweigh the high costs of failure. “About 80% of companies believe they have lost income due to poor data quality,” Georgi Chonkov, former manager, data governance and architecture at Deloitte, said in 2025. “In fact, companies lose $10 million to $14 million annually due to poor data quality.” That 2025 finding is little changed from 2020 Gartner research, which found that poor data quality costs organizations at least $12.9 million a year on average.
When data is scattered and quality issues are not managed, it not only causes errors and customer complaints but also damages the company in the long run, cautions Chonkov. “Hidden costs like regulatory risks, manual handling of data, clumsy processes and reputation damages are a real danger to your company.”
Conversely, a well-implemented DQM strategy enhances data reliability, boosts employee efficiency, increases customer satisfaction, generates operational savings and enables informed decision-making, he explains.
AI amplifies the urgency to pursue cleaner and more reliable data