Strategies and Technologies for Effective Data Quality Management

Learn more about the importance of high-quality data in modern information systems, including the challenges posed by diverse data sources and the role of AI in exposing data flaws, and explore strategies and tools for effective data quality management.
March 31, 2026
6 min read

Key Highlights

  • Data stewards must distinguish genuine source-data problems from process flaws, technology failures and predictive maintenance needs that surface as data quality events.
  • Poor data quality costs organizations millions annually through errors, regulatory risks and reputation damage, making effective DQM strategies essential.
  • AI amplifies the need for clean data, as it exposes flaws at scale and relies on validated inputs to function correctly, especially in industrial and operational contexts.
  • Emerging AI-based tools like Anomalo, Databricks, and DataWise provide continuous monitoring, anomaly detection and root cause analysis to improve data reliability.
  • Expert recommendations emphasize defining quality metrics, maintaining reference data, establishing error management processes and keeping humans involved in data validation.

Information systems have always been at the mercy of data quality. With each technological advancement comes renewed attention on the imperative for clean, comprehensive and high-quality data. Indeed, the advent of digital transformation and AI elevated the need to continuously monitor, validate and address data quality, as issues can arise at any stage of the data supply chain: collection, processing, transformation, integration, machine learning and inference, human consumption and analysis.

Ideal data stewards are well-versed in both the business and technical contexts of the data, so they understand what to do when issues arise and do not arbitrarily remediate the data. For example, data quality events can reveal process problems like incomplete or inconsistent data, technology problems like a misconfigured interface or sensor failure, or predictive maintenance needs such as fixing an oil leak in an industrial machine.

“Believe it or not, the number one data quality issue I’ve seen in my career is sourcing the wrong data — that’s not a technology issue, it’s a process issue,” said Mark Clare, chief data officer at Quest Diagnostics, in an interview for CDO Magazine.

The vast scope and costs of the challenge, along with the diverse nature of today’s data sources and architectures, have solution providers actively developing new approaches to improve data quality management. Likewise, process and cultural recommendations for data quality strategies are evolving over time.

Financial and operational performance and reputation are at stake

The benefits of data quality management (DQM) far outweigh the high costs of failure. “About 80% of companies believe they have lost income due to poor data quality,” Georgi Chonkov, former manager, data governance and architecture at Deloitte, said in 2025. “In fact, companies lose $10 million to $14 million annually due to poor data quality.” That finding from 2025 is little changed from 2020 Gartner research, which found that poor data quality costs organizations at least $12.9 million a year on average.

When data is scattered and quality issues are not managed, it not only causes errors and customer complaints but also damages the company in the long run, cautions Chonkov. “Hidden costs like regulatory risks, manual handling of data, clumsy processes and reputation damages are a real danger to your company.”

Conversely, a well-implemented DQM strategy enhances data reliability, boosts employee efficiency, increases customer satisfaction, generates operational savings and enables informed decision-making, he explains.

AI amplifies the urgency to pursue cleaner and more reliable data


The data quality challenge is not new. “There has always been a problem with data: It is only as good as the last time it was checked,” says Industry 4.0 expert Jane Arnold, COO at APERIO and an independent board director at Houston AI Institute. “Assuming existing data is ‘good enough’ is a mistake. I see data quality as the foundation of trust in digital systems.”

The AI hype has some senior executives expecting it will magically solve their data quality problems, but what AI actually does is expose the data quality flaws at scale, she explains. 

Capitalizing on AI requires solving the underlying data quality problem first, but companies may not realize they have a problem because of the massive scale of data involved. “If your industrial data systems have 7 million tags, how can you possibly know they're all correct? Only about 45-80% of the time-series data our customers bring in is good data,” Arnold observes.

“As AI becomes embedded in operational decisions, data quality becomes a governance issue because AI won’t work if there are problems with the underlying data,” says Arnold. “It is also a risk management issue because industrial data quality is a leading indicator of operational risk, and AI without validated data introduces systemic exposure.”

AI-driven data quality improvement solutions are fast emerging 

Solution providers continue to develop and refine purpose-built software platforms and tools for tackling the data quality challenge, including the following four examples:

  • Anomalo’s Data Quality Monitoring is an AI-based software that automatically detects and alerts teams to quality issues in structured, semi-structured and unstructured enterprise data. The industry-tailored augmented data quality solution continuously monitors data patterns and deviations from the norm, enabling informed mitigation. An agentic interface, Anomalo’s Intelligent Data Analyst (AIDA), supports interaction with the data using natural language.
  • Databricks’ Data Quality Monitoring, integrated with the Databricks Data Intelligence Platform, offers continuous monitoring of the data estate at scale with Agentic AI. Anomaly detection monitors all critical tables in a schema with one click, learning patterns and seasonal behavior to identify issues. At the table level, data profiling captures and tracks summary statistics for historical context. Root causes are surfaced, and issues are prioritized so teams can ensure timely resolution.
  • DataWise from APERIO is specifically built to continuously validate industrial data streams across the enterprise at scale, to detect signal degradation and quality issues, and to quantify the economic impact before AI systems consume flawed inputs. The software can identify, prioritize and help teams resolve operational time-series data quality problems such as bad, missing or stale data based on root cause analysis findings and proposed remediation steps.
  • Stibo Systems MDM is a master data management platform that provides tools for data quality validation, including automatic data profiling and data quality metrics; automatic data cleansing and enrichment; continuous data monitoring and validation; and a framework for exposing and extracting data to streamline auditing and facilitate root cause analysis. 
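The kinds of checks these platforms automate for time-series streams can be illustrated with a minimal sketch. The sample readings, thresholds and flag names below are illustrative assumptions, not any vendor's implementation; the sketch flags gaps (missing data), flat-lined signals (stale data) and out-of-range values:

```python
from datetime import datetime, timedelta, timezone

# Synthetic sensor readings: (timestamp, value). All thresholds are illustrative.
readings = [
    (datetime(2026, 3, 1, 0, 0, tzinfo=timezone.utc), 71.2),
    (datetime(2026, 3, 1, 0, 1, tzinfo=timezone.utc), 71.2),
    (datetime(2026, 3, 1, 0, 2, tzinfo=timezone.utc), 71.2),
    (datetime(2026, 3, 1, 0, 3, tzinfo=timezone.utc), 71.2),
    (datetime(2026, 3, 1, 0, 9, tzinfo=timezone.utc), 71.2),
]

def find_quality_flags(samples, max_gap=timedelta(minutes=2),
                       flatline_len=4, lo=-50.0, hi=150.0):
    """Flag missing data (gaps), stale (flat-lined) signals and out-of-range values."""
    flags = []
    run = 1  # length of the current run of identical values
    for i, (ts, val) in enumerate(samples):
        if not (lo <= val <= hi):
            flags.append((ts, "out_of_range"))
        if i > 0:
            prev_ts, prev_val = samples[i - 1]
            if ts - prev_ts > max_gap:
                flags.append((ts, "gap"))      # missing data between samples
            run = run + 1 if val == prev_val else 1
            if run == flatline_len:
                flags.append((ts, "stale"))    # suspiciously flat signal
    return flags

print(find_quality_flags(readings))
```

Production tools layer learned baselines, seasonality models and root cause analysis on top of rules like these, but the underlying question is the same: is this signal still behaving like a live, healthy sensor?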

Seek out expert recommendations when developing your data quality management strategy 

Start now on your journey to data quality improvement by incorporating advice from specialists. Your digital transformation and AI success depend on these universal essentials — especially the need to keep humans in the equation.

From Georgi Chonkov:

  1. Define measurements for good quality data. Is the data you're using complete? Is it unique? Is it up to date? Is it accurate? Is it consistent?
  2. Keep reference data up to date and establish clear responsibilities for maintaining it. Properly managing internal and external reference data — the relatively static data used to classify or categorize master data, such as product codes or measurement units — can reduce errors substantially.
  3. Define clear error management processes. An effective data error management strategy involves dividing responsibilities, defining bug reporting procedures, selecting appropriate tools, classifying errors and prioritizing urgent issues.
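The quality dimensions in tip 1 can be operationalized as simple automated metrics. A minimal sketch using pandas, with a hypothetical customer table whose column names, reference list and freshness cutoff are all illustrative assumptions:

```python
from datetime import datetime, timezone
import pandas as pd

# Hypothetical customer records; column names are illustrative.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],                  # one duplicate key
    "email": ["a@x.com", None, "b@x.com", "c@x.com"],
    "country": ["US", "US", "DE", "XX"],          # "XX" is not a valid code
    "updated_at": pd.to_datetime(
        ["2026-03-01", "2026-03-20", "2025-01-01", "2026-03-25"], utc=True),
})

VALID_COUNTRIES = {"US", "DE", "FR"}              # reference data (tip 2)
FRESHNESS_CUTOFF = datetime(2026, 1, 1, tzinfo=timezone.utc)

metrics = {
    # Completeness: share of rows with no missing required fields
    "completeness": df[["customer_id", "email"]].notna().all(axis=1).mean(),
    # Uniqueness: share of rows whose key is not a duplicate
    "uniqueness": (~df["customer_id"].duplicated()).mean(),
    # Timeliness: share of rows updated after the cutoff
    "timeliness": (df["updated_at"] >= FRESHNESS_CUTOFF).mean(),
    # Validity/consistency: share of rows with a recognized country code
    "validity": df["country"].isin(VALID_COUNTRIES).mean(),
}

for name, score in metrics.items():
    print(f"{name}: {score:.0%}")
```

Tracking scores like these over time, and alerting when one drops, turns the checklist questions into the measurable, non-negotiable metrics Arnold calls for below.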

From Jane Arnold:

  1. Industrials need to treat data quality the way they treat safety or compliance — measurable and non-negotiable — because it underpins uptime, analytics and everything done after the data is collected. Those who ignore the need to address data quality first face wasted money and confusion.
  2. Regardless of the tool used, the ability to consistently identify and correct data quality problems, from the largest to the smallest, is an imperative for continuous improvement.
  3. With AI, you still need to keep the human in the loop; otherwise, there will be no living, breathing, thinking being reviewing the data and saying, "Yes, this is real," or "No, this is a hallucination," or "No, this is absolutely ridiculous, and I'm not going to do that."

About the Author

Sheila Kennedy


Contributor

Sheila Kennedy, MBA, CMRP, is a professional freelance writer and award-winning journalist specializing in industrial and technical topics. After working for 11 years in industrial information systems, she established Additive Communications in 2003 to leverage that knowledge and her affinity for research and writing.

Sheila has since produced thousands of client deliverables and hundreds of bylined articles, including more than 30 cover stories for industrial trade publications such as Plant Services, where she has been a contributing editor since 2004.

