Machine Learning & Big Data Blog

Data Quality Management: An Introduction

Criss Scruggs

More business leaders are becoming aware of the tremendous impact big data has on the trajectory of the enterprise organization as it relates to:

  • Predicting customer expectations;
  • Assisting with effective product management;
  • Being available on-demand to influence top-down decision making;
  • Tailoring customer service innovations by investigating shopping habits of customers; and
  • Providing organizations with competitor information.

However, there’s one big caveat — if your data isn’t accurate, complete, and consistent, it can lead to major missteps when making business decisions. In fact, Gartner estimates the average financial impact of poor data quality on businesses at $15 million per year¹, which means you can’t afford not to make data quality management a priority starting right now.

What is Data Quality Management?

Data quality management (DQM) refers to a business principle that requires a combination of the right people, processes and technologies all with the common goal of improving the measures of data quality that matter most to an enterprise organization. That last part is important: the ultimate purpose of DQM is not just to improve data quality for the sake of having high-quality data but rather to achieve the business outcomes that depend upon high-quality data. The big one is customer relationship management or CRM. As often cited, “CRM systems are only as good as the information they contain”.

A Foundation for High-Quality Data

Effective data quality management requires a structural core that can support data operations. Here are five foundational principles to implement high-quality big data within your data infrastructure:

#1 Organizational Structure

IT leadership should consider the following roles when implementing DQM practices across the enterprise:

DQM Program Manager: This role sets the tone with regard to data quality and helps to establish data quality requirements. He or she is also responsible for keeping a handle on day-to-day data quality management tasks, ensuring the team is on schedule, within budget and meeting predetermined data quality standards.

Organization Change Manager: This person is instrumental in the change management shift that occurs when data is used effectively; he or she also makes decisions about data infrastructure and processes.

Data Analyst I or Business Analyst: This individual interprets and reports on data.

Data steward: The data steward is charged with managing data as a corporate asset.

#2 Data Quality Definition

Very simply, if you don’t have a defined standard for quality data, how can you know whether you are meeting or exceeding it?

What counts as high-quality data varies across industries and from organization to organization, but defining these rules is essential to the successful use of business intelligence software.

Your organization may wish to consider the following characteristics of high-quality data in creating your data quality definitions:

  • Integrity: how does the data stack up against pre-established data quality standards?
  • Completeness: how much of the data has been acquired?
  • Validity: does the data conform to the format, type and range rules defined for its data set?
  • Uniqueness: how often does a piece of data appear in a set?
  • Accuracy: how closely does the data reflect the real-world object or event it describes?
  • Consistency: in different data sets does the same data hold the same value?
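A few of these characteristics can be measured directly. As a minimal sketch (the customer records and field names below are invented for illustration), completeness and uniqueness might be scored like this:

```python
# Hypothetical customer records; the "email" and "age" fields are
# invented for this sketch, not drawn from any real system.
records = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": None,            "age": 27},
    {"id": 3, "email": "a@example.com", "age": 27},
]

def completeness(records, field):
    """Fraction of records where the field is present and non-null."""
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records)

def uniqueness(records, field):
    """Fraction of non-null values that appear exactly once."""
    values = [r[field] for r in records if r.get(field) is not None]
    return sum(1 for v in values if values.count(v) == 1) / len(values)

print(completeness(records, "email"))  # 2 of 3 records have an email
print(uniqueness(records, "email"))    # both emails present are duplicates
```

Scores like these give each characteristic a number between 0 and 1, which makes it possible to set explicit thresholds in your data quality definition.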

In addition, to ensure these characteristics are satisfied each time, experts in data protection recommend the following guiding governance principles when implementing your DQM strategy:

  • Accountability: who’s responsible for ensuring DQM?
  • Transparency: how is DQM documented and where are these documents available?
  • Protection: what measures are taken to protect data?
  • Compliance: what compliance agencies ensure governance principles are being met?

#3 Data Profiling Audits

Data profiling is an audit process that ensures data quality. During this process, auditors validate data against metadata and existing measures, then report on the quality of the data. Conducting data profiling activities routinely is a sure way to ensure your data is the quality needed to keep your organization ahead of the competition.
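In code, a profiling pass boils down to computing summary statistics per column and comparing them with what the metadata says should be true. A rough sketch, with a toy dataset and expectations invented for illustration:

```python
# Toy dataset; column names and the expected metadata below are
# assumptions made up for this sketch, not a real profiling tool's API.
rows = [
    {"order_id": "A1", "qty": 2},
    {"order_id": "A2", "qty": 0},
    {"order_id": "A2", "qty": 5},
]

def profile(rows, column):
    """Basic profile: non-null count, distinct count, null count."""
    values = [r[column] for r in rows if r.get(column) is not None]
    return {
        "count": len(values),
        "distinct": len(set(values)),
        "nulls": len(rows) - len(values),
    }

# Validate the profile against expected metadata, as an auditor would.
expected_distinct = 3              # we expect every order_id to be unique
p = profile(rows, "order_id")
print(p["distinct"] == expected_distinct)  # False: "A2" repeats
```

Running a profile like this on a schedule, and alerting when the results drift from the metadata, is the routine audit the section describes.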

#4 Data Reporting and Monitoring

For most organizations, this refers to the process of monitoring, reporting and recording exceptions. These exceptions can be captured automatically by business intelligence (BI) software, which flags bad data before it enters use.
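Exception capture can be sketched as a set of rules applied to each record, with failures logged rather than silently passed through. The rules below are invented for illustration; a real BI tool would supply its own rule engine:

```python
# Minimal sketch of exception capture; the rules are assumptions
# made up for this example.
RULES = {
    "qty":   lambda v: isinstance(v, int) and v >= 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
}

def screen(record):
    """Return (clean_record_or_None, list_of_failed_fields)."""
    exceptions = [f for f, ok in RULES.items() if not ok(record.get(f))]
    return (record if not exceptions else None), exceptions

good, errs = screen({"qty": 3, "email": "x@y.com"})
bad, errs2 = screen({"qty": -1, "email": "nope"})
print(errs2)  # ['qty', 'email'] — captured before the record is used
```

The exception list is what gets reported and monitored; the records behind it feed the correction step described next.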

#5 Correcting Errors

Once potentially bad or incomplete data has been sorted out by BI systems, it’s time to make appropriate data corrections such as completing the data, removing duplicates or addressing some other data issue.
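Two of the most common corrections are removing exact duplicates and completing missing fields from a fallback source (a second system of record, for instance). A sketch, with field names and data invented for illustration:

```python
# Illustrative records; the fallback lookup stands in for a second
# system of record and is an assumption of this sketch.
records = [
    {"id": 1, "city": "Austin"},
    {"id": 1, "city": "Austin"},   # exact duplicate
    {"id": 2, "city": None},       # incomplete
]
fallback = {2: "Houston"}

def correct(records, fallback):
    seen, cleaned = set(), []
    for r in records:
        key = (r["id"], r["city"])
        if key in seen:
            continue                                  # drop the duplicate
        seen.add(key)
        if r["city"] is None:
            r = {**r, "city": fallback.get(r["id"])}  # complete the record
        cleaned.append(r)
    return cleaned

print(correct(records, fallback))
# [{'id': 1, 'city': 'Austin'}, {'id': 2, 'city': 'Houston'}]
```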

Five Best Practices for Data Quality Management

For businesses starting the data quality management process, here are five best practices to keep in mind:

#1 Review Current Data

It’s likely you have a lot of customer data to begin with. You don’t want to toss it out and start over, but as they say in the tech world, “garbage in, garbage out.”

The last thing you need is to fill your new data infrastructure with bad insights. Therefore, when you’re getting started with data quality management, do an audit of your current data. This involves taking inventory of inconsistencies, errors, and duplicates, then recording and correcting any problems you come across so that the data entering your infrastructure is as high-quality as it can be.

#2 Data Quality Firewalls

A firewall is an automated safeguard that blocks a threat before it can spread. In this case, the threat is bad data. Putting up a firewall to protect your organization against bad data will help keep the system clear of error.

User error is common, and a firewall helps prevent it by blocking bad data at the point of entry. The number of people allowed to feed data into the infrastructure directly affects its quality, yet in many large organizations multiple entry points are unavoidable.

A firewall helps data stay error-free even when there are a number of people with access to enter data.
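In practice, a data firewall is simply one shared validation gate that every entry point must pass through before a record reaches the store. A minimal sketch, with the validation rule and field names invented for illustration:

```python
# Sketch of a "data firewall": every entry point calls the same gate.
# The validation rule below is an assumption, not a real product's API.
store = []

def firewall(record):
    """Reject records that fail validation at the point of entry."""
    has_id = bool(record.get("customer_id"))
    has_email = "@" in str(record.get("email", ""))
    return has_id and has_email

def submit(record):
    """One shared entry point for all users and integrations."""
    if firewall(record):
        store.append(record)
        return True
    return False               # bad data never reaches the store

submit({"customer_id": 7, "email": "a@b.com"})   # accepted
submit({"customer_id": None, "email": "typo"})   # blocked
print(len(store))  # 1
```

Because every user and integration goes through `submit`, adding more entry points doesn’t multiply the number of places where bad data can slip in.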

#3 Integrate DQM with BI

In today’s business culture, the buzz is all about integration. And why shouldn’t it be?

When systems work together they work better. The idea here is that no enterprise business can justify the resources required to comb each and every data record for accuracy all the time. But integrating the DQM process with BI software can help to automate it. Based on predetermined parameters, certain datasets can be isolated for review; for instance, new data sets that are likely to be accessed often can be audited as part of the DQM cycle.

#4 Put the Right People in Place

As described above, there are several positions within your organization that have accountability over the data quality process. Ensuring these positions are filled and dedicated to the job means governance standards can be met consistently.

#5 Ensure Data Governance with a Board

Creating a data governance board helps protect businesses from the risks inherent in data-driven decision-making. The panel should consist of business and IT users and executives. The group will set the policies and standards that become the cornerstone of data governance.

In addition, the data governance board should meet periodically to set new data quality goals and monitor the success of DQM initiatives across the various lines of business (LOBs). This is where developing an objective measurement scale comes in handy, since in order to improve data quality, there must be a way to measure it.
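One way such an objective measurement scale might be built is as a weighted average over the quality dimensions defined earlier. The weights and scores below are illustrative assumptions, not a standard formula:

```python
# Sketch of a composite data quality score; the dimension weights
# are illustrative choices a governance board might make.
def quality_score(metrics, weights):
    """Weighted average of per-dimension scores, each in [0, 1]."""
    total = sum(weights.values())
    return sum(metrics[d] * w for d, w in weights.items()) / total

metrics = {"completeness": 0.95, "uniqueness": 0.80, "validity": 0.90}
weights = {"completeness": 2, "uniqueness": 1, "validity": 1}
print(round(quality_score(metrics, weights), 3))  # 0.9
```

Tracking a single score per LOB from one board meeting to the next makes “is data quality improving?” an answerable question.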

Data Quality Management is a Marathon, Not a Sprint

Big data is an important component of doing business in today’s digital world. It offers customer and competitor insights that can’t be achieved with any other tools or resources.

Because of its high velocity, big data is accessible to business leaders who can use it to make decisions in real-time. But for that reason, it’s also associated with business risks that need to be managed properly. And DQM is one effective tool for achieving just that.

Overall, DQM offers many benefits to your organization:

  • Business processes run more efficiently — because you get the right data the first time.
  • Better business outcomes — DQM offers you a better view of what’s going on with your customers, vendors, marketers, etc.
  • More confidence — DQM helps to drive more informed business decisions.

With these considerations in mind, it is also important to remember that DQM is an ongoing process that requires continuous data monitoring and reporting.

¹ Susan Moore, “How to Create a Business Case for Data Quality Improvement,” Smarter with Gartner, June 19, 2018. https://www.gartner.com/smarterwithgartner/how-to-create-a-business-case-for-data-quality-improvement/



These postings are my own and do not necessarily represent BMC's position, strategies, or opinion.



About the author

Criss Scruggs

Senior Manager, Solutions Marketing
With more than 19 years in technology product marketing and management, Criss Scruggs currently drives outbound marketing programs for Digital Business Automation including Big Data and DevOps initiatives at BMC. Additional areas of focus at BMC included solutions marketing for IT Service Management, IT Asset Management, and Mobile Device Management. Prior to BMC, Scruggs spent several years with other IT organizations focused on Systems and Applications Performance Management, VMware Management, and VoIP/Unified Communications Management businesses and several years in product management with Compaq/HP. Scruggs holds a master of business administration degree from The Jones Graduate School of Management, Rice University and a bachelor of arts degree in advertising from the University of Oklahoma.