"Say we want to give (key customers) a dedicated 800 number staffed with people who know their names," Stevens continued. "We know this can optimize the customer's value. But if the data used to determine whether customers are qualified to be in that segment is inadequate -- say data from your billing systems have not fed properly into the marketing database where these decisions are coming from -- we're misallocating money and have left out some top customers. That means our strategy will be sub-optimal."
Friedman noted that, besides lost opportunities, compliance with federal regulations such as Sarbanes-Oxley (SOX) is another significant reason that enterprises should focus on improving data quality. "CIOs must understand the role of compliance on data and data on compliance."
In broad terms, there are two places to start.
"First, the culture of the organization must change to treat data as an asset," Friedman said. "The CIO can be the facilitator, the interpreter in that regard."
The next step is to start quantifying how bad (or good) your enterprise's data is.
"You can't hope to manage it unless you can measure it," Friedman said. "Measuring is the best way to get the business involved because, when you measure it, you may well find unbelievably bad things."
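The measurement step Friedman describes can be as simple as scoring a table on a few basic metrics. Below is a minimal sketch, assuming a hypothetical customer list with made-up field names, that computes two common data-quality measures: field completeness and the duplicate-record rate.

```python
# A minimal sketch of the "measure before you manage" step: score a
# customer table on two simple data-quality metrics -- field completeness
# and duplicate records. The field names and sample data are hypothetical.

def completeness(records, field):
    """Fraction of records where `field` is present and non-empty."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def duplicate_rate(records, key_fields):
    """Fraction of records that repeat an already-seen key."""
    seen, dupes = set(), 0
    for r in records:
        key = tuple(r.get(f) for f in key_fields)
        if key in seen:
            dupes += 1
        seen.add(key)
    return dupes / len(records)

customers = [
    {"id": 1, "name": "Acme Corp", "phone": "800-555-0101"},
    {"id": 2, "name": "Acme Corp", "phone": "800-555-0101"},  # duplicate entry
    {"id": 3, "name": "Globex",    "phone": ""},              # missing phone
]

print(f"phone completeness: {completeness(customers, 'phone'):.0%}")   # 67%
print(f"duplicate rate:     {duplicate_rate(customers, ('name', 'phone')):.0%}")  # 33%
```

Even scores this crude give the business side concrete numbers to react to, which is exactly the "measure it and you may find unbelievably bad things" dynamic Friedman describes.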
With luck, that will lead to some pain, O'Rourke added.
"Most companies don't deal with things unless they're in pain, the kind of pain that results in lost revenues or opportunities," he said. "If you can show where the pain points are, it's easy to get people excited about taking action."
One approach some organizations take is to have data quality championed by the business side, not the IT side. Often, because of compliance issues, that champion is the CFO.
After these initial steps are taken, the path to data quality improvement becomes more technical, Friedman said. Specifically, the next step is to examine your data architecture.
"We're seeing more CIOs forming data architecture teams that set a road map," he said. "They focus on consolidating data and reducing the technologies needed to manage data." That step is necessary because the same data often is stored in multiple systems designed by different teams. Inconsistent data is a natural result of that problem.
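The inconsistency problem those data architecture teams target can be illustrated with a small reconciliation check. This is a hedged sketch, not any vendor's tool: the system names, keys, and fields are hypothetical, and it simply reports customer IDs where two copies of the data disagree or one copy is missing.

```python
# Hypothetical example: the same customer data kept in a billing system
# and a marketing database drifts apart. reconcile() flags the drift.

billing = {
    101: {"name": "Acme Corp", "tier": "gold"},
    102: {"name": "Globex",    "tier": "silver"},
}
marketing = {
    101: {"name": "Acme Corp", "tier": "gold"},
    102: {"name": "Globex",    "tier": "bronze"},  # conflicting copy
    103: {"name": "Initech",   "tier": "gold"},    # missing from billing
}

def reconcile(a, b):
    """Report keys where two systems disagree or only one holds the record."""
    issues = []
    for key in a.keys() | b.keys():
        if key not in a or key not in b:
            issues.append((key, "missing in one system"))
        elif a[key] != b[key]:
            issues.append((key, "conflicting values"))
    return sorted(issues)

for key, problem in reconcile(billing, marketing):
    print(key, problem)
```

A report like this makes the case for consolidation concrete: customer 102 would be mis-segmented and customer 103 left out entirely, which is the mis-targeted-800-number scenario Stevens describes.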
O'Rourke agreed, saying his organization has built a single system for data from the ground up. As a result, he estimated that data in his organization is 98% accurate.
"We've taken efforts to centralize our internal data," O'Rourke said. "Often, it was redundant and conflicted data, so we centralized it, and that has done wonders."
After all those steps are taken, it's time to think about specific tools for improving data quality. A plethora of software tools is available to examine and improve data, the experts agreed. But in many enterprises, deploying such software is the easy part. The hard part is learning the extent of the problem and getting the support to solve it.