NEW YORK — Shifting from an IT-oriented view to a business-oriented view is critical in addressing poor data quality, according to Ted Friedman, principal analyst at Gartner Research, where he is a member of the Data Management & Integration team.
Friedman was the keynote speaker yesterday at Firstlogic's iSummit LIVE at the New York Marriott East Side, an event that also featured educational sessions and customer case studies. The keynote was offered as an online seminar as well.
Though companies often look to technology to solve data quality problems, he said, poor data quality “is a business issue, it's not an IT issue. The only way you are going to be successful in improving data quality is to begin to put the responsibility and accountability where it belongs, on the business side.”
Friedman advocated a process that begins with measuring data quality and determining the gap between a company's current data accuracy and 100 percent accuracy.
“This is the process by which you begin to bring the business case for a data quality initiative,” he said. Measuring and quantifying the effect of poor data quality should happen on an ongoing basis, he said.
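The measurement step Friedman describes can be sketched in a few lines. The rules, field names, and sample records below are illustrative assumptions, not anything from the keynote: score each record against a set of validation rules, compute overall accuracy, and report the gap to 100 percent.

```python
# Hypothetical sketch of the "measure the gap" step: accuracy is the share of
# records passing every validation rule; the gap is the distance to 100%.

def record_is_valid(record, rules):
    """A record counts as accurate only if it passes every rule."""
    return all(rule(record) for rule in rules)

def accuracy_gap(records, rules):
    """Return (accuracy, gap) as fractions between 0 and 1."""
    valid = sum(1 for r in records if record_is_valid(r, rules))
    accuracy = valid / len(records)
    return accuracy, 1.0 - accuracy

# Example: customer records checked against simple completeness rules.
rules = [
    lambda r: bool(r.get("name")),        # name must be present
    lambda r: "@" in r.get("email", ""),  # email must look roughly well-formed
]
records = [
    {"name": "Ada", "email": "ada@example.com"},
    {"name": "", "email": "bob@example.com"},   # fails the name rule
    {"name": "Cy", "email": "not-an-email"},    # fails the email rule
]
accuracy, gap = accuracy_gap(records, rules)
print(f"accuracy: {accuracy:.0%}, gap to 100%: {gap:.0%}")
# -> accuracy: 33%, gap to 100%: 67%
```

Run on an ongoing basis, as Friedman suggests, the same gap figure becomes a trend line that can anchor the business case.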
Friedman also urged tying data quality best practices to compliance initiatives, enterprise information management, and integration initiatives.
“These are the new data quality drivers,” he said. “In the past, data quality was about customer data and the cleansing of names and addresses for marketing campaigns and doing list management. The drivers now are much different … they are not low-level, technical, IT project-oriented things. Rather, they are very important business drivers.”
Another key to a data quality initiative, Friedman said, is to appoint data stewards: people within business departments who are responsible for particular slices of a company's data landscape.
“This is occurring more and more in businesses,” he said. Some organizations even tie compensation to how data are maintained and improved, he said.
Some companies also have a data quality team that measures things such as customer and financial data and produces a quarterly data quality scorecard distributed throughout the organization.
“We are seeing these kinds of things more and more these days in our client base,” he said, “some type of metrics that are reported out on a regular basis, [and] some type of a data quality team … that is measuring [this data] and providing the input.”
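The scorecard practice described above can be sketched simply. The domain names, scores, and 95 percent target below are hypothetical assumptions for illustration, not figures from Gartner or Firstlogic: per-domain quality metrics roll up into one report for regular distribution.

```python
# Hypothetical sketch of a quarterly data quality scorecard: format per-domain
# scores and flag any domain that falls below an agreed target.

def scorecard(metrics, target=0.95):
    """Return a text scorecard, flagging domains below the target score."""
    lines = ["Data Quality Scorecard"]
    for domain, score in sorted(metrics.items()):
        flag = "" if score >= target else "  <- below target"
        lines.append(f"{domain:<10} {score:.0%}{flag}")
    return "\n".join(lines)

# Illustrative quarterly metrics for three data domains.
q3_metrics = {"customer": 0.97, "financial": 0.91, "product": 0.99}
print(scorecard(q3_metrics))
```

Distributing a report like this each quarter is one way a data quality team can make the metrics visible throughout the organization, as Friedman describes.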
Though data quality technology adds value, it will not generate results by itself, Friedman said.
“Focus on people and process improvement for the greatest effectiveness,” he said. “Get the business engaged and start to shift the culture.”
In general, data quality is a hot topic, he said.
“We are seeing our clients in virtually all vertical industries getting very, very focused on this topic,” he said, “understanding the impact of poor data quality on their businesses, trying to understand some of the best practices and trends occurring in the data quality space that they can harness to make some improvements.”