Avoid ‘Data Decay’ With Monitoring Plan

Years ago, I moved into a house with a yard that needed some work. Flowerbeds were overgrown. The lawn was a hodge-podge of grass, weeds and other unknown vegetation. After moving in, one of my first tasks was to clean up the yard and get everything into shape.

After a few weekends of planting and pruning, the yard had improved dramatically. But as any homeowner knows, maintaining your yard – or any facet of your house, for that matter – is a never-ending activity. If I stopped mowing the lawn, weeding the flowerbeds and watering the plants, the yard would quickly return to its original shape. Ongoing maintenance is the only way to keep everything looking good year after year.

The same dynamic applies to corporate information. Data, if left alone, will become outdated, inaccurate and unusable. As a result, data quality has become a hot topic in business, especially as companies try to make sense of all the information available within their enterprise applications.

Customer and prospect information is particularly prone to a decline in data quality. And any direct marketer knows that poor customer data can doom database marketing, customer relationship management initiatives or customer support programs. The problem is clear. How can you communicate effectively with your customers and prospects if you don’t know where they live, how to contact them and what products they own?

Understand the nature of data. The effort to improve data quality is hardly new. In the past few years, companies have spent considerable time and money addressing data quality problems. For most organizations, this was a one-time, often large-scale, initiative.

In this scenario, however, clean data is only temporary. Since data is a fluid, dynamic and ever-evolving resource, building quality data is not a once-and-done activity. The integrity of a business’ data degrades when incorrect, nonstandard or invalid information reaches core applications.

Building and keeping good customer data takes constant vigilance. To manage data effectively, an organization must institute a data management program based on continual, routine data maintenance.

Take time for routine maintenance. The impact of “data decay” can influence, and hinder, many enterprise initiatives. For example, a high-technology company once built a “corporate information factory” to serve as a single repository for all of its information about customers, products and inventory. Why centralize the data? To provide a single information base for marketing programs, business intelligence efforts and customer support.

This company found, however, that new, non-standard data entering the system was compromising overall data integrity in the information factory. Whenever the company explored this data to uncover trends in customer adoption, product availability and so forth, the presence of bad data would skew the results. The company soon learned that the effort to build high-quality data didn’t stop once the warehouse went on line.

The solution for this company was data monitoring. With data monitoring, IT and business users can create rules to examine data continually and automatically to uncover problems with data quality as problems occur. These users can also chart data integrity on a periodic basis and begin to address some of the underlying reasons that bad data is being collected in the first place.

A data monitoring regimen can accomplish many tasks, including:

· Identifying trends in data quality metrics. View ongoing statistics about data to see when the value of customer data starts to decline.

· Generating instant alerts. Set up automated system notifications and e-mails to flag problematic data the moment an inconsistent record enters the system.

· Detecting problems from incoming data. For companies that load data periodically from a Web site or from purchased marketing lists, the ability to validate existing data against established business rules can help uncover and address data integrity issues before they become a problem within the database.
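The tasks above can be sketched in code. The following is a minimal illustration, not a description of any particular product: the rule names, record fields and thresholds are hypothetical, standing in for whatever business rules an organization defines.

```python
# Hypothetical business rules for customer records. Each rule takes a
# record (a dict) and returns True if the record passes.
RULES = {
    "email_present": lambda r: bool(r.get("email")),
    "zip_is_5_digits": lambda r: str(r.get("zip", "")).isdigit()
                                 and len(str(r.get("zip", ""))) == 5,
    "state_is_known": lambda r: r.get("state") in {"CA", "NY", "TX"},
}

def monitor(records):
    """Run every rule over every record; return, per rule, the list of
    offending records so they can be flagged or quarantined."""
    failures = {name: [] for name in RULES}
    for record in records:
        for name, rule in RULES.items():
            if not rule(record):
                failures[name].append(record)
    return failures

def alert(failures, threshold=0):
    """Stand-in for automated notifications: name every rule whose
    failure count exceeds the threshold."""
    return [name for name, bad in failures.items() if len(bad) > threshold]
```

Run against an incoming batch, `monitor` yields per-rule failure counts (the raw material for trend metrics), and `alert` turns any count above a threshold into a notification, so bad records are caught before they reach the database.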

Data monitoring provides the assurance that once data gets fixed, it will stay within acceptable limits. When data drifts out of control, users know immediately, and they can react to problems before the quality of the data declines.

Add monitoring to data management programs. Traditionally, data management programs have included four primary elements: profiling, quality, integration and augmentation. These steps allow a company to inspect, correct, merge and enhance data. Monitoring is the fifth, and final, component, creating a cyclical, ongoing method of improving data quality.

Organizations that have already begun a data quality effort generally have most of the elements in place to build a data-monitoring program. Data monitoring is an extension of the effort required to get data into a reliable state in the first place. The same business rules used to cleanse, standardize and verify data can serve as the rules to examine and flag data integrity issues.
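One way to picture that reuse: a single standardization rule can be applied in two modes, fixing records during a cleansing pass and merely flagging them during ongoing monitoring. This is an illustrative sketch; the field and function names are hypothetical.

```python
def standardize_state(value):
    """Cleansing form of the rule: normalize a state field."""
    return value.strip().upper()

def state_is_standard(value):
    """Monitoring form of the same rule: check, don't fix."""
    return value == standardize_state(value)

def cleanse(records):
    """One-time cleanup pass: apply the fix to every record."""
    return [dict(r, state=standardize_state(r["state"])) for r in records]

def flag_nonstandard(records):
    """Ongoing monitoring pass: report records that would need the fix."""
    return [r for r in records if not state_is_standard(r["state"])]
```

Because both passes call the same underlying rule, a violation flagged by monitoring is, by construction, exactly what the cleansing step would have corrected.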

Building consistent, accurate and reliable data is not easy. Periodic fixes will only provide temporary relief from the various problems that can arise because of bad data. With data monitoring, companies can better control their data and build more reliable customer information for all future marketing, sales and support initiatives.