The Internet has become something of a marketing Shangri-La. With its in-depth demographic and individual profiling capabilities, real-time marketing potential, and one-to-one message delivery, the Internet represents the ideal environment for communicating marketing messages with pinpoint accuracy to very specific targets. In theory, this is completely accurate. In reality, while the Internet has evolved greatly in recent years as a marketing medium, it still has a way to go to fulfill its much-anticipated potential.
One of the key components on the road to marketing Nirvana via the Internet is data quality. This has always been something of a thorn in the side of marketing organizations, with no easy way to eliminate duplicates, inconsistencies or errors. As a result, it is difficult to actually use the crucial marketing data housed in databases and data warehouses to set intelligent marketing strategy and make smart business decisions.
In the largest survey ever of data warehouse users, conducted by the META Group, data quality was ranked as the No. 1 challenge to the success of data warehouse initiatives. After spending a large amount of their marketing budgets building sophisticated databases and developing information gathering processes to get customer intelligence, marketers still had to rely on the crystal ball method of deciding strategy. They weren’t able to get accurate and usable hard data out of their own databases to make such decisions.
Recent technological developments using advanced fuzzy logic, data pattern analysis, data clustering algorithms and a host of other sophisticated capabilities finally allow the data gathered for marketing purposes to actually drive good marketing decisions.
In the realm of the Internet, these new technologies are more critical than ever. For example, if you have a front-end registration form to gather information about who is visiting your Web site, you lose some control over what is input into your database – the customer or site visitor enters it for you, leaving a big gap in quality control. How do you know when someone forgets his password and re-registers under a different user name? We’ve all done it; it happens all the time.
Without a way to identify these kinds of redundancies, your demographic information can be skewed, and you can’t be sure whether your messages are the right ones or whether they’re being delivered to the right audience. In Internet marketing, the tailoring of a message to a target happens in real time, all the time. There is no time to extract data from databases to clean it; cleansing needs to be an ongoing function of your data gathering initiatives.
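As an illustration of the kind of redundancy check involved, here is a minimal sketch in Python using the standard library’s difflib for fuzzy string comparison. The records, field names and 0.85 threshold are all hypothetical; commercial tools use far more sophisticated matching.

```python
from difflib import SequenceMatcher

def normalize(record):
    """Lower-case and strip whitespace so trivial differences don't hide a match."""
    return {k: v.strip().lower() for k, v in record.items()}

def similarity(a, b):
    """Similarity ratio in [0, 1] between two strings."""
    return SequenceMatcher(None, a, b).ratio()

def likely_same_visitor(r1, r2, threshold=0.85):
    """Flag two registrations as probable duplicates when their names
    are close matches, even though the user names differ."""
    r1, r2 = normalize(r1), normalize(r2)
    return similarity(r1["name"], r2["name"]) >= threshold

# The same person re-registering under a new user name (hypothetical data):
a = {"user": "jsmith", "name": "John Smith"}
b = {"user": "johnny99", "name": "Jon Smith"}
print(likely_same_visitor(a, b))  # True: "Jon Smith" closely matches "John Smith"
```

A fixed threshold like this is the crudest possible version of the idea; the point is simply that near-matches, not just exact matches, have to be caught.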
As you address the issue of data quality, there are five basic requirements you should consider:
Real time. The immediacy of the Internet is one of its most compelling attributes. Any data-quality solution aimed at Internet marketing must be able to constantly streamline and correct data in real time, at the point of input. By ensuring quality at the outset, you can be assured of accuracy in reporting.
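To make the point-of-input idea concrete, here is a minimal sketch of a cleansing routine a registration form could call before any record reaches the database. The field names, the state-abbreviation table and the email pattern are illustrative assumptions, not features of any particular product.

```python
import re

# Illustrative subset only; a real table would cover every state and variant spelling.
STATE_ABBREV = {"california": "CA", "new york": "NY", "texas": "TX"}

def clean_registration(form):
    """Standardize and validate a submitted registration at the point of input."""
    record = {}
    # Collapse runs of whitespace and normalize capitalization.
    record["name"] = " ".join(form.get("name", "").split()).title()
    email = form.get("email", "").strip().lower()
    # Reject obviously malformed addresses before they enter the database.
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        raise ValueError(f"invalid email: {email!r}")
    record["email"] = email
    state = form.get("state", "").strip().lower()
    record["state"] = STATE_ABBREV.get(state, state.upper())
    return record

print(clean_registration({"name": "  john  SMITH ",
                          "email": " JSmith@Example.COM ",
                          "state": "california"}))
```

Because the correction happens as the visitor submits the form, the database never accumulates the inconsistencies that batch cleanups exist to repair.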
Ease of use. One of the major reasons for the data quality problems most marketers face is that cleaning and streamlining data is a difficult, costly and time-consuming task. And if you avoid addressing the issue, the problem compounds like interest over time.
For any solution to be useful, it must integrate easily into your existing database configuration and be easy to use. To that end, look for a product with an intuitive graphical user interface – a tool that is entirely point-and-click driven. The data-quality solution also needs to run on common Web development platforms, such as Windows NT/95/98, which lend themselves to intuitive user interface design and are familiar to the majority of potential users. This eliminates the frustration that has been the trademark of mainframe-based data quality software and other high-end (read: expensive) data-quality solutions.
Flexibility. Right after ease of use comes flexibility. If the product’s not flexible, it won’t be easy to use. Look for a tool that provides quality results on all kinds of data, not just names and addresses. Also, since each company’s data is unique, the tool should adapt easily to the particulars of your company’s critical data. The ideal data quality tool will have a data-analysis component, allowing you to decide, with the help of the tool, how best to tackle the data quality problems you’ve identified, such as data inconsistency and data redundancy.
In addition, keep your eye out for a product that will allow you to add new modules, each of which will tackle specific data quality issues as they are made available from the tool developer. This prevents users from having to re-invest in the entire tool set each time a new beneficial module becomes available. Since Internet marketing and non-mainframe-based data quality control tools are in their infancy, there will be new capabilities developed all the time. Don’t get locked into a product that can’t grow and adapt as your needs change.
High performance. High performance means not only fast but also accurate results. The technology to meet this requirement exists today. Investigate which data quality tool developers use cutting-edge software building blocks, take advantage of current workstation and server processing power, and use the most advanced database development tools available. Concepts such as fuzzy logic, advanced scoring algorithms, data pattern analysis and intelligent data clustering algorithms are good indicators of products with the performance to ensure the most accurate results.
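As a rough illustration of what a scoring algorithm does, the sketch below combines per-field similarities into a single weighted match score. The weights, field names and sample records are invented for the example; real products tune such scores with far more sophisticated models.

```python
from difflib import SequenceMatcher

def field_score(a, b):
    """Similarity ratio in [0, 1] between two normalized field values."""
    return SequenceMatcher(None, a.strip().lower(), b.strip().lower()).ratio()

# Illustrative weights: a matching email is stronger evidence of identity
# than a similar name, which in turn is stronger than a similar city.
WEIGHTS = {"email": 0.5, "name": 0.3, "city": 0.2}

def match_score(r1, r2):
    """Weighted similarity score in [0, 1] across several fields."""
    return sum(w * field_score(r1[f], r2[f]) for f, w in WEIGHTS.items())

r1 = {"email": "jsmith@example.com", "name": "John Smith", "city": "Boston"}
r2 = {"email": "jsmith@example.com", "name": "Jon Smith", "city": "boston"}
print(round(match_score(r1, r2), 2))  # a high score: almost certainly the same person
```

Scoring across several fields is what lets a tool rank borderline cases instead of forcing a blunt match/no-match decision on each one.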
Platform Independence. Any data-quality control tool you select for your Internet marketing efforts should plug easily into your existing database. Identify which companies offer the ability to connect to any database via ODBC or via native drivers provided with the data quality product. This allows the tools to be used with any database and in conjunction with other data warehousing tools. It also lets you change your underlying database and operating system platform without having to re-invest in new data quality tools.
The database connectivity features contribute to ease-of-use as well, since data quality work can be performed without first removing data from the database and converting it into text files, which many, often costly, data quality tools require.
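To illustrate working on data in place rather than exporting it, here is a minimal sketch using Python’s built-in sqlite3 module as a stand-in for an ODBC-connected database: the normalization and de-duplication run as SQL inside the database, with no text-file extract. The table, columns and sample rows are hypothetical.

```python
import sqlite3

# In-memory database standing in for an ODBC-connected production database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE visitors (user_name TEXT, email TEXT)")
conn.executemany("INSERT INTO visitors VALUES (?, ?)", [
    ("jsmith",   "JSmith@Example.com"),
    ("johnny99", "jsmith@example.com"),   # same person, re-registered
    ("mjones",   "mjones@other.org"),
])

# Normalize in place, then drop rows that duplicate an earlier
# registration of the same (now-normalized) email address.
conn.execute("UPDATE visitors SET email = lower(trim(email))")
conn.execute("""
    DELETE FROM visitors WHERE rowid NOT IN (
        SELECT MIN(rowid) FROM visitors GROUP BY email
    )
""")
remaining = conn.execute("SELECT user_name FROM visitors ORDER BY rowid").fetchall()
print(remaining)  # the duplicate registration is gone
```

The same pattern – update and de-duplicate where the data lives – is what direct database connectivity makes possible, and what the extract-to-text-file tools cannot do.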