For the average Internet CEO, competition in the marketplace is making life more than a little hairy. Consumers have become conditioned to expect the best products at the lowest prices with the best service.
Big e-commerce sites are eating small e-commerce sites. And all the while, investors are demanding profitability. Sophisticated marketing techniques can help.
If Web marketers had the ability to fully harness the tornado of click-stream data flowing through their sites, they could deliver, with minimal effort and expense, the offers their customers want at the precise moment those customers are in the market to buy. This is no surprise – it’s the driving force behind the trend of personalization.
However, what is curious is that personalization as we’ve defined it has been heralded by the Internet community as a business-saving technique for years. The fact that everyone is still talking about the promise of personalization suggests that the promise remains unfulfilled.
The reason lies in the way most sites look at their customer data. Contemporary personalization tools typically rely on techniques such as collaborative filtering and other product-centric correlations. This shortcoming does not result from a failure to recognize the importance of analyzing data to its fullest. Rather, it is a consequence of technical barriers, namely the difficulty of performing sophisticated analysis on huge volumes of data in the tight time frames required by the Internet.
Most collaborative filtering tools discard granular click-stream data containing rich information about user choices and preferences in order to complete processing in the time allotted by e-business. The irony is that by discarding this data, the personalization tools ignore the very information that defines the individual.
Consider a person who buys a golf book from an online bookstore. If the site powered its product recommendations based on collaborative filtering techniques, odds are the site would push golf content and products to that user during future visits. However, a more thorough analysis of the user’s interaction with the site might show that the buyer spent 45 minutes clicking through dozens of gardening books while spending only five minutes in the sports section. Based on this analysis, a more insightful conclusion would be that gardening is the real topic of interest and that the customer probably bought the golf book as a gift. The content push determined by collaborative filtering would not allow the marketing team to cross-sell or up-sell products to this consumer.
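The contrast between the two views can be sketched in a few lines. The sketch below uses hypothetical data and field names (they are not drawn from any actual product): a purchase-only view sees just the golf book, while a dwell-time view surfaces the gardening interest described above.

```python
# A minimal sketch, assuming simulated click-stream data, of how dwell time
# can reveal an interest that purchase history alone misses.
from collections import defaultdict

# Simulated session for the golf-book buyer: (category, seconds spent browsing)
clicks = [("gardening", 2700),  # 45 minutes among gardening books
          ("sports", 300)]      # 5 minutes in the sports section
purchases = ["sports"]          # the golf book

# Aggregate browsing time per category
dwell = defaultdict(int)
for category, seconds in clicks:
    dwell[category] += seconds

purchase_top = purchases[0]              # what a purchase-only view sees
behavior_top = max(dwell, key=dwell.get) # what click-stream analysis reveals

print(purchase_top)  # sports
print(behavior_top)  # gardening
```

A recommendation engine keyed to `behavior_top` would push gardening content, while one keyed only to `purchase_top` would keep pushing golf.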
The inability of marketing departments at large Web sites to know precisely the interests and needs of their customers explains why the market so desperately needs next-generation electronic customer relationship management and personalization tools powered by high-performance analytics, such as data mining. This technology excels at discovering hidden patterns from massive volumes of granular-level data, such as individual transactions, call record details and clicks. It’s at this level that data mining really comes into its own.
Whereas transaction data and call records show what a consumer bought, clicks reveal the total buying experience, from browsing through product information to evaluating competitive products to buying or not buying a product. Clicks represent the nuances of customer behavior. The opportunity created by this data is the ability to base product and content recommendations not only on what customers have purchased in the past, but also a complete assessment of their shopping behavior.
While e-CRM applications based on data mining will help marketing professionals serve their customers as never before, creating such a solution is not without challenges. The biggest hurdle lies in the combination of online data volumes and the requirements for driving high-performance analytics. Unlike shallow analytical routines, which may take into account only purchased products and product affinities, click-stream mining assesses the customer’s total data profile – from demographics to pages accessed to the significance of the content served to the duration of the visit.
Whereas shallow analysis requires the data be scrubbed and prepped for only a few fields, click-stream mining depends on top-down cleaning and preparation of all fields. This explains why experts estimate that up to 80 percent of the labor in data mining is not in analysis, but in preparation. And with leading Web sites generating millions of hits per day – and more coming tomorrow – there’s no time to put off the processing.
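To make the preparation burden concrete, here is a hypothetical sketch (invented log format and field names) of one small slice of that work: parsing raw hits and deriving the per-page dwell times that click-stream mining depends on, before any analysis runs.

```python
# A minimal sketch, assuming a made-up "user|timestamp|path" log format, of the
# preparation step: scrub raw hits into structured records with dwell times.
from datetime import datetime

raw_hits = [
    "u42|2000-05-01 10:00:00|/books/gardening/roses",
    "u42|2000-05-01 10:09:00|/books/gardening/soil",
    "u42|2000-05-01 10:44:00|/books/sports/golf",
    "u42|2000-05-01 10:49:00|/checkout",
]

def prepare(hits):
    """Parse raw log lines and derive seconds spent on each page."""
    parsed = []
    for line in hits:
        user, ts, path = line.split("|")  # scrub: split raw fields
        parsed.append((user, datetime.strptime(ts, "%Y-%m-%d %H:%M:%S"), path))
    # Dwell time on a page = gap until the next hit in the session
    records = []
    for (user, t, path), (_, t_next, _) in zip(parsed, parsed[1:]):
        records.append({"user": user, "page": path,
                        "seconds": int((t_next - t).total_seconds())})
    return records

records = prepare(raw_hits)
print(records[0])  # {'user': 'u42', 'page': '/books/gardening/roses', 'seconds': 540}
```

Even this toy version hints at why preparation dominates the labor: real logs add malformed lines, session boundaries, bot traffic and dozens more fields, all of which must be cleaned before mining can begin.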
This is not to say the analysis is easy, either. While data mining algorithms are incredibly thorough, they’re also computationally intensive, which often leads to long run times. Over the past few years, leading data mining experts have been exploring ways to tune and accelerate the algorithms for Internet use, but the consensus is that for these algorithms to generate results in Internet time, they must be encased in an incredibly powerful application.
The warning for those evaluating e-CRM applications based on data mining is to look behind the scenes. Ask whether the application performs the data transformations needed to fuel analysis. If not, you may face the unenviable task of integrating a transformation tool with the e-CRM application. Most importantly, find out how easily the application can scale to accommodate the growth of the Internet. With data volumes exploding exponentially and the race for profitability running full tilt, the last thing an e-business needs is to be saddled with a flat-footed e-CRM application.
• Tom Ebling is president/CEO of Torrent Systems Inc., Cambridge, MA, a data warehousing vendor.