J.Crew to Use Clickstream Analysis for Product Recommendations

Apparel retailer J.Crew Group Inc. is working to deliver more personalized content by next month to online consumers based on their browsing behavior at www.jcrew.com.


"It's the ability to offer dynamic content based on clickstream analysis," said Jayson Kim, senior director of Web marketing applications at J.Crew.


Product recommendations will be pitched to frequent nonbuyers at jcrew.com without the aid of purchase or registration data. Predictive data mining technology from DigiMine Inc., Bellevue, WA, and cookies will help convert these browsers into buyers.


This effort aims to address two issues bedeviling online retailers: the vast number of shoppers who fail to convert into buyers and the high rate of shopping-cart abandonment.


J.Crew's data warehousing initiative has been six months in the making.


So far, DigiMine's data mining and Web analytics technology has been employed to sift through reams of online, catalog and store transactional data to unearth buying patterns.


Now consolidated in a central repository, the J.Crew database holds roughly half a terabyte of data.


"We've actually worked to build a scalable, enterprise-wide data warehousing effort that will support our long-term personalization efforts," Kim said.


Such intelligence will also serve to increase the average online order size from current jcrew.com customers. The retailer will not disclose current average order size for jcrew.com transactions.


Kim said it is not enough to simply target customers with proven buying behavior.


In many cases, contextual product recommendations are not true personalization. Products are recommended based on data volunteered by consumers, often neglecting online behavioral patterns.


"I think what you see out there on the Internet is not personalization," he said. "To me that's just mass customization."


A big hurdle in targeting personalized offers to nonbuyers or anonymous users is that a retailer's front-end system must identify a profile in real time. In other words, every time a user clicks on a page, that data has to be updated at both the front end and the back end. While the software often can handle the sheer volume of data, the hardware may not.


To work around the problem, J.Crew collects its customer and transactional data in a warehouse and performs the segmentation offline. That data is then fed back to the front end, where a rules-based engine targets the pertinent segments.
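The two-step pattern described above can be sketched roughly as follows. This is a minimal illustration, not J.Crew's or DigiMine's actual implementation; the segment names, rules, and data shapes are all assumptions.

```python
# Sketch of offline segmentation plus a front-end rules engine.
# All names and rules here are illustrative assumptions.
from collections import Counter

def segment_visitors(browsing_logs):
    """Offline step: assign each visitor to a segment based on the
    product category they browsed most often."""
    segments = {}
    for visitor_id, viewed_categories in browsing_logs.items():
        top_category, _ = Counter(viewed_categories).most_common(1)[0]
        segments[visitor_id] = top_category
    return segments

# Front-end step: a rules-based engine maps each segment to content.
RULES = {
    "sweaters": "show-sweater-recommendations",
    "outerwear": "show-outerwear-recommendations",
}

def pick_content(visitor_id, segments, default="show-bestsellers"):
    """Serve segment-specific content, falling back to a default."""
    return RULES.get(segments.get(visitor_id), default)

logs = {"v1": ["sweaters", "sweaters", "pants"], "v2": ["shoes"]}
segments = segment_visitors(logs)
print(pick_content("v1", segments))  # show-sweater-recommendations
print(pick_content("v2", segments))  # show-bestsellers
```

The point of the split is that the expensive mining happens offline in the warehouse; the front end only does a cheap lookup per click.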


Cookies will play an important role in the new scheme for jcrew.com. For example, if a nonbuyer visits the site a number of times, a cookie-based browsing history is created. The retailer's data warehouse also incorporates log-file analysis.
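A cookie-based history for anonymous visitors can be sketched like this: a persistent identifier is set on the first visit, then each subsequent page view is keyed to it. The cookie name and storage are assumptions for illustration, not the site's actual mechanism.

```python
# Sketch of cookie-based browsing history for anonymous visitors.
# COOKIE_NAME and the in-memory store are illustrative assumptions.
import uuid

COOKIE_NAME = "visitor_id"  # hypothetical cookie name

def get_or_set_visitor_id(request_cookies, response_cookies):
    """Reuse the visitor ID from an existing cookie, or mint a new
    one and queue it to be set on the response."""
    visitor_id = request_cookies.get(COOKIE_NAME)
    if visitor_id is None:
        visitor_id = uuid.uuid4().hex
        response_cookies[COOKIE_NAME] = visitor_id
    return visitor_id

# Stand-in for the warehouse: visitor_id -> pages viewed.
history = {}

def record_view(visitor_id, page):
    """Append one page view to the visitor's browsing history."""
    history.setdefault(visitor_id, []).append(page)
```

On a return visit the browser presents the same cookie, so the new page views attach to the same history even though the visitor has never registered or purchased.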


"So, basically based on your clickstream and browsing behavior, if we notice that you're browsing sweaters in the last few visits, we should be showing recommendations and content to you that's centered around sweaters," Kim said.
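The rule Kim describes, recommending whatever category dominates a visitor's last few visits, can be expressed as a small windowed tally. The function name and the window size are assumptions for illustration.

```python
# Sketch of the "last few visits" rule: recommend the category that
# dominates the visitor's most recent sessions. Window size is an
# illustrative assumption.
from collections import Counter

def dominant_recent_category(visits, window=3):
    """visits: list of per-visit category lists, oldest first.
    Returns the most-browsed category within the last `window`
    visits, or None if there is no history."""
    recent = [c for visit in visits[-window:] for c in visit]
    if not recent:
        return None
    return Counter(recent).most_common(1)[0][0]

visits = [["pants"], ["sweaters", "sweaters"], ["sweaters", "shoes"]]
print(dominant_recent_category(visits))  # sweaters
```

A visitor who browsed sweaters across their last few sessions would thus be shown sweater-centered recommendations and content, exactly the behavior the quote describes.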

