
Online Store Design: Lessons Learned

The Internet truly is a direct marketer's dream. It promises an infinitely flexible creative palette, truncated response curves and a plethora of information and statistics.

It often seems that all you need to do is put up an online store, generate some traffic, watch how visitors react and you'll be well on your way to understanding how to build a killer site.

Unfortunately, reality isn't quite so simple.

On the Web, more than ever, details matter. With the appropriate preparation, the dream can be realized. Successful online stores are able to understand how customers and prospects interact and continually refine their features, functions, design and layout to enhance the customer experience. But, as many less successful stores have discovered, it's not easy. Here are some of the lessons I have learned:

Design Your Site for Analysis. Each Web site has its own peculiarities, driven by the platform it operates on, the approach the site's programmers take, and the specific nature of its business. Because of this, if you build your online store without considering the type of information you'll want to track, or if you evolve the site in a haphazard fashion, odds are you'll be frustrated in your efforts. By the time you patch things up, you'll have lost several very valuable months and a lot of irreplaceable information.

Issues range from how you code the links within your site to the type (or types) of cookies you issue and when or how they expire. And, believe me, the interests of developers often conflict directly with the interests of the marketing department. Only with thoughtful design up front will you be able to identify and resolve these issues and ensure that you can, in fact, take advantage of the Web's information-rich environment.
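To make the cookie question concrete, here is a minimal Python sketch, my own illustration rather than anything from a particular platform, of two decisions best settled before launch: whether a visitor cookie is persistent or session-only, and how internal links get tagged with a source code. The function names and the "src" parameter are hypothetical.

    from http.cookies import SimpleCookie
    from urllib.parse import urlencode
    import uuid

    def issue_visitor_cookie(persistent=True):
        """Issue a visitor-tracking cookie; its lifetime is a marketing decision."""
        cookie = SimpleCookie()
        cookie["visitor_id"] = uuid.uuid4().hex
        cookie["visitor_id"]["path"] = "/"
        if persistent:
            # Persistent: survives browser restarts, so returning visitors are recognized.
            cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 365  # one year
        # Without max-age this is a session cookie and vanishes when the browser closes.
        return cookie

    def tag_link(url, source):
        """Append a source code to an internal link (assumes no existing query string)."""
        return url + "?" + urlencode({"src": source})

    print(issue_visitor_cookie().output())               # the Set-Cookie header to send
    print(tag_link("/store/math-games", "home_banner"))  # /store/math-games?src=home_banner

A session cookie and a one-year cookie will produce very different visitor counts for the same traffic, which is exactly why the choice belongs to marketing, not just to the developers.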

Enforce the Discipline of Testing. No direct marketer I know would run a test without clear objectives and a control against which the test can be measured. Yet online stores are constantly being designed and redesigned without a reliable method for understanding the impact of a change.

The downside, of course, is that sitting down to specify what the next design iteration is trying to accomplish, and how it will be measured, takes time. And most of us on the Web are a bit impatient. The reality is that if you don't do this, you'll look back later and wonder how you ever got to where you are and whether it was all really worth it.

Happily, testing principles on the Web are the same as they are in the direct mail world. But your feedback can be more immediate, and it is easier to isolate selected variables so that you can test several things simultaneously. In any given week, you may be testing five different home-page offers, three different creative approaches, five different “landing” areas on your site, and the varied navigational and order patterns of 20 different traffic sources.
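As an illustration of how several tests can run at once without interfering, here is a minimal Python sketch, entirely my own, of deterministic cell assignment: each visitor is hashed into one variant per test factor, so the same visitor always sees the same combination and each cell can be read cleanly. The factor and variant names are placeholders.

    import hashlib

    # Three concurrent test factors (variant lists are placeholders).
    FACTORS = {
        "home_offer": ["offer_a", "offer_b", "offer_c", "offer_d", "offer_e"],
        "creative": ["c1", "c2", "c3"],
        "landing": ["l1", "l2", "l3", "l4", "l5"],
    }

    def assign_cells(visitor_id):
        """Deterministically map a visitor to one variant per factor."""
        cells = {}
        for factor, variants in FACTORS.items():
            # Hash visitor+factor: stable for the same visitor, independent across factors.
            digest = hashlib.sha256(f"{visitor_id}:{factor}".encode()).hexdigest()
            cells[factor] = variants[int(digest, 16) % len(variants)]
        return cells

    print(assign_cells("visitor-0001"))
    print(assign_cells("visitor-0001"))  # identical: the assignment is stable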

The important takeaway is to design your tests with a specific learning goal in mind and to have the analytical tools in place to get the data you need. In a short amount of time, you can learn a lot.

Focus on the Design Metrics That Matter. It is important to care about a number of high-level statistics related to site design and to track them faithfully. Put them on the agenda at every board meeting, or even up on a wall in your company, so every employee plays a part in contributing to and acknowledging improvements (or lack thereof).

I think it was Dr. W. Edwards Deming, the well-known business management guru, who noted years ago that if you track something, it will improve. On the Web, it's no different. Which metrics are most important? Some examples include: conversion rate (the ratio of orders to visitors), stickiness (page views per visitor, and the ratio of “one-page” visits to total visits), shopping cart abandonment (the percentage of folks who place items in the cart but never complete an order), and the rate at which folks personalize your site. Of course, what you care about will likely be different. Just make sure you decide and start tracking!
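For concreteness, here is a minimal Python sketch, with hypothetical field names and made-up sample tallies, of the metrics just described:

    def site_metrics(visits, one_page_visits, page_views, carts_started, orders):
        """Compute the high-level design metrics from raw site tallies."""
        return {
            "conversion_rate": orders / visits,             # orders per visitor
            "pages_per_visit": page_views / visits,         # stickiness
            "one_page_share": one_page_visits / visits,     # stickiness (bounce share)
            "cart_abandonment": 1 - orders / carts_started  # carts that never become orders
        }

    # A hypothetical week of traffic: 2% conversion, 5.2 pages per visit,
    # 38% one-page visits, roughly 78% cart abandonment.
    print(site_metrics(visits=1000, one_page_visits=380, page_views=5200,
                       carts_started=90, orders=20))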

Benchmarks aren't Benchmarks. Finally, don't let your design and your ongoing efforts to improve be driven by a fruitless effort to compare your site's performance to others. An online store is a complex, dynamic and unique animal, and comparisons are all too frequently misleading and irrelevant.

For example, let's talk for a moment about conversion rates. Conversion rates are a function of the visitors to your site and the orders they place, and they are important to online marketers. If I get 1,000 visitors and 20 of them order, my conversion rate is 2 percent. Wouldn't it be great to know whether 2 percent is better or worse than what others achieve?

Well, simply comparing one site's visitors to another's is problematic. Visitor counts for a given site, arrived at through different forms of measurement such as Web user panels or log-file analysis tools, often vary by 50 percent or more. In our example, if I want to compare my site to others, I don't know whether the “equivalent” visitor number is 500 or 1,500. And, therefore, I don't know whether I really should be comparing myself to a 4 percent conversion rate or a 1.3 percent conversion rate.
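A quick worked illustration of that arithmetic: with the same 20 orders, the site looks very different depending on which visitor count you believe.

    orders = 20
    for visitors in (500, 1000, 1500):
        print(f"{visitors:>5} visitors -> {orders / visitors:.1%} conversion")
    #   500 visitors -> 4.0% conversion
    #  1000 visitors -> 2.0% conversion
    #  1500 visitors -> 1.3% conversion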

It makes a big difference. So measure what's important to you, measure it consistently, and realize improvements on your own terms. Don't get overly distracted by what others are saying, unless you really understand what's behind their numbers.

Al Noyes is senior vice president of sales and marketing at SmarterKids.com, Needham, MA, an online store selling educational products. His e-mail address is [email protected].
