
10 Tantalizing Tips for Better Testing

Mark Twain once said, “I have never let my schooling interfere with my education.” In other words, there are two kinds of knowledge: book learnin’ and real-world learnin’. And after years on the front lines of direct marketing, learning by doing and more often than not beating the pants off people with impressive marketing degrees, I can tell you without doubt that real-world learnin’ is a lot more valuable.

This is especially true of testing. You can read every marketing book in the library and still not have a solid grasp of testing until you actually do it. Maybe that's because too many marketing textbooks are written by professors who have never sold a single widget, gadget or thingamabob. Or maybe it's because textbook concepts just don't sink in until your job is on the line. In any case, here are a few tips for running better tests, based on my years in the trenches:

· Think before you test. I know you’re bursting with excitement to begin your test, but slow down. Good testing starts with careful thinking. Before you test, ask yourself lots of questions: Why am I testing? What are my objectives? What do I hope to learn? What will I measure? What are the variables? How will I design the test? What is my budget for testing? You’ll want to skip this step, but don’t.

· Test what matters. For a new DM program, focus on big issues such as the product itself, lists or media, marketing strategy, prices and offers. For an established program, you may want to test offer enhancements, formats, creative execution and premiums. Don't get caught up in minutiae. Testing the tilt of a stamp may sound cool, but it's pointless for most marketers.

· Don’t blindly copy competitors. There’s nothing wrong with borrowing ideas. But be careful. Most companies don’t test as carefully as they should (Oh, the stories I could tell you!). Just because you see a direct mail piece, ad or broadcast spot frequently doesn’t mean it’s a winner. Be particularly cautious about copying the ads of big companies whose main income is generated by means other than direct marketing.

· Start with simple creative. How do you test a list before you’ve tested a direct mail package? Good question. And how do you test a direct mail package before you’ve tested a list? When you start out, that’s the fix you’re in. Here’s the advice I give to novice clients: Create a basic mailer to test the basics. This means you start with an approach that is as straightforward and simple as possible, using proven formats and techniques and nothing too far out. This lets you confidently test lists, media, offers, prices, product configurations and other big issues. Later, you can do creative testing to boost results.

· Always be testing. When is the best time to test? Yesterday. Today. Tomorrow. Always. Test something with every mailing, every print ad, every broadcast spot, every e-mail. Otherwise, you waste valuable opportunities to learn and thus cut costs and boost profit.

· Test one element at a time. Sure, you can save money by testing lots of stuff at once. But you won't learn anything, because when you change more than one element, you can't tell which change made the difference in results. If you're testing price, change only the price. If you're testing an insert, don't add anything else. Of course, you can add more "cells" to test other elements simultaneously. But each cell must test only one thing (there's a quick sketch of this cell setup after the list).

· Track results carefully. This is the boring part for many people, but it's vital. Keep detailed reports on the number of pieces mailed, the number of responses, the response source, your response rate, the conversion percentage, the income those responses generate, the average order, your income per thousand, your cost per order or cost per response, net profit, returns, bad debt and every other fact you need to calculate how your promotions perform. Over time, this will be a gold mine of information. (The arithmetic behind these yardsticks is sketched after the list.)

· Analyze results in writing. I know it sounds like a school assignment, but it’s not that bad if you tracked results. You should have an official report for every mailing, ad and broadcast spot. It should contain a description of the test, the purpose of the test, the components of your package or the elements of your ad or spot, your mailing or media plan, statistical data, complete results, a numerical and verbal analysis and the action taken or advised as a result of the test.

· Look for the “why” in every test. If a lift note boosts response, don’t settle for “Well, lift notes seem to increase response, so we’ll use lift notes in all our packages.” People don’t respond to technique per se. They respond because a technique does something to persuade them. Why does this particular lift note work? What does it say? What objections does it meet? What function does it serve? When you know the why or can make a good guess, you can apply that knowledge intelligently.

· Retest to confirm results. An initial test can be a fluke or downright misleading. Always retest, especially when you get a positive result or a significant change in results. It's tempting to instantly toss anything that fails and adopt anything that wins. But it pays to be patient. Test again. See whether you get similar results before making big, expensive decisions. (The last sketch after this list shows why small test cells are so prone to flukes.)
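
For readers who plan their cells in a spreadsheet or script, here is a minimal sketch in Python of the one-variable-per-cell rule. The package elements, prices and cell names are hypothetical; the point is simply that every cell overrides exactly one element of the control, and the list is split into equal-sized cells.

import random

# Hypothetical control package; each test cell changes exactly ONE element from it.
CONTROL = {"price": 29.95, "insert": None, "envelope": "standard"}
CELLS = {
    "A_control":  {},                        # unchanged control
    "B_price":    {"price": 24.95},          # price test only
    "C_insert":   {"insert": "buckslip"},    # insert test only
    "D_envelope": {"envelope": "oversized"}, # format test only
}

def assign_cells(mailing_list, cell_names, seed=1):
    """Shuffle the list, then deal names round-robin so the cells come out equal-sized."""
    rng = random.Random(seed)
    names = list(mailing_list)
    rng.shuffle(names)
    return {name: cell_names[i % len(cell_names)] for i, name in enumerate(names)}

assignments = assign_cells([f"name_{i}" for i in range(12)], list(CELLS))
for name, cell in sorted(assignments.items()):
    print(name, "->", cell, {**CONTROL, **CELLS[cell]})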
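
In the same spirit, here is a small sketch of the arithmetic behind the tracking report in the tip above. The sample figures are invented, and net profit here nets out only the promotion cost; a full report would also subtract product cost, returns and bad debt, as the tip says.

def summarize_test(pieces_mailed, responses, orders, gross_income, promotion_cost):
    """Standard direct mail yardsticks from raw counts (sample figures only)."""
    return {
        "response rate %":     round(100 * responses / pieces_mailed, 2),
        "conversion %":        round(100 * orders / responses, 2),
        "average order $":     round(gross_income / orders, 2),
        "income per M $":      round(1000 * gross_income / pieces_mailed, 2),
        "cost per response $": round(promotion_cost / responses, 2),
        "cost per order $":    round(promotion_cost / orders, 2),
        # Net of promotion cost only; a full report also subtracts product cost,
        # returns and bad debt.
        "net profit $":        round(gross_income - promotion_cost, 2),
    }

# Hypothetical mailing: 50,000 pieces, 1,100 responses, 650 paid orders.
for name, value in summarize_test(50_000, 1_100, 650,
                                  gross_income=42_250.0,
                                  promotion_cost=27_500.0).items():
    print(f"{name}: {value}")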
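
Finally, on the last tip: one reason an initial result can be a fluke is plain sampling noise, especially in small cells. The figures below are hypothetical, and the normal-approximation interval is my own shorthand rather than anything official, but it shows how wide the uncertainty on a response rate can be before a retest.

import math

def response_rate_interval(pieces_mailed, responses, z=1.96):
    """Rough 95% confidence interval for a response rate (normal approximation)."""
    p = responses / pieces_mailed
    margin = z * math.sqrt(p * (1 - p) / pieces_mailed)
    return p - margin, p + margin

# Hypothetical cells: the small cell's 2.40% "win" still overlaps the control.
for label, mailed, resp in [("control, 20,000 pieces", 20_000, 400),
                            ("test cell, 2,000 pieces", 2_000, 48)]:
    low, high = response_rate_interval(mailed, resp)
    print(f"{label}: {resp / mailed:.2%}, roughly {low:.2%} to {high:.2%}")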

One more thing: a pet peeve of mine. Trust your test results rather than focus groups. I can't count the number of times a focus group has told me they hated my direct mail package or ad, and then the same piece pulled fabulous results in a real test. Focus groups are great for many things, but they're lousy at predicting response. What people say and what they do are seldom the same.

Remember, don’t let your schooling interfere with your education.
