Do You Make These 8 Testing Mistakes?

One sunny morning in 1939, Albert Einstein got up, grabbed his papers and walked to his office. As a Nobel Prize winner, he was used to people eyeballing him in public, but this morning people seemed to stare more than usual.


He had solved some of the deepest mysteries of relativity, gravitation and quantum mechanics, but this world-renowned genius hadn't noticed that he'd left the house without his pants. It just goes to show that smart people can do dumb things.


So you shouldn't think that just because you're smart, you're immune to making mistakes in your testing. Here are eight mistakes that I see all the time:


* Testing haphazardly or running sloppy tests. Testing is a discipline. You have to test continually and carefully. Otherwise, the numbers just won't mean anything. If you don't have the skill or patience for ongoing tests, number crunching and analysis, get someone else involved.


* Assuming that your tests are error-free. You should actively seek out mistakes on every level. Whether a given test turns out well or badly (but especially if it's bad), think through the whole process to track down errors. Were the addresses good? Were the return address and bar code correct? Are phone operators logging every response properly? Did your ad actually run? Are e-mails being dumped as spam? What else might be going wrong?


* Drawing the wrong conclusions. Too often, people look at test data and make a snap judgment: "That self-mailer bombed. Self-mailers don't work" or "We tested a Christmas appeal, and we lost money. Christmas is a bad time for promotions." This is usually the result of impatience combined with a poorly designed test. Ideally, you should test with the express purpose of measuring one variable. And you must test against a proven control. Otherwise, you may conclude that a particular variable affects results when it doesn't.


* Making decisions based on insignificant results. Every test must be statistically valid: you must reach enough of your audience to get a representative sample, and receive enough responses to calculate your results with confidence. When you fall below certain minimums, your results are worthless. Testing is expensive, but you can't cut corners.
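If you want a rough rule of thumb for that validity check, the standard two-proportion z-test can tell you whether the difference between a test panel and a control panel is likely to be real rather than noise. This is a minimal sketch, not a substitute for proper test design, and the function name and figures below are illustrative, not from any particular campaign:

```python
import math

def response_rate_significant(resp_a, sent_a, resp_b, sent_b, z_crit=1.96):
    """Two-proportion z-test (normal approximation) for response rates.

    Returns True when the difference between the two panels' response
    rates is statistically significant at roughly the 95% level.
    """
    p_a = resp_a / sent_a
    p_b = resp_b / sent_b
    # Pooled response rate under the null hypothesis of no difference.
    pooled = (resp_a + resp_b) / (sent_a + sent_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = abs(p_a - p_b) / se
    return z >= z_crit

# 2.0% vs. 2.3% on matched 10,000-piece panels: looks like a winner,
# but the lift is not significant at this volume.
print(response_rate_significant(200, 10_000, 230, 10_000))  # → False
```

Note that a 0.3-point lift that feels decisive on paper fails the test here; this is exactly the "certain minimums" problem the bullet describes.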


* Overlooking important results. Numbers are just data. You only have information once you analyze your numbers and draw conclusions. That's why it is important to do more than just list your results; you should play around with them. See what might be hiding in all those digits. Is there a trend? Are your results seasonal? How do your results compare with industry averages?


* Refusing to repeat tests to confirm results. An accurate test is repeatable. In other words, if your results are true, you'll be able to test again and get similar results. If you don't, there's something wrong somewhere. You should always confirm a dramatic change in results, good or bad, with another test.
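One common statistical rule of thumb for the "repeatable" check (an assumption on my part, not something the article prescribes) is to put an approximate confidence interval around your original response rate and treat a confirmation test that lands outside it as a red flag:

```python
import math

def rate_ci(responses, sent, z=1.96):
    # Approximate 95% confidence interval for a response rate,
    # using the normal approximation to the binomial.
    p = responses / sent
    half = z * math.sqrt(p * (1 - p) / sent)
    return (p - half, p + half)

# Original test: 200 responses on 10,000 pieces (2.0%).
low, high = rate_ci(200, 10_000)
# The band is roughly 1.7% to 2.3%. A confirmation test that comes
# back at, say, 3.5% falls well outside it -- investigate before
# trusting either number.
```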


* Filing away results instead of using them. Why test if you just calculate a response rate and throw your report into a filing cabinet? Those numbers are expensive to get, so use them. Analyze every test quantitatively and qualitatively. Write up the numbers along with your thoughts and conclusions. Then share the test data with everyone involved. After every test you should know something useful: "This two-page letter works just as well as this four-page letter" or "This offer increased inquiries from our TV spot by 35 percent."


* Failing to keep a running record of conclusions. Over time, as you see the results of test after test, you will start to see patterns emerge from the numbers. This is the gold you're panning for. Organize and list this information as a guideline for future efforts. Building on your hard-won knowledge will dramatically increase your success rate.


So, do you make these mistakes?


Don't feel too bad if you do. I've never met anyone who didn't make at least one of these mistakes at one time or another. Just remember that while so much of what we do is craft or art, testing needs to be science. If you get too preoccupied with the creative part of your job, you may get caught without your pants.


For more ways to avoid mistakes, go to www.DirectCreative.com.

