Maximizing Direct Mail Results
A little forethought can make the difference between success and failure in financial direct marketing programs. The formula includes an overall direct marketing plan, a testing strategy, the persistence and discipline to stick to the program, and the right vendor partner.
These tips can help financial marketers enhance the performance of direct mail programs.
Develop a direct mail plan for each program. Whether you are marketing home equity, investments, credit cards, insurance or certificates of deposit, every program needs a well-thought-out direct marketing plan. This plan provides the foundation for execution and evaluation. These topics are normally addressed:
• Program objectives: These should be clearly identified and measurable. Items measured may include number of accounts, loan or deposit dollars, cost per account and return on investment.
• Key selling proposition: Before developing a direct mail piece, the marketer should evaluate the product's competitiveness relative to other offers in the market. Also, evaluate the offer from the mail recipient's perspective. From this exercise, the key selling proposition should be identified.
• Testing strategies: Assuming the mailings are small (fewer than 200,000 pieces per mail period), testing should focus on variables that can deliver the greatest return on investment. They include, in order of importance, audience, offer and creative. The goal is to get the right offer to the right audience with the right message. Additionally, hold out a no-mail population from each mailing. This holdout will be needed to accurately assess overall program effectiveness - especially when targeting existing customers with product cross-sell initiatives.
• Audience selection: These decisions are among the most important in ensuring success. There are a variety of ways to define the audience, including demographics, geographic location, mortgage variables, net assets, credit usage and number of times mailed. For existing customers, variables also include household tenure, product ownership, number of products owned and relationship value. From this information, the audience can be profiled using either univariate or multivariate techniques, such as profile or response modeling.
• Offer positioning: This should tie directly with the key selling proposition. Also, relevant product features and benefits should be outlined.
• Creative: Effective creative is concept- and copy-driven. A rule of thumb to test a package's effectiveness is to read only the headline, Johnson box, call-outs and postscript. These areas should address the key selling proposition, features and benefits. There also should be a call-to-action that clearly outlines how to respond and creates urgency.
Before mailing, try to respond using the devices provided - fill in the mail piece, call the toll-free number, access the Internet, etc. Finally, ensure the package fully leverages and adheres to the company's branding guidelines.
• Analysis: Inspect what you expect. Before mailing, determine how response will be captured. Will response be tracked through the customer database or through a product application system? How will names be matched against the mail file - by name and address, source coding or a household match key? How long should the program's tracking window extend? What variables should be tracked?
• Media: If other media are available to support the program (such as print, telemarketing or the Internet), identify which will be used and when each will run.
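The no-mail holdout and response-tracking points above can be made concrete with simple arithmetic: the holdout's response rate is the baseline that would have occurred without the mailing, and the difference from the mailed group's rate is the incremental effect. A minimal sketch in Python; all group sizes and responder counts below are hypothetical illustrations, not figures from the article:

```python
# Hypothetical example: measuring incremental response against a no-mail holdout.
# All counts below are illustrative assumptions.

mailed_size = 100_000        # customers who received the mail piece
mailed_responders = 1_800    # accounts opened in the mailed group

holdout_size = 10_000        # no-mail holdout, selected at random
holdout_responders = 120     # accounts opened anyway, without any mailing

mailed_rate = mailed_responders / mailed_size      # gross response rate
baseline_rate = holdout_responders / holdout_size  # baseline rate without mail

# Incremental response: the lift actually attributable to the mailing.
incremental_rate = mailed_rate - baseline_rate
incremental_accounts = incremental_rate * mailed_size

print(f"Gross response:       {mailed_rate:.2%}")
print(f"Baseline (holdout):   {baseline_rate:.2%}")
print(f"Incremental response: {incremental_rate:.2%}")
print(f"Incremental accounts: {incremental_accounts:.0f}")
```

This is why the holdout matters most for cross-sell programs: existing customers often buy additional products anyway, so the gross response rate alone overstates the mailing's true effect.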
Think through the testing strategy. For financial services marketers, part of a program's effectiveness can be attributed to the proper combination of audience and offer. These two variables should make up the majority of the testing early in the program. After these two variables are fine-tuned, creative tests should be executed. These tests should include significant changes, such as the inclusion of a new component, testing of an interactive device or change in format. Assuming the goal of creative testing is to reduce the cost per account booked, this can be accomplished by testing new creative formats that reduce the cost per piece or increase response.
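The cost-per-account goal of creative testing reduces to simple arithmetic: cost per account booked equals cost per piece divided by the booked-response rate, so a new format wins either by cutting piece cost faster than it loses response or by lifting response faster than it adds cost. A sketch with hypothetical figures:

```python
# Hypothetical numbers illustrating the creative-testing tradeoff:
# cost_per_account = cost_per_piece / booked_response_rate

def cost_per_account(cost_per_piece: float, booked_rate: float) -> float:
    """Mail cost per piece divided by the rate at which pieces become booked accounts."""
    return cost_per_piece / booked_rate

control = cost_per_account(0.55, 0.010)   # $0.55/piece, 1.0% booked
cheaper = cost_per_account(0.40, 0.009)   # cheaper format, slightly lower response
richer  = cost_per_account(0.75, 0.016)   # richer package, higher response

print(f"Control: ${control:.2f}  Cheaper format: ${cheaper:.2f}  Richer package: ${richer:.2f}")
```

In this illustration both challengers beat the control, each by a different route, which is exactly why format tests should be evaluated on cost per account rather than on response rate or piece cost alone.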
When developing a testing plan, it is important to isolate the variable being tested. For example, if you test a new offer, the creative and audience selection should remain unchanged, with minimal changes in the creative to support the offer test. The ultimate goal of testing is to identify the combination of audience, offer and creative that provides the optimal response. Once found, this combination becomes the control package from which other testing results are compared.
Testing can be expensive, so ensure each test has a sufficient sample size to produce statistically significant results. A common mistake among financial marketers is running too many test cells with samples so small that the results are not statistically meaningful.
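One conventional way to size test cells is the standard two-proportion sample-size formula. A sketch using typical z-values for 95% two-sided confidence and 80% power; the response rates are hypothetical, and the z-values are hard-coded assumptions rather than looked up from a statistics library:

```python
import math

def cell_size_needed(p1: float, p2: float,
                     z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Per-cell sample size to detect a difference between response rates p1 and p2.

    Uses the standard two-proportion formula with z_alpha for a two-sided 95%
    confidence level and z_beta for 80% power (both assumed defaults here).
    """
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Detecting a lift from a 1.0% control response to 1.2% in a test cell:
print(cell_size_needed(0.010, 0.012))
```

The result runs to tens of thousands of pieces per cell, which illustrates the point: at typical direct mail response rates, small differences require large cells, and splitting a mailing into many tiny cells guarantees inconclusive results.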
Methodically work the plan - patience is a prerequisite. One frequent question is: When do you stop testing? Never. Enhancing programs is an evolutionary process requiring time, perseverance, continuity and tenacity. As long as sufficient mailing quantities are available for testing, the direct marketer should use every opportunity to enhance program effectiveness.
Another challenge, especially for programs where response rates tend to be low, such as marketing home equity products to prospects, is sticking with the program long enough to read the results. Because response rates are much lower in this group, it may take two or three mailings before test results can be determined and response models can be built.
Here is an example of a home equity marketer that targets about 1 million prospects annually. This marketer mails four programs per year, feathered weekly. Because of the lower response rates associated with prospecting, the time required to book a home equity loan and the weekly volumes, about six months of mailing were required before response models could be built and test results determined. The following table reflects the testing executed during each mail period and the overall effect on response, net bookings and cost per account.
Evaluate vendor relationships based on value added. In selecting and evaluating vendor-partner relationships, the marketer should evaluate his needs based on these factors:
• How much experience exists in-house with marketing these types of products through direct mail?
• What services are available in-house, and what should be outsourced? Specifically, who will be responsible for creative design, production management and analytics?
• Are people available in-house to manage the daily execution?
Normally, no single vendor partner will meet every need for every program. These questions should help determine the type of relationship required. For example, if your institution has extensive experience marketing the product through direct mail and has production management staff, analysts and statisticians on-site, then dealing directly with a production vendor may be the best choice. In this case, because the vendor is not providing strategic, analytic or creative expertise, the best measures for choosing a company are its reputation for quality and its cost per piece.
If your institution has limited internal resources and experience in marketing the product, and the program requires strategic input, analytics, hands-on production management and creative development, a better fit might be a full-service, database marketing direct response vendor. It is important to recognize that these vendors often act as intermediaries with the production vendor, so they will not be competitive with production vendors on a cost-per-piece basis. In this case, evaluate candidates on quality and on experience with the product and industry, as well as on results such as cost per account, since that metric reflects the value added - i.e., strategy, analytics and creative as well as production.