Integrated reporting is the thing
In today's world of merging and converging marketing channels, it is more important than ever for communications strategists, media staffers and agency managers to be clear about the methodologies of integrated reporting.
All marketing programs are part of an ongoing, evolving information stream. Whenever possible, reporting must be designed to deliver as much insight as possible to improve marketing operations, increase media efficiency and contribute to business intelligence overall.
When communications programs meet technology, precision becomes the linchpin of success. More insightful reporting means more complexity, more set-up time and employees with deeper technical knowledge. Exactitude in program set-up, data passing, report formatting, and the documenting and indexing of findings all play a role. Spreadsheet functions like macros, VLOOKUP and pivot tables are now well-established tools across communications-related departments.
The rapid growth of the Internet is also causing mid- and senior-level marketing, communications and media managers to be more involved with the methods and techniques of interface and applications design, database manipulation, server operation and networking. More advanced "pure" database functions are finding their way onto non-technical supervisors' desktops every day.
Without well-established procedures, tracking activity across platforms, applications and ad units can become an arduous, time-consuming, error-prone exercise. Basing strategic decisions on extrapolated conclusions, unverifiable assumptions and gross estimations is, at best, frustrating. At worst, it results in a costly waste of manpower and valuable resources.
It's the responsibility of department directors and vice presidents to assure that higher-level reporting meets with cross-departmental requirements. They must also assure that staffing, tools development and training are adequate to meet increasing information-tracking demands, in an environment of evolving metrics and processes.
An online campaign tracking profile
The day-to-day burden of meeting the increasingly complex demand for more, and more accurate, data falls to line managers. A growing variety of technical specifications need to be implemented to assure that project results are accurately tabulated and clearly conveyed.
The following example illustrates just how complicated things can get.
Pharma example
A prescription cosmetic drug has a user base of women between the ages of 25 and 44. The DTC brand manager wants to know whether there is greater receptivity and/or appeal for the product among the 25-to-34 age group vs. the 35-to-44 age group.
The in-house media department or ad agency purchases online inventory against the designated groups, in the specified geographic areas.
Three creative treatments are used to test messaging in 1) "traditional," click-only Flash banners, 2) expandable interactive rich media units, and 3) live in-banner surveys. The three banner types are each delivered by a different ad serving company. Each vendor will generate a report in a different format, and all of them then have to be reconciled.
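Reconciling three differently formatted vendor reports usually begins with mapping each ad server's column names onto one shared schema. Here is a minimal sketch of that step; the vendor names and column labels are invented for illustration, not taken from any real ad server.

```python
# Hypothetical column mappings: each ad server labels the same metrics
# differently, so we translate every report into one shared schema.
COLUMN_MAPS = {
    "flash_server": {"Creative": "unit", "Imps": "impressions", "Clicks": "clicks"},
    "rich_server": {"Ad Name": "unit", "Impressions": "impressions", "Total Clicks": "clicks"},
    "survey_server": {"unit_id": "unit", "delivered": "impressions", "responses": "clicks"},
}

def normalize(vendor, rows):
    """Rename a vendor's report columns to the shared schema, dropping
    any columns the schema does not track, so the three reports can be
    merged into a single table."""
    mapping = COLUMN_MAPS[vendor]
    return [{mapping[k]: v for k, v in row.items() if k in mapping} for row in rows]
```

Once all three feeds are normalized this way, they can be concatenated and analyzed as one data set instead of three.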
The company's analytics department has determined that the ten DMAs (designated marketing areas) need to be divided into "best, middle and low performing" categories. Web site page visitation that is driven from this campaign also must be weighted to bring the results in line with sales forecasting assumptions.
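The best/middle/low split can be sketched as a simple ranking exercise. This is illustrative only: ranking by a single performance score, and the way ties in band size are handled, are assumptions rather than the analytics department's actual method.

```python
def categorize_dmas(scores, bands=("best", "middle", "low")):
    """Rank DMAs by performance score (highest first) and assign each
    to a band. With 10 DMAs and 3 bands the split is 4/4/2, since the
    band size is rounded up."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    per_band = -(-len(ranked) // len(bands))  # ceiling division
    return {dma: bands[i // per_band] for i, dma in enumerate(ranked)}
```

The same dictionary of band labels can then drive the weighting of site-visit data, e.g. by applying a different multiplier per band.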
In order to optimize results, performance in this six-week effort has to be checked every few days, to assure balanced delivery. Reporting will be bimonthly.
Trackable traffic sheets
Accuracy begins with the insertion orders. They must clearly indicate how many impressions of each type (the Flash, rich media and survey units) are purchased for each age group, in each geographical area. Separate traffic sheets have to be prepared for each of the ad servers. Reflecting the IOs (insertion orders), each must indicate the six sites in the plan, as well as the DMAs and target groups. The specific placement(s), budget allocation, impression count, creative sizes, rotation, alt text, default graphic and each banner's click-through URL(s) must also be indicated. The post-click 1x1 pixel tracking tags are already in place on the Web pages to be tracked. If all the ad units connect to a single Web page, then the traffic sheet is relatively easy to prepare. If, however, different elements drive to different Web site pages, preparing the traffic sheet can be a two-day job in and of itself.
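A traffic sheet can be modeled as one record per placement, with a column for each item listed above. The field names below mirror that list, but the values, file layout and site/DMA codes are hypothetical.

```python
import csv
import io

# One column per item a traffic sheet must carry (per the list above).
FIELDS = ["site", "dma", "age_group", "placement", "budget",
          "impressions", "size", "rotation_pct", "alt_text",
          "default_graphic", "click_url"]

# A single hypothetical placement row; real sheets would have one row
# per site x DMA x age group x creative size combination.
row = {
    "site": "siteA", "dma": "stl", "age_group": "25-34",
    "placement": "homepage_top", "budget": 5000, "impressions": 250000,
    "size": "728x90", "rotation_pct": 50, "alt_text": "Brand message",
    "default_graphic": "default_728.gif", "click_url": "https://example.com/lp1",
}

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerow(row)
sheet_csv = buf.getvalue()
```

Keeping the sheet in a fixed machine-readable layout like this is what makes the later reconciliation and data-point counts tractable.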
The Flash banners are easy. There are three creative executions in three sizes across six sites, yielding a 54-data-point set.
Each of the three rich media banners has two expandable panels, for a total of six click tags per banner. There are three creative executions, in three sizes, across six sites. This means a 324-data-point set for rich media banners.
The survey banners have three question areas with two answers each - six data points in all. There are also three creative executions running across six sites, but only one size, yielding a 108-data-point set. Across the Flash, rich media and survey ad units, there are 486 data points.
Each of these data sets has to be delivered by two age groups across ten DMAs. This means 486 elements, times two (for the 25-to-34 and 35-to-44 age groups), making 972 rows of data. Delivering those against the ten individual DMAs makes a total of 9,720 pieces of data. This is before Web site page visitation enters the picture.
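The arithmetic above can be double-checked in a few lines of Python. The unit counts come straight from the example; only the variable names are invented.

```python
SITES = 6
EXECUTIONS = 3

# Data points per ad type: executions x sizes x sites x points-per-unit
flash = EXECUTIONS * 3 * SITES * 1   # 3 sizes, 1 click point each
rich = EXECUTIONS * 3 * SITES * 6    # 3 sizes, 6 click tags each
survey = EXECUTIONS * 1 * SITES * 6  # 1 size, 6 answer points each

total = flash + rich + survey  # across all three ad types
by_age = total * 2             # two age groups
by_dma = by_age * 10           # ten DMAs
```

Running this confirms the totals in the text: 54, 324 and 108 data points per ad type, 486 overall, 972 by age group, and 9,720 once the DMAs are factored in.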
The naming convention
It's vital to develop a clear naming convention for the ad units to assure easy identification across reports. If one vendor uses the creative name, another uses the campaign name and a third some internally generated numerical coding system, a two-hour analysis job can become a two-day one. Visually matching similar seven-digit codes to specific creative elements across dozens (or hundreds) of ad unit elements is a maddening exercise.
Assuring good reports
Start at the end. A detailed pro forma reporting document, prepared at the beginning of an effort, helps to focus a team's attention. Diagram exactly what is involved when implementing custom applications, coding, link-tagging, structuring data formats and the related tasks that assure clean and clear data. Following is a checklist to help guide the reporting process.
- What departments are involved? What data is being gathered by each discipline?
- Are any custom applications needed? Who is responsible for specifying, developing, checking, implementing and administering them?
- How is each custom application tracked? Where does its data reside? Who will have access to it? What permissions need to be granted, and by whom?
- How many systems are involved and need to be queried? To what format does each system output? What do the reports look like?
- After the download, what types of data manipulation are required for merging, purging and findings comparison inside each department, and interdepartmentally?
- Who will be gathering, cleaning and formatting data at each step for each group, and interdepartmentally? Who are the back-up staffers?
- Are all contributors aware of each other's needs and expectations? Is one person's work dependent on someone else first completing an element of work?
- How long is each component of work expected to take? Has everyone signed off on the timetable(s)? What is the check and review process?
Who will be drawing key conclusions? This may be obvious, but when big picture analysis applies, things change. Great performance for an individual aspect of a campaign or marketing program may, in fact, prove counter-productive, overall.
For example, getting the absolute lowest price for a statewide buy of fringe television time in Missouri is generally a good budget-saving tactic. However, no Web site visits are ever initiated from those commercials. So the negotiating time and effort spent to get those great prices is largely wasted when impact on site visitation is a key metric, because the Web site is the lowest-cost conversion-to-sale vehicle.
In the event of a conflict, for example, what should we be reporting? What is the mediation process, and who is ultimately the deciding party?
Good decisions are based on good information, which has its basis in clearly defined data delivered in a timely manner. Increasingly powerful intelligence-gathering and data-manipulation techniques come with an increase in the complexity of data-gathering processes.
It is absolutely imperative that campaign elements, their inter-relationships, place in the reporting sequence, desired output form, analysis requirements and delivery format all be clearly understood as soon as possible in the planning process. Ideally, reporting should be the first consideration, with everything tracking back from there. Otherwise, you'll never really know what you might get.
Joseph Serino is an independent digital marketing consultant.