Optimization for the New Millennium
To gain some perspective, let's review how data mining changed the face of the credit-card marketing landscape over the past decade or so and preview how optimization will reshape it again.
The history. Moore's Law - the observation that the number of transistors on a microchip, and hence computing capacity, doubles roughly every 18 to 24 months - was certainly a key driver in this scenario, allowing more complex algorithms to be applied within quick, operational decision windows. Before the mid-1980s, credit was granted manually, using judgmental criteria to make risk decisions. Then experts developed scores that used statistics to predict which potential customers would be the best and worst risks.
Scoring proved to be more consistent and accurate in predicting general risk than human judgment. In addition, technology improvements sped up the process, resulting in increased efficiency for issuers and customers. In this way, the world of math and science was introduced to the business of credit-card lending. However, the same could not be said for the marketing efforts to acquire and manage customers. Even in the late 1980s and early 1990s, credit-card offerings were homogeneous and consumers were blanketed with mass-market appeals; there was little price/feature competition.
A few innovative companies realized and seized an opportunity. New entrants that hired Ph.D.s with backgrounds in predictive modeling, statistics and econometrics began targeting the best customers. Introducing science into the art of marketing, these companies used math to identify which customers would transfer balances, and they pitched other novel ideas (low introductory rates, rewards, no fees, etc.) to customers based on their expected response. These small nonbanks, through smart marketing, created a new way of doing business and thus rose to power in the credit-card industry.
In many ways, 10 years later, the credit-card industry has come full circle. The market has been reduced to issuers stealing customers from one another, pushing break-even points beyond the lifetime of most accounts. More accurate predictions of the same view of the customer no longer provide new, operational information.
Though there are incremental gains that can be and are being made by building better predictive models, prediction alone is no longer a competitive advantage. Consider the analogy of the electron microscope: There comes a point when marginally increasing the magnification of the microscope - looking at the same cell - ceases to yield new information.
The future. A new microscope that can look beneath the cell and understand its interactions with other cells does indeed yield new, operational information. Today, the true innovators are embracing the evolving science of optimization. The breakthrough is in shifting the goal beyond building incrementally more accurate models and, instead, expanding the depth and breadth of the understanding of the customers. Optimization leverages existing information by combining very accurate models into a broader decision framework that creates more opportunities to influence the customer over longer periods of time.
Unlike predictive modeling, which starts with inputs and predicts an output, optimization starts with a desired output and seeks the best inputs to achieve it. These solutions must satisfy three criteria:
• The optimization output. This should be the strategic goal of the company. Lifetime customer value (LCV) is a measure of all possible economic flows from a customer over time. LCV serves as an arbiter among aspects of the decision-making process that have historically been at odds with one another, such as response and risk. Just as predictive models captured customer behavior more precisely than judgment alone, optimization captures and structures the causal and interdependent behaviors of customer actions and reactions in a comprehensive manner. How do you create individualized plans to invest in and harvest the most value possible from your customer-asset base?
• The ability to compare and simulate. Under a comprehensive metric, how does one measure which option or plan is preferable across the entire customer base? What about external influences? The optimization paradigm provides a metric for comparing apples to apples (customers to customers) through a common view, but what about under myriad assumptions? The ability to simulate, to run and compare multiple, structured scenarios enables decision makers to stress test plans and combine management intuition with optimization science to create compelling strategy.
• An integrated environment. A goal that unifies all aspects of the organization has no value if the functions cannot communicate. To be successful, optimization solutions will require coordination and communication among functions. As interdependencies are exploited and plans are proposed, how does that impact the actions of all of the functions?
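The "start with the output, search the inputs" idea above can be sketched in a few lines. This is a purely illustrative toy: the LCV formula, the response assumptions and every number in it are invented for the sketch, not drawn from any real issuer's models.

```python
# Illustrative sketch: prediction runs inputs -> output; optimization fixes
# the output (maximize LCV) and searches over the inputs (the offer terms).
# All functional forms and constants here are hypothetical assumptions.

def lifetime_value(intro_rate, credit_line, horizon=36, discount=0.99):
    """Toy lifetime-customer-value model: discounted monthly margin net of
    expected losses over a planning horizon (assumed functional forms)."""
    # Assumed response curve: higher intro rates attract fewer balances.
    balance = credit_line * max(0.0, 0.9 - 15 * intro_rate)
    margin = balance * (intro_rate + 0.01)   # assumed monthly revenue
    loss = credit_line * 0.002               # assumed monthly expected loss
    monthly = margin - loss
    return sum(monthly * discount**t for t in range(horizon))

def best_offer(rates, lines):
    """Optimization step: exhaustively search offer inputs for the best LCV."""
    return max(((r, l) for r in rates for l in lines),
               key=lambda offer: lifetime_value(*offer))

rates = [0.00, 0.02, 0.04, 0.06]
lines = [2000, 5000, 10000]
offer = best_offer(rates, lines)   # the inputs that maximize the goal
```

In practice the search would run over per-customer models rather than one aggregate curve, but the inversion is the same: the goal comes first, and the offer terms are whatever achieves it.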
The power of optimization science can best be exercised when the concerns and perspectives of the contributors can be facilitated, integrated and exchanged effectively.
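The "compare and simulate" criterion can likewise be sketched as scoring one candidate plan under several external-assumption scenarios. Again, the plan economics, the scenario labels and the charge-off rates are hypothetical placeholders, not a real stress-testing framework.

```python
# Illustrative sketch of scenario simulation: value one plan under several
# assumed economic environments. All figures are invented for the example.

def plan_value(monthly_margin, monthly_loss_rate, balance,
               horizon=36, discount=0.99):
    """Discounted value of a plan under one scenario's loss assumption."""
    monthly = balance * (monthly_margin - monthly_loss_rate)
    return sum(monthly * discount**t for t in range(horizon))

# Hypothetical scenarios: (label, assumed monthly charge-off rate).
scenarios = [("benign", 0.001), ("baseline", 0.002), ("recession", 0.006)]

results = {label: plan_value(0.015, loss_rate, balance=8000)
           for label, loss_rate in scenarios}
# Decision makers can now compare plans apples to apples: a plan that
# stays profitable in the stress scenario may be preferred even if it
# trails a rival slightly in the benign case.
```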