
Data is Not Just a Commodity — Yet

Big data is pretty much like the universe: near infinite and always expanding. So what happens when data becomes a commodity that every marketing firm has? Does the ever-expanding sphere of data stop growing, turning flat?

The challenge is how to use the techniques of big data to differentiate your firm from the competition. This is more about mindset and strategy than about tactics and technique.

Quantity Is Not Quality

The challenge is not having the most data, but the right data, contends Dr. Cynthia Beath of the University of Texas at Austin, co-author of “You May Not Need Big Data After All.” Acquiring more data to analyze is less useful than “having the better ability to ask questions, and using the same data to answer the questions,” she said.

The starting point for analysis is the data a company already has. “Most companies are not very good at being data-driven,” Beath said. An executive-driven company may find itself acting on someone’s good idea and trying to scale it without ever verifying that the idea works, pursuing a course of action simply because someone made a decision based on instinct, she explained.

“You have to have the right data to answer the questions you have,” Beath said. That starts with customer data, sometimes combined with outside data, and here one can gain a competitive advantage based on how well the sum of inside and outside data is analyzed.  

“[T]here has to be a culture that appreciates how to use data,” Beath said. “Good scientists understand the difference between a tested and a non-tested statement.” A statement is “opinion and not fact unless you test it,” she said.

People think data becomes a commodity when the competitors who hold the most market share have most of the same data, noted Randy Bartlett, statistical data scientist, consultant, and author. “That is the perception held by those purchasing the data. It is really the information that matters — (yet) we talk in terms of data,” he said. “This is not happening to the extent people think, because smart companies can usually generate their own data, through their interactions with their customers, etc. Secrecy and expertise facilitate information asymmetry.”

There are two opportunities to gain advantage when everyone has the same data. The first is to “create cost efficiencies,” using automation to extract information faster at lower cost, Bartlett pointed out. The second is leveraging the data better than the rest.

“This is the real race going on. You can provide the information faster to the decision maker (think real-time situations) or you can provide better/more relevant information for the decision. The more information in the data, the more potential to create competitive advantage via better information,” Bartlett said. This requires “best statistical practice,” which rests on three cornerstones: statistical qualifications, diagnostics, and review. “To support BSP you need a strong data infrastructure (data collection, software, and management), and great organization, planning, and leadership.”

Here the human element comes into play, as running a good big data operation requires talent spotting. “It is a team thing,” Bartlett observed. “Most companies new to the game fail at combining the right specialists. They hire checker pieces instead of chess pieces. Also, they struggle with leadership, organization, and planning.”

Theory Becomes Practice

Companies in data brokerage and digital marketing have to practice what they preach. They have to tame the ever-expanding sphere of data before making sense of it. And they do not see the data world turning flat any time soon. 

“As recently as five years ago, Acxiom saw an average of 15 to 25 data sources for complex marketing solutions,” said Chad Engelgau, VP of global identity and data product management. “Now, even mid-tier companies are arriving with 50 to 100 data silos which require integration, alignment of a common identity framework (anonymous and known), as well as privacy vetting.”

Inside all this volume is “the right data.” Getting at it requires a lot of sifting and sorting. “If you look at lists like comScore’s Top 50 Multi-Platform Properties or Top 25 Ad Networks, you’ll see billions of unique visitors per month,” Engelgau noted. “[T]he issue is reaching humans, not bots.” That means de-duplicating the data to find the unique audiences where marketing dollars go furthest. What’s needed is “the ability to segment people into mutually exclusive and collectively exhaustive (MECE) segments,” Engelgau continued. “Applying big data allows for more strategic segmentation and prioritization of those consumer segments.”
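
To make the idea concrete, here is a minimal sketch of what de-duplication followed by MECE segmentation can look like in practice. The field names, thresholds, and segment labels are illustrative assumptions, not Acxiom's method; the only point is that every person lands in exactly one bucket after duplicates are removed.

```python
# Hypothetical sketch: de-duplicate identities, then assign each person to
# exactly one segment (mutually exclusive, collectively exhaustive).
# Field names and thresholds are illustrative assumptions.
import pandas as pd

raw = pd.DataFrame({
    "customer_id":   [1, 1, 2, 3, 3, 4],
    "email":         ["a@x.com", "a@x.com", "b@x.com", "c@x.com", "c@x.com", "d@x.com"],
    "monthly_spend": [120, 120, 35, 0, 0, 560],
    "visits":        [8, 8, 2, 1, 1, 20],
})

# De-duplicate: one row per unique person, so duplicates don't inflate reach.
people = raw.drop_duplicates(subset=["customer_id", "email"])

def segment(row):
    """Each person falls into exactly one bucket, so the scheme is MECE by construction."""
    if row["monthly_spend"] >= 500:
        return "high_value"
    if row["visits"] >= 5:
        return "engaged"
    return "prospect"

people = people.assign(segment=people.apply(segment, axis=1))
print(people.groupby("segment").size())  # segment sizes for prioritization
```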

“The algorithms have gotten really good at gleaning insights from larger, unsorted, dirty data sets,” said Andy Fisher, senior VP for solutions at Merkle. “Yes, make sure the data is of the highest quality,” he said. “Adding more tonnage does often help.”

A decade ago, the more likely outcome was GIGO (garbage in, garbage out), Fisher noted. “Ten years ago, the algorithms and the hardware were not good enough to glean insights from massive amounts of unstructured data,” he said. 

Here Fisher speaks as a “data optimist.” The world of data is “not going flat in the foreseeable future,” he said. “There is so much room for it to grow. There are so many new data sources coming online.” The Internet of Things, drones, facial recognition, and blockchain will all generate massive data streams that need to be analyzed. Keeping up with these developments is the challenge.

Yet despite all this, “first-party data needs to be at the core of any strategy,” Fisher said. “Most clients have first-party data sets. Beginning to leverage [it] is the most valuable thing to do.”

Once this is done, third-party data can be added for further identification and segmentation of the market, provided the data is good and you know where it comes from.
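
As a rough illustration of that enrichment step, the sketch below joins first-party records to third-party attributes through a shared match key. The column names and the idea of carrying a source column for provenance are assumptions for the example, not a specific vendor workflow.

```python
# Hypothetical sketch: enrich first-party records with third-party attributes
# via a shared identity key before segmentation. Column names are illustrative.
import pandas as pd

first_party = pd.DataFrame({
    "match_key":     ["k1", "k2", "k3"],
    "purchases_12m": [4, 0, 9],
})

third_party = pd.DataFrame({
    "match_key":             ["k1", "k3", "k9"],
    "household_income_band": ["mid", "high", "low"],
    "source":                ["vendor_a", "vendor_a", "vendor_b"],
})

# Left join keeps every first-party customer; third-party fields fill in where
# a match exists, and the 'source' column preserves provenance for vetting.
enriched = first_party.merge(third_party, on="match_key", how="left")
print(enriched)
```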
