Published on June 11, 2014 as a thought piece at

Speaking at Adweek Europe earlier this year, I gave my take on low-latency data: how it applies to my current data modelling for sports and political events, and how brands can, and will, be able to use insights drawn from data of this kind to tweak ad campaigns in real time. As the summer of football begins, I thought it would be interesting to try to use data to predict the World Cup winner.

Brazil is 75% likely to win the opening game of the World Cup and 18% likely to draw, but if Brazil starts off poorly against Croatia those predictions will change within a few seconds' latency, so sports fans have updated, quantifiable information all game. Which raises the question: why are these low-latency, quantifiable statistics readily available for sports and other entertainment, but not for business, such as newly launched advertising campaigns?
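To make the few-seconds update concrete, here is a toy sketch of how a pre-game win probability can be folded together with the live score. This is not my actual model; the logistic form and the coefficients are invented purely for illustration:

```python
import math

def win_probability(pregame_prob, goal_diff, minutes_left, total=90.0):
    """Toy live win probability for the pre-game favourite.

    pregame_prob: probability of winning estimated before kickoff
    goal_diff:    current goals scored minus goals conceded
    minutes_left: minutes remaining in the match
    """
    # Convert the pre-game probability to log-odds, then shift it by the
    # current goal differential. The shift grows as time runs out, so the
    # live score increasingly dominates the prior. The 1.5 and 2.0 are
    # invented coefficients, not fitted values.
    prior_logit = math.log(pregame_prob / (1 - pregame_prob))
    time_gone = 1 - minutes_left / total
    logit = prior_logit + 1.5 * goal_diff * (1 + 2.0 * time_gone)
    return 1 / (1 + math.exp(-logit))

# At kickoff the live estimate simply equals the pre-game estimate:
print(round(win_probability(0.75, 0, 90), 2))
# An early goal conceded pulls the favourite's probability down sharply:
print(round(win_probability(0.75, -1, 80), 2))
```

Each new event (a goal, a red card, even field position) would re-run an update like this, which is why the published numbers can move within seconds of the play itself.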

It is useful for me to create statistics for politics, sports, and entertainment, because the raw data and outcomes are public and regular. Thus, when I build the infrastructure to capture and analyze data, and make the resulting statistics available for consumption, I am able to observe and update the process on a regular basis. These live events are big business themselves (my former boss did just buy the lowly NBA team the LA Clippers for $2 billion), but there is no question that I am in this forecasting business to answer business questions as well. The infrastructure is ready to supply businesses and advertisers with low-latency, quantifiable statistics, but the demand from them to use it is not.

Providing statistics for live events is actually more difficult than providing equivalent business and advertising statistics, because the required speed is insanely fast. The infrastructure necessary to collect and analyze the data, and then publish the likelihood of victory after a long pass but before the next play, is more robust than most business concerns will ever need. As with so many forms of meaningful technology, sports and entertainment have led to the creation of an extremely robust infrastructure. The market intelligence community could provide companies with similarly low-latency, quantitative answers to show, for example, how their advertising campaign is progressing with different demographics.

The problem is the lack of demand; advertisers are not using low-latency, highly quantifiable answers to adjust their campaigns. I know stakeholders are consuming and using statistics for sports and other live events, as I can see the readership of my articles and tables before, during, and after events. I have no doubt advertisers would read similar statistics about their live campaigns. And it is easy to see how live events adjust what they deliver to consumers as the event progresses; they provide advertisements and information that reflect the shifting outcomes (e.g., the stars of the game or who will be in the next round). Similarly, political campaigns adjust their spending daily or even hourly, sending different quantities and designs of advertisements to different demographics as the results shift. This contrasts with traditional advertisers, who do not shift their campaigns dramatically from hour to hour or day to day.

Advertisers have both legitimate loss-aversion issues and a principal-agent problem. First, companies are legitimately more concerned with downside than upside. It was a phenomenon in the advertising world when Oreo tweeted out an advertisement directly responding to the events of the Super Bowl in 2013, but Oreo had a very expensive war room to ensure no mistakes were made. While a great advertisement may win the day, a bad advertisement can sink a company for years. Second, agencies are even more concerned than their employers, as a bad advertisement will get them fired even after a string of successes. I am fine if my live feed has the occasional hiccup or a TV producer accidentally airs the wrong statistic, but there is no room for a bad advertisement, so there is a huge incentive to slow down and not take any risks.

Ultimately, if we deliver these low-latency, quantifiable statistics from “big data” and no one uses them to allocate resources, then it is not a revolution, it is a parlor trick. I generally talk publicly about the work my colleagues and I do in gathering and understanding “big data”, but we are also engaged in many projects to make it more efficient for people to utilize this data.

New technology, from both Microsoft and others, can streamline the vetting process to restrict downside loss. Digital creativity software is starting to allow advertisers to quickly and cheaply tailor advertisements for different demographics or outcomes. Focused delivery options allow advertisers to tightly target the advertisements they send out. Online focus groups can minimize the costs and time needed to ensure that an advertisement is not making some big mistake, and translation tools can check that there is no mistake across multiple languages.

It seemed like magic when Disney created an advertisement with the Super Bowl MVP right after the game, and it is scary that it still seemed like magic 26 years later when Oreo tweeted out its advertisement. But the same technology that makes it easy for TV producers and second-screen experiences to provide customized, low-latency answers as the game progresses will soon allow advertisers to provide customized, low-latency advertisements as the game progresses (or as any of their other advertising campaigns progress).