The Sudbury Neutrino Observatory (SNO) is a highly sensitive subatomic particle detector suspended two kilometers underground in a nickel mine in Northern Ontario. Shielded from the background radiation at the Earth’s surface, the SNO makes it possible to precisely measure one of the most abundant particles in the known universe: the neutrino.
The SNO exemplifies pure scientific observation: the recording of data with limited interference from uncontrolled variables. This valuable data is used to verify, challenge, or enhance our theories of how the world works. The pioneering SNO helped unlock mysteries of the Big Bang and of matter itself, earning the project’s director a share of the 2015 Nobel Prize in Physics.
Pure science is as admirable as it is demanding, requiring intricate equipment, sterile laboratories, laborious peer review and replication, ridiculously educated experts, and ample funding. Don’t let those sexy TED Talks fool you—real science is an arduous, meticulous, mentally demanding marathon.
Digital marketers as scientists
With access to inexhaustible data reserves, highly skilled resources, and sophisticated testing tools like Adobe Audience Lab and Optimizely, it’s easy to see why modern digital marketers are beginning to feel like scientists. It’s just a shame that we’re never going to be scientists. It’s not for a lack of tools, talent, or money, however—it’s that we run our experiments on the surface in a fantastically dynamic and complex system called the economy.
Most of the data points marketers analyze today describe a human being tapping, clicking, scrolling, or speaking into a computer. This happens in all sorts of contexts: in stores, in basement apartments, during meetings, in the back of Ubers, or while abandoning a marketing blog post. And each one of these actions can be influenced by the weather, by one’s mood, by the job market, or by something Kanye West just tweeted.
Accounting for these variables is not only next to impossible: it’s intrusive and immoral. Yet as an industry we still strive to become more data-driven because it delivers results. In this “move fast and break things” digital era, where the religious pursuit of disruption infects nearly all markets, the academic rigor necessary to be scientific will always be a bridesmaid to the promise of immediate gains in media performance.
Instead of aspiring to be scientists, we need to start thinking more in terms of econometrics.
What is econometrics?
Econometrics is an applied science that quantifies relationships in observable economic data, using statistical procedures called estimators. If interest rates change by X%, what can we expect to see in a city’s condominium supply? If a gasoline tax goes up by Y%, how might this affect an individual’s driving habits? Econometrics attempts to answer these kinds of questions.
We’ll never be able to nail these relationships down exactly, so the game is to use available data to uncover reliable estimators (ones whose estimates converge on the true relationship as the sample grows, a property called consistency) to be used in models that explain what we see and produce generally successful predictions.
As digital marketers, we’re uniquely positioned to apply a library of microeconometric techniques to our incredibly granular web analytics data. As for the unrecorded or imperfectly observed variables that underlie individual actions (e.g., mood, preference, Kanye)? We absorb these factors as general disturbances, or noise, in our model.
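A minimal sketch of what that looks like in practice, in pure Python with invented numbers: we posit a linear relationship, let everything we can’t observe land in a noise term, and let an ordinary least squares estimator recover the relationship from the data.

```python
import random

def ols_slope(xs, ys):
    """Closed-form OLS estimate of b in y = a + b*x + noise."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    return cov_xy / var_x

random.seed(7)
true_slope = 2.0  # the underlying relationship we hope to recover

# Hypothetical observations: x might be daily spend, y conversions.
xs = [random.uniform(0, 10) for _ in range(1000)]
# Everything unobserved (mood, weather, Kanye) is absorbed as gaussian noise.
ys = [1.0 + true_slope * x + random.gauss(0, 1.0) for x in xs]

estimate = ols_slope(xs, ys)
```

With a thousand noisy observations the estimate lands very close to the true slope, and a consistent estimator only gets closer as more data arrives. The individual disturbances never need to be explained, only absorbed.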
If you have experience in the subject, this probably sounds like a garden-variety statistical model so far. We typically exclude such factors from our models and rely purely on what’s generated in the media platforms or CRM databases. Focus on spend, campaign performance, phone calls, demographics, average position, and funnel metrics, and we’re good to go, right?
When we limit our statistical analysis to trends in platform data, to a cross-sectional slice of our media data, or to an aggregate of many years of data, we tacitly assume our marketing program is insulated from significant market forces. We end up working in a bubble.
A narrow focus
Have you ever seen a gifted media specialist at the controls of a media platform? It’s reminiscent of a kid (or, in the case of my friends, a balding father) masterfully playing Mario Kart on the Nintendo Switch, ducking and leaning through turns, jockeying for first position against formidable opponents in a pure flow state.
Immersed in the platform UI, specialists can easily believe that media optimization is a deterministic multiplayer video game, shielded from the noise of the outside world. Yet no matter how high our score is, the entire contest is profoundly affected by shifting device patterns, brick-and-mortar sales trends, the gig economy, aging populations, Kanye’s maturation as an artist, manufacturing costs, and more. Compounded, these factors can impact results more than our best media-optimizing drivers can ever hope to.
Banks know this. Insurance companies know this. Government policy analysts know this. But digital marketers, presented with all kinds of levers and campaign KPIs in a slick UI, get so immersed in gameplay that we miss a larger story that’s unfolding.
Promising clients YoY incremental results exclusively through video-game tactics is not a sustainable business strategy.
Let’s say we had a stellar year in media performance for online sales of vitamin supplements. It also happened that the entire wellness-industrial complex had a banner year for online sales in 2018. Standing up in a boardroom and fearlessly taking credit for the improved performance while underestimating or ignoring situational factors is a classic case of attribution bias. (Good luck repeating that performance next year.) Conversely, being blamed for poor YoY results in retail e-commerce while situational factors like Amazon’s global takeover are overlooked is attribution bias on the client’s part. It’s a losing proposition either way.
We can avoid this trap if we augment our media-forecasting models in the spirit of econometrics. Counterintuitively, including admittedly imprecise and “unscientific” microeconomic data in our predictive modeling will help us predict and evaluate performance more realistically and honestly.
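To illustrate the attribution trap, here is a hedged sketch in pure Python with invented numbers: conversions are driven partly by our spend and partly by a rising category trend (the “banner year”). A spend-only regression hands the category’s growth to the media team; adding the market trend as a second regressor returns the credit to its rightful owner.

```python
import random

def demean(v):
    mean = sum(v) / len(v)
    return [x - mean for x in v]

def slope(x, y):
    """OLS slope of y on a single regressor x (with intercept)."""
    x, y = demean(x), demean(y)
    return sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

def two_slopes(x1, x2, y):
    """OLS slopes of y on x1 and x2 (with intercept), via 2x2 normal equations."""
    x1, x2, y = demean(x1), demean(x2), demean(y)
    s11 = sum(a * a for a in x1)
    s22 = sum(a * a for a in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    s1y = sum(a * b for a, b in zip(x1, y))
    s2y = sum(a * b for a, b in zip(x2, y))
    det = s11 * s22 - s12 * s12
    return (s22 * s1y - s12 * s2y) / det, (s11 * s2y - s12 * s1y) / det

random.seed(7)
# Invented monthly data: category demand trends up, and our spend grows with it.
category = [t + random.gauss(0, 5) for t in range(100)]
spend = [100 + 0.8 * t + random.gauss(0, 5) for t in range(100)]
# The true effect of a unit of spend is 0.5; the category trend does the rest.
conversions = [50 + 0.5 * s + 2.0 * c + random.gauss(0, 3)
               for s, c in zip(spend, category)]

naive = slope(spend, conversions)  # spend-only model: badly inflated
controlled, _ = two_slopes(spend, category, conversions)  # near the true 0.5
```

The spend-only estimate is several times the true effect, a textbook omitted-variable bias; the augmented model recovers something close to 0.5. The market data doesn’t need to be perfect to keep us honest.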
Extending our purview
So, how does your organization begin to make the transition into econometric practice?
- Enlist creative people who know the business well to brainstorm meaningful trends and factors. Microeconometrics can tolerate spurious correlations, and if your factors are irrelevant, the modeling process will reveal it. Don’t be afraid to think divergently here.
- Research and compile relevant data. Speak to your Google reps for research material. Mine census data. Get an alias email account and sign up for white paper downloads where industry trends are shared.
- Learn and think about time-series analysis.
- Explore the many statistical methods developed in microeconometrics.
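To make the time-series bullet concrete, here is a minimal sketch in pure Python with invented numbers: a trending metric is strongly correlated with its own past, so raw period-over-period comparisons mostly measure the trend. First differencing is one of the simplest ways to strip it out.

```python
def autocorr(series, lag=1):
    """Lag-k autocorrelation: how much a series resembles its own past."""
    n = len(series)
    mean = sum(series) / n
    num = sum((series[t] - mean) * (series[t - lag] - mean) for t in range(lag, n))
    den = sum((x - mean) ** 2 for x in series)
    return num / den

# Two years of a steadily growing monthly metric (e.g., sessions).
sessions = [1000 + 30 * t for t in range(24)]
memory = autocorr(sessions)  # high: the series "remembers" its trend

# First differences remove the trend, leaving the month-to-month signal.
growth = [b - a for a, b in zip(sessions, sessions[1:])]
```

Against a series like this, any month “beats” the month before it by construction; only after detrending can we ask whether our optimizations moved anything at all.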
This doesn’t have to be a year-long exercise. This doesn’t have to be done client-by-client or completed by a single agency alone. Removing our lab coats and humbly acknowledging that we aren’t playing a video game two kilometers underground will paradoxically make us better at forecasting. Let’s get our data teams switching majors, from physics to econometrics.
Want to learn more about how leveraging microeconomic data can benefit your business? Let’s talk.