Last May I was approached to write an article for Marketing Tech Insights magazine. It was finally published today.
Here is a link to an online copy of the magazine (my article is on page 25).
Here is the article in full:
Stop Making it So Complicated
By Edward Nevraumont, CMO, A Place for Mom
Why do you have to go and make things so complicated?
- Avril Lavigne, Complicated
I had a friend in school who was always stressed. She was stressed about her grades. When she did well in school, she was stressed about getting a job. When she got a job, she was stressed about finding a boyfriend. When she got married, she was stressed about having kids. When she had it all (a great career, a great husband, a great family), she was still stressed about what came next.
Some people choose to be stressed.
In marketing the parallel is people who choose to make things complicated.
Back in the pre-internet days it was very difficult to measure marketing impact. It was clear that marketing drove value in general, but very difficult to pinpoint which marketing dollars were effective and which were being thrown away. In this environment, marketers fell into two camps:
- Salesmen
- Technical Marketers
Salesmen were storytellers. The Mad Men of marketing. They told a compelling tale of why their marketing spend was the reason for a product's success and/or why their marketing was not the reason for its failure. They ruled the roost in the marketing world, but the fluffiness of their worldview rarely got them to the CEO suite.
Technical marketers fell into two sub-camps. The academics developed tools like conjoint analysis and statistical customer segmentation models. The grinders, meanwhile, ran A/B tests on direct mail advertisements for book clubs and marble chess sets.
The grinders were doing real science in un-sexy verticals. The academics, meanwhile, were doing math that had very little relevance to reality, except that the salesmen could use their mathematical models to enhance their qualitative pitches.
Enter the internet.
The internet allowed marketers to use the tools previously limited to the direct mail crowd. Technical marketing “grinders” finally had their day in the sun. No longer was a salesman required to tell the story of why an orange background would sell more than a blue background. Now the grinder could just run an A/B test on the website and show their boss the statistical significance test. Who knew why orange was better – it just was. The test said so.
As the grinders got more power and prestige, marketing finally turned into a quantified profession (at least in some companies; many marketing departments at giant brands are still run by salesmen). CMOs started being considered for CEO roles. Technical marketing became the only marketing that commanded real respect. Instead of salesmen using flowery language to convince CEOs to invest in their marketing pet projects, you had math geeks using statistical equations to convince CEOs to invest in their marketing pet projects.
While it may seem more sophisticated, the mathematical language hides as many untruths as the flowery language of the Mad Men.
Small n
Unless you are a giant website like Amazon, Facebook or Expedia you will be limited in how much traffic your website gets. That means you will never be able to test everything – or even most things. You still have to make decisions about what to test. You don’t need to use flowery language to ‘prove’ orange is better than blue, but you do need to have some way to decide to test blue in the first place (vs. a million other things you could choose to test).
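The traffic constraint can be made concrete with a back-of-the-envelope power calculation. This is a sketch using the standard normal approximation for a two-proportion test; the 5 percent baseline conversion rate and the lift sizes are illustrative numbers, not figures from the article:

```python
import math
from statistics import NormalDist

def required_n_per_arm(p_base, lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per test arm to detect a relative
    lift, using the normal approximation for a two-proportion test."""
    p_test = p_base * (1 + lift)
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance threshold
    z_b = NormalDist().inv_cdf(power)          # desired statistical power
    var = p_base * (1 - p_base) + p_test * (1 - p_test)
    return math.ceil((z_a + z_b) ** 2 * var / (p_base - p_test) ** 2)

# Illustrative: 5 percent baseline conversion rate.
big_swing = required_n_per_arm(0.05, 0.20)    # detecting a 20 percent lift
small_tweak = required_n_per_arm(0.05, 0.02)  # detecting a 2 percent lift
```

The 20 percent lift needs thousands of visitors per arm; the 2 percent lift needs nearly a hundred times more. A site with modest traffic can afford only a handful of big-swing tests, which is exactly why you still have to decide what is worth testing.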
Reversion to the mean
Your test just showed the blue background gave you a 20 percent boost to your conversion. You run around high-fiving everyone in sight. You convert the entire site over to blue backgrounds and wait for the money to roll in. Months later you look back at before and after conversion rates. It doesn’t look much better. What happened?
Reversion to the mean happened.
When you get a great result from a test it is either because the test cell is actually truly better than the control cell, or it is random. We use significance testing to rule out the randomness – or so we believe. What we actually do (every time I have seen it done in a company) is we run a test and as soon as we get 95 percent confidence the test result is better, we stop the test and roll out the test cell to 100 percent of the traffic (and start the next test).
The issue is that 95 percent confidence doesn't mean you are 95 percent confident your result is 20 percent better (or whatever the average result says); it means you are 95 percent confident the impact isn't zero. The +20 percent is a combination of actual impact and random noise, and nothing says the result couldn't be +0.1 percent actual impact with the rest noise. That would still fall within the 95 percent ‘rule’.
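This inflation is easy to demonstrate by simulation. The sketch below uses illustrative assumptions (a true lift of only 2 percent, a 5 percent baseline, and an underpowered 2,000-visitors-per-arm test), runs many simulated A/B tests, and looks at the lift reported by the ones that happened to clear the significance bar:

```python
import random
from statistics import NormalDist

def mean_winner_lift(p_base=0.05, true_lift=0.02, n=2000, trials=1000, seed=1):
    """Simulate many underpowered A/B tests and return the average lift
    OBSERVED among the tests that cleared 95 percent significance."""
    random.seed(seed)
    z_crit = NormalDist().inv_cdf(0.975)
    p_test = p_base * (1 + true_lift)
    winners = []
    for _ in range(trials):
        conv_c = sum(random.random() < p_base for _ in range(n))  # control arm
        conv_t = sum(random.random() < p_test for _ in range(n))  # test arm
        pooled = (conv_c + conv_t) / (2 * n)
        se = (2 * pooled * (1 - pooled) / n) ** 0.5
        # A "significant winner": test beats control at the 95 percent bar.
        if se > 0 and conv_c > 0 and (conv_t - conv_c) / n / se > z_crit:
            winners.append((conv_t - conv_c) / conv_c)
    return sum(winners) / len(winners) if winners else 0.0
```

Because the test is underpowered, only the runs that got lucky cross the bar, so the surviving “winners” report lifts many times the true 2 percent. Roll one of those out and the before/after numbers will look flat: reversion to the mean.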
There is also a 5 percent chance your test cell is worse than the control. 5 percent is not very high, but if you are like most companies and are running dozens of tests a week, that 5 percent will hit you every couple of weeks.
Even if your ‘n’ is high, it is very easy to chase noise.
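The “it will hit you” claim is just the multiplication rule across many tests. As a sketch (the dozen-tests-with-no-real-effect scenario is illustrative):

```python
def p_at_least_one_false_winner(k, alpha=0.05):
    """Chance that at least one of k no-effect tests still clears the
    significance bar purely by luck, at false-positive rate alpha."""
    return 1 - (1 - alpha) ** k

# A dozen tests where nothing is truly different:
dozen = p_at_least_one_false_winner(12)  # roughly 0.46
```

Run a dozen tests of changes that do nothing, and there is close to a coin-flip chance that at least one comes back “significant” anyway.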
Big Data and Black Box Statistics
I have lost count of the number of marketing consultants who promise they can get me a 20 percent improvement in my marketing's performance. Twenty percent seems to be the number that is high enough to be impressive and low enough to be realistic. But when I ask these companies to offer a guarantee where we only pay if we actually get a 20 percent improvement, they almost always decline my business.
Usually, when pressed to answer the question, “How are you going to get a 20 percent improvement?” the answer is inevitably something to do with Big Data.
Big Data is a real thing. Google uses it to give you search results. Amazon uses it to recommend the next book you should buy. Facebook uses it to determine the best BuzzFeed article to show in your newsfeed. But most companies should not be looking for a Big Data solution. Little Data will work just fine for most of us.
The majority of companies have not yet executed on the most basic data; for them, finding ways to use more data is a waste of everyone's time. If you aren't prioritizing your call-backs based on whether the prospect has budget to buy your product, what makes you think you will prioritize your call-backs based on a fancy algorithm built with Big Data? You won't.
But humans (even marketers) love the lottery ticket. They love the idea that if they can only get the next shiny toy they will be able to revolutionize the business (or at least get a 20 percent improvement).
Whether it’s a Big Data black box or a Mad Men black box, the result is the same: Over-complicating marketing to make someone look good. My pitch to all of us is to forget for a minute the next lottery ticket and instead look to see if there are simple execution issues you can fix. Only after you build so much great non-personalized content that you couldn’t possibly send it to everyone should you consider personalizing it to only send it to some people. Only consider the complicated when you are 100 percent sure you have mastered the basics.
And as an added benefit, focusing on the simple, un-complicated basics turns out to actually drive better performance.
And be a simple kind of man…
- Lynyrd Skynyrd, Simple Man