First, World Statistics Day on the 20th and then “Back to the Future Day” on the 21st. For me, last week also included an in-depth, somewhat geeky discussion with a station about Time Spent Listening.
That was a trifecta that begged for a blog.
While World Statistics Day didn’t go completely unnoticed, we were tripping
over pieces on BTTF2 predictions that did or didn't come to pass.
The focus on predictions, statistics and the TSL discussion reminded me of an excellent read from a
few years back: Nate Silver’s
The Signal and the Noise: Why Most Predictions Fail – But Some Don’t. It's long on everyday examples from weather to baseball to poker and more. It's deep enough to be challenging (especially if you're like me – a non-statistician who likes stats) yet not so deep that you’ll drown.
Statistics play a role in most businesses; they certainly do in radio with ratings at the forefront for most programmers.
Part of a consultant’s job is to provide the clearest possible understanding of the factors that drive ratings – not just to explain what happened, but to predict and recommend what changes (if any) could lead to better results.
The examination of ratings data is often the process of separating the signal from the noise: what factors were the primary drivers and what factors were ancillary or irrelevant.
To inspire you on your next analysis (ratings or otherwise), here are a few quotes on signal and noise from Nate.
“Immersion in a topic will provide disproportionately more
insight than an executive summary.”
If you really want to understand something, you’re likely going
to have to spend time under the hood.
That’s not a new thought, of course, but it’s a good reminder that – especially where ratings are concerned – the deeper you dive, the more you’re likely to discover.
Silver suggests drawing on past and collective experience to form probability estimates before diving into the data.
"The Bayesian approach toward thinking...encourages us to hold a large number of hypotheses in our head at once, to think about them probabilistically, and to update them frequently when we come across new information that might be more or less consistent with them."
Arming yourself with a list of as many factors/scenarios as could plausibly have contributed to the outcome lets you “stop and smell the data” which, Silver says, leads to better decision making – the reason you’re doing the deep dive in the first place.
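If it helps to see that update rule in action, here’s a minimal sketch in Python. Everything in it – the hypothesis, the probabilities, the helper function – is invented for illustration, not taken from Silver or from any ratings service:

```python
# A minimal sketch of the Bayesian updating Silver describes. All the
# numbers are invented for illustration: suppose you start out 30%
# confident that a heavier commercial load drove a TSL drop, then see
# new data that's twice as likely if that hypothesis is true.

def bayes_update(prior, p_data_if_true, p_data_if_false):
    """Return the posterior probability of a hypothesis after new data."""
    numerator = p_data_if_true * prior
    return numerator / (numerator + p_data_if_false * (1 - prior))

prior = 0.30  # initial belief: commercial load is the driver
posterior = bayes_update(prior, p_data_if_true=0.8, p_data_if_false=0.4)
print(f"Updated belief: {posterior:.0%}")  # ~46%: stronger, but far from certain
```

Note that one piece of supporting data moves the belief from 30% to about 46% – it strengthens the hypothesis without settling it, which is exactly the “update frequently” posture the quote describes.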
"The Bayesian approach toward thinking...encourages us to hold a large number of hypotheses in our head at once, to think about them probabilistically, and to update them frequently when we come across new information that might be more or less consistent with them."
Armed with a list of as many possible factors/scenarios that could have contributed to the outcome allows you to “stop and smell the data” which, Silver says, leads to better decision making - the reason you’re doing the deep dive in the first place.
A case in point was the Time Spent Listening discussion. The rise or fall of TSL may be related to your most obvious on-air components like music, commercial load, or talent. But it's also very possible that any of the other 13 variables associated with TSL could be major factors, including 100+ QHR diaries, format partisans in the sample, weighting, how early your first cuming occasions occur, etc.
Before jumping to a conclusion about what drove an increase or decrease, examine each variable your scenarios suggest and determine 1) whether or not that variable was a factor and 2) if so, to what degree.
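As a rough illustration of that two-step check, here’s a hypothetical pass over a few candidate variables in Python; the variable names, numbers, and the crude 5% materiality threshold are all made up for the sketch:

```python
# A hypothetical checklist pass over candidate TSL drivers. Substitute
# the actual metrics from your own ratings report; the 5% threshold is
# only a placeholder for whatever materiality standard you use.

candidates = {
    "music_scores":     {"prior_book": 3.90, "current_book": 3.80},
    "commercial_load":  {"prior_book": 12.0, "current_book": 14.5},  # units/hour
    "sample_weighting": {"prior_book": 1.00, "current_book": 1.18},  # avg weight
}

for name, books in candidates.items():
    pct_change = (books["current_book"] - books["prior_book"]) / books["prior_book"]
    verdict = "examine further" if abs(pct_change) >= 0.05 else "likely noise"
    print(f"{name}: {pct_change:+.1%} -> {verdict}")
```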
As you work your way through the data, new information may
challenge or strengthen your original assumptions.
As it turns out, the TSL drivers in last week's station discussion ultimately proved to be something different from the original hypothesis.
“Information becomes knowledge only when it’s placed in context. Without it, we have
no way to differentiate the signal from the noise…”
Trending data is a quick way to add context. Compare your most recent performance not only to past performances but also to format averages, audience composition, sample, and any other relevant data.
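One simple way to quantify that context is sketched below, again in Python with hypothetical numbers: measure how far the latest book sits from the station’s own trend and from a format average:

```python
# A sketch of adding context through trending, with hypothetical numbers:
# how unusual is the latest TSL book relative to the station's own history
# and to a format average?

from statistics import mean, stdev

station_tsl = [5.2, 5.4, 5.1, 5.3, 4.6]  # last five books (hours/week); latest is 4.6
format_avg_tsl = 5.0

history, latest = station_tsl[:-1], station_tsl[-1]
z = (latest - mean(history)) / stdev(history)

print(f"Latest vs own trend: z = {z:+.1f}")                    # well below trend
print(f"Latest vs format average: {latest - format_avg_tsl:+.1f} hours")
```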
And what about those times when, to the best of your knowledge, you did everything right, yet the outcome was disappointing?
“…sometimes the only solution when the data is very noisy – is to
focus more on the process than on results…Poker players tend to understand this
more than most other people…Play well and win; play well and lose; play badly and
lose; play badly and win: every poker player has experienced each of these
conditions so many times over that they know there is a difference between
process and results.”
Focusing on the process isn't a “pass.” Instead, it’s an opportunity for self-improvement and a review of the procedures that have been associated with success over the long term.
Bottom line: the next time you’re working through a report, seek to eliminate the noise by:
- Committing the time it takes to do a deep dive
- Creating multiple theories about what might have driven the results and a corresponding checklist of data to examine
- Evaluating and providing context for all the data relevant to your theories
- Reviewing the process with an eye toward self-improvement
As Nate points out, "Good innovators typically think very big and very small. New ideas are sometimes found in the most granular details of a problem where few others bother to look...sometimes we let information that might give us a competitive advantage slip through the cracks."