Digital product teams emphasize the collection and analysis of quantitative data, and yet too many of these teams are missing opportunities. Why? They miss because they bend the data to fit a false narrative. They miss because they aren’t set up to act on new information. They miss because they focus on overly broad or lagging metrics.
When we see the data come in, will we do anything differently?
That’s the question we should be asking our leaders, our teams, and ourselves.
Better still, “What will we do if the data does X?”
When preparing to ask and answer those questions, consider these three practices:
We’ve all done it. We’ve noticed a change in our data and said, “I bet that’s a result of X.” But hypothesizing after results are known (or “HARKing,” coined by social psychologist Norbert Kerr) hurts our ability to improve our hypothesis-making skills. Our minds are wired to find patterns in data even when none exist, so we must avoid our natural bias by transparently stating our hypothesis before we test. We gain two critical insights by doing this:
- With each tested hypothesis, we learn whether the product got better.
- With each tested hypothesis, we learn whether the team’s skill at predicting outcomes is improving.
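To make this concrete, here’s a minimal sketch in Python of what a pre-registered hypothesis could look like. The fields, the metric name, and the scoring are hypothetical; your team’s template will differ:

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class Hypothesis:
    """A hypothesis recorded *before* the release ships, not after."""
    change: str               # what we're shipping
    metric: str               # the measure we expect it to move
    predicted_direction: str  # "up" or "down"
    recorded_on: date = field(default_factory=date.today)

    def was_correct(self, observed_direction: str) -> bool:
        """Score the prediction once results arrive. Tracking this over
        time shows whether the team's hypothesis-making skill is improving."""
        return observed_direction == self.predicted_direction


# Stated transparently before the test, not after the results are known:
h = Hypothesis(
    change="Move the signup button above the fold",
    metric="signup_started_rate",  # hypothetical metric name
    predicted_direction="up",
)
print(h.was_correct("up"))  # -> True, once the data confirms the lift
```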
You don’t need a perfect plan of action before you release; you just need a clear direction to head once you’ve seen the results. Ask:

- What will we do if the metric goes up?
- What will we do if it goes down?
- What will we do if it doesn’t move at all?
With a high-level direction in mind, the team can spring into action the minute enough data has been gathered. Preplanning shortens the time from insight to action, which translates into positive outcomes such as increased speed to market.
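As a minimal sketch of what that preplanning might look like, assuming a hypothetical signup-rate metric and made-up thresholds, the directions can be as simple as a lookup the team writes down before release:

```python
# Pre-agreed actions, written down before the release ships.
# The metric, thresholds, and actions here are all hypothetical.
PLANNED_ACTIONS = {
    "up": "Roll the change out to all users.",
    "down": "Roll back and interview affected users.",
    "no_change": "Extend the test another week, then review together.",
}


def next_action(baseline: float, observed: float, noise: float = 0.01) -> str:
    """Return the pre-agreed action the minute enough data is in."""
    if observed > baseline + noise:
        return PLANNED_ACTIONS["up"]
    if observed < baseline - noise:
        return PLANNED_ACTIONS["down"]
    return PLANNED_ACTIONS["no_change"]


print(next_action(baseline=0.042, observed=0.055))
# -> "Roll the change out to all users."
```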
If the answer is, “We’re not sure what we’d do,” there’s a strong likelihood that this isn’t a metric you need or are ready to focus on. It could be that the metric lags too much to be actionable, or that it doesn’t tell you enough to make decisions. In either case, you’ll need the next practice.
There are many ways to measure the effect of our digital products and services. Popular metrics like Net Promoter Score (NPS) get a lot of attention, but what can you do if your NPS goes down? Gesture wildly at the user experience? It’s almost impossible to directly affect such a broad, lagging metric. Instead, focus on:

- Leading metrics that respond quickly to what you ship.
- Specific measures tied to the behaviors your customers demonstrate within your product.
Broad lagging metrics can help us understand the landscape of our product, but if we want to change the way our customers use our digital products, we must zoom in to key moments that matter.
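As a hedged illustration, with invented sample events and a hypothetical “day-one activation” measure, a leading metric is often something you can compute directly from specific behaviors in your product:

```python
from datetime import datetime, timedelta

# Hypothetical sample data: when each new user signed up, and when
# (if ever) they completed their first project.
signups = {
    "u1": datetime(2024, 3, 1, 9, 0),
    "u2": datetime(2024, 3, 1, 14, 30),
    "u3": datetime(2024, 3, 2, 11, 15),
}
first_project = {
    "u1": datetime(2024, 3, 1, 9, 45),
    "u3": datetime(2024, 3, 4, 8, 0),
}

# A key moment that matters: did a new user complete a first project
# within a day of signing up? Unlike NPS, this is specific, leading,
# and directly affected by what we ship.
activated = sum(
    1
    for user, signed_up in signups.items()
    if user in first_project
    and first_project[user] - signed_up <= timedelta(days=1)
)
print(f"Day-one activation: {activated / len(signups):.0%}")  # -> 33%
```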
Next time you release, make sure you hypothesize the outcome beforehand, state a high-level plan for what you’ll do based on the results (including when you’ll look at them together), and verify you have specific leading measures that can help you learn more about the behaviors demonstrated within your product. If you do that, you’ll be more likely to create positive impact for your business and your customers.