Use a Mixed-Pattern Approach to Instrumenting Your Product

To future-proof your instrumentation efforts, make sure to use a mixed-pattern approach.

Perspectives
December 2, 2020
John Cutler
Former Product Evangelist, Amplitude

This post is Part 2 of a 7-part series: Principles for Product Instrumentation Success. The full series is listed at the end of this post.

Mixing Approaches to Measurement

A couple months ago, a coworker and I pondered the following question:

Why are some teams able to make instrumentation look so effortless, and somehow predict what questions they’ll need to answer?

The delta is incredible. Some teams get amazing results with fewer than 100 well-crafted event definitions. Other teams are swimming in thousands of events and can’t answer a single question. And still other teams are so paralyzed by the feeling that they need to figure out each and every question beforehand that they do nothing, or sacrifice data quality and usability by falling back on an autotrack solution.

We looked more deeply into this question and arrived at the following conclusion. It is not magic. The teams that have figured this out actually use multiple perspectives when it comes to instrumentation. They don’t rely solely on figuring out perfect questions. They don’t just instrument every feature. And they don’t rely on a single model. They use—often without realizing it—multiple approaches. Over time, it becomes a sixth sense of sorts. They just know what to instrument.

Below are five common “frames” for instrumentation. Each has advantages and disadvantages, which is why we recommend mixing approaches.

Decisions and Questions

Scenario: My team is trying to decide whether to roll out a feature to all customers. To make that decision, I need to understand if use of this feature is having any negative downstream effects.

The advantage of starting with decisions and questions is that what you measure will be immediately actionable. The disadvantage is that we can’t predict the future (though some teams do a better job of this than others). Down the road we may have new types of questions, or need to make new types of decisions.

What decisions must you make in the near future? To make those decisions, what questions must you answer?
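One way to make this frame concrete is to work backwards from the decision to the smallest event list that can answer it. The sketch below is purely illustrative—the event names and properties (like "feature_used") are hypothetical, not a real tracking schema:

```python
# Hypothetical sketch: start from a decision, derive the questions,
# and instrument only the events needed to answer them.
decision = "Roll out the new feature to all customers?"

questions = [
    "Does using the feature reduce downstream task completion?",
    "Does it change support-ticket volume among active users?",
]

# Event name -> properties we would attach. Everything else stays
# uninstrumented until a question actually requires it.
events_needed = {
    "feature_used": ["feature_version", "user_cohort"],
    "task_completed": ["task_type", "duration_ms"],
    "support_ticket_created": ["category"],
}
```

The useful discipline here is the mapping itself: every event in the list should trace back to a question, and every question back to the decision.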

Specific KPIs and Metrics

Scenario: The board has asked my startup to measure something specific, like daily active users or churn. Or maybe I’m at an ecommerce company and there are standard metrics like ‘Shopping cart abandonment rate’ that everyone seems to track.

The advantage of starting with a specific KPI or product metric is that, in theory, someone has already figured out whether it is actually important. The disadvantage is that we often forget the “why” behind these types of metrics. Or they are too lagging, or too generic, to be useful.

What is an industry standard metric for your current product?

Workflow or Feature

Scenario: I have a specific workflow, journey, or feature I want to learn more about. How is it performing? How are people using it? Where are they getting stuck? How might I improve it? What could go wrong here, and how might we know that sooner rather than later?

The advantage of starting out with a workflow is that it is easy to identify key workflows in your product and customer journey. The disadvantage is that it is easy to end up collecting a lot of data you don’t use, or that you get too grounded in the interface and forget the customer’s goals.

What is a common workflow in your product you’d like to learn more about?
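Workflow-framed instrumentation often ends up as an ordered funnel of events. Here is a minimal sketch, assuming a hypothetical checkout workflow—the step names and counts are made up for illustration:

```python
# A workflow expressed as an ordered event funnel (hypothetical names).
checkout_funnel = [
    "cart_viewed",
    "checkout_started",
    "payment_info_entered",
    "order_completed",
]

def step_conversion(counts):
    """Step-to-step conversion rates, given raw user counts per step."""
    return [counts[i + 1] / counts[i] for i in range(len(counts) - 1)]

# Illustrative counts: 1000 users viewed the cart, 240 completed an order.
rates = step_conversion([1000, 400, 300, 240])
# rates -> [0.4, 0.75, 0.8]: the cart -> checkout step loses the most users.
```

Notice how the frame’s weakness shows up in code, too: nothing here says anything about the customer’s goal—only about interface steps.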

Goal/Objective

Scenario: My team is tasked with improving retention for a specific cohort of customers. I need to track whether we’re making progress and identify a reasonable set of leading indicators. To achieve the goal, I also need to unravel why certain cohorts of customers already have high retention rates.

The advantage of starting with a goal/objective is that your organization has (hopefully) already determined the goal is important! The disadvantage is that you can develop tunnel vision and fall victim to confirmation bias. The goal is important now, but will it be important in the future? Is it the right goal?

What is your team goal? What must you learn to achieve that goal?

Model

Scenario: I’ve built a model to describe how I think something works (or will work)—a model for customer acquisition, retention, expansion, customer satisfaction, or customer lifetime value, for example. And now I want to attach measures and metrics to the inputs and outputs of that model. The North Star Framework is an example of developing a model.

The advantage of starting out with a model is that ostensibly you are modeling something that is important! The disadvantage is that we’re always improving models. In a sense they are hypotheses. So it is a win when we disprove the hypothesis, but we may have to start over.

What is something you’ve modeled recently…where you’ve broken a system into parts, and tried to understand how the individual pieces interact to produce a specific output?
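A model-framed approach often boils down to expressing an output metric as a function of instrumentable inputs, then attaching measurement to each input. Here is a toy sketch—the metric name, the inputs, and the multiplicative form are all assumptions for illustration, not a prescribed formula:

```python
# Toy North-Star-style model (hypothetical): weekly value delivered
# as a product of three inputs, each of which we could instrument.
def weekly_value_delivered(active_users, sessions_per_user, value_per_session):
    return active_users * sessions_per_user * value_per_session

baseline = weekly_value_delivered(10_000, 3, 2.5)   # 75000.0

# The model tells us what to instrument: events that move each input,
# e.g. activation events for active_users, engagement events for
# sessions_per_user, and transaction events for value_per_session.
uplift = weekly_value_delivered(10_000, 3.3, 2.5)   # what if sessions rise 10%?
```

The point is less the arithmetic than the relationships: once the model exists, each input becomes a candidate for its own events, and disproving the model (the inputs don’t actually drive the output) is itself a win.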

How do you future-proof your instrumentation efforts?

Relying on one approach is rarely effective. For product analytics, it’s important to mix approaches. This can be difficult advice for the purist. And difficult for people pitching the perfect method. I’ll share my own story here.

I used to be very stubborn about starting with decisions, and framing questions based on those decisions. But then I experimented with using basic product walkthroughs and customer journey maps to identify workflows, and realized I was decent at picking key actions to instrument. I had no questions or decisions in mind, but the data proved to be useful in the future.

Next I got into using models, and that inspired different (but valuable) instrumentation choices focused more on loops and flywheels, the relationships between inputs and outputs, and assumptions about causation. Over time, I even warmed to “standard” metrics like Gross Merchandise Volume (GMV) and Average Order Value (AOV).

With practice, I learned that the approaches complement each other. And this pattern seems to hold with the teams that do a good job of instrumentation.


The full series:

Part 1: Measurement vs. Metrics

Part 2: Use a Mixed-Pattern Approach to Instrumenting Your Product

Part 3: Keeping the Customer Domain Front and Center

Part 4: Learning How to “See” Data

Part 5: The Long Tail of Insights & T-Shaped Instrumentation

Part 6: Asking Better Questions

Part 7: Working Small and Working Together

About the Author
John Cutler
Former Product Evangelist, Amplitude
John Cutler is a former product evangelist and coach at Amplitude. Follow him on Twitter: @johncutlefish