Working Small and Working Together

Why do some teams have such an easy time with instrumentation and deciding what to measure, while other teams have so much trouble?

January 8, 2021
John Cutler, Former Product Evangelist, Amplitude

In the final post of this series, I am going to explore what makes instrumentation look so effortless for some teams, and so difficult for others. It boils down to working small and working together.

The delta is insane. We’re talking two hours to get started on one side, and four months on the other. “Just part of how we work, no big deal” vs. “multiple teams burning lots of energy to get mediocre data we never use, so we gave up.” “A new junior team member unearthed some helpful insights” vs. “even our top data scientist can’t make sense of this.” As someone who works with a lot of teams in this domain, I still can’t get over the range.

We will explore this using two contrasting real-world stories.

The Happy Path

I recently sat down with a cross-functional team to help get them started. Here’s how it went.

“Do you have a test environment?”

“Yup.”

“Cool. Let’s start super simple and do a Hello World-type experiment. You can’t mess anything up, and later we can set this up to pass data to QA, Staging, and Production environments dynamically. OK?”

“Sure.”

“We are going to instrument something super simple. What is your product’s promise to your customers? What does it help them do?”

“We help people move around town by bicycle, without the cost of owning a bicycle.”

“From the perspective of your customers, what is a moment of joy? When is trust truly earned?”

“When they finish their ride safely.”

“Cool. Let’s instrument that. Something like ‘Ride Completed’. Do you have tiers of customers?”

“Yep. Three tiers.”

“OK. Let’s pass that user property in, just so you can see how it works. Let’s use this snippet of code from our JS SDK documentation. We identify the user this way, and pass that event this way.”

“Got it. Looks easy.”

“It is. There are some nuances when we work across domains, GDPR, etc., but that’s honestly pretty easy as well if you think it through.”

Ten minutes pass.
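Here is roughly what those ten minutes might produce: a minimal sketch using the amplitude-js browser SDK of that era. The API key, user ID, and property values are placeholders, not the team’s actual setup.

    // Minimal sketch using the amplitude-js browser SDK. The API key,
    // user ID, and property values below are placeholders.
    amplitude.getInstance().init('QA_PROJECT_API_KEY');

    // Identify the user, and pass the tier in as a user property.
    amplitude.getInstance().setUserId('user-123');
    amplitude.getInstance().setUserProperties({ tier: 'basic' });

    // The moment of joy: the product's promise, kept.
    amplitude.getInstance().logEvent('Ride Completed', {
      ride_length_minutes: 14,
    });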

“OK, we should see something.”

“Great. I’ll go to this screen for the project we set up for QA, and check. There is the event! Nice. I see you went for it and recorded the ride length. Very cool. We also have a Chrome Extension that helps you test things without opening Amplitude, and a new feature called Event Explorer, which lets you zero in on the event stream for your test user or your own user.”

“Perfect. This was very helpful.”

Three hours and thirty tested events later, they are good to go.

Three quarters and three hundred tested events later, they are still going strong.
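As for the “later” part, passing data to QA, Staging, and Production dynamically: here is one minimal sketch, assuming one Amplitude project (and API key) per environment. The key values and the APP_ENV variable are placeholders.

    // One Amplitude project per environment; the keys and the APP_ENV
    // variable are placeholders for illustration.
    const AMPLITUDE_API_KEYS = {
      qa: 'QA_PROJECT_API_KEY',
      staging: 'STAGING_PROJECT_API_KEY',
      production: 'PRODUCTION_PROJECT_API_KEY',
    };

    // Most bundlers substitute process.env values at build time.
    const env = process.env.APP_ENV || 'qa';
    amplitude.getInstance().init(AMPLITUDE_API_KEYS[env]);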

The Unhappy Path

Here is the unhappy path…

“How is it going?”

“Can you give me the exact specifications? I need to write this story and have it reviewed. I want to button everything up and have as many events specified as possible. This could be our last shot, so I’ve had the fifteen product managers stack-rank fifty events each. There might be no going back. It should be about four weeks, and then Dana—I think—will give this a try. It is ticket #FUD9123.

“From there, she will finish the Estimation Story, and from there we should have the engineering team’s estimate, so we can get that into next quarter’s planning session. Provided the sprint review goes well. Wait, I should probably add the security review. My god. That team is blocked for almost three months! Oh, do you have a rough estimate for the estimation story? Dana’s already at 30 points for that week. Is it under 3 points? That is about 3 hours. Can she do it in three hours?

“If all goes to plan, from there, we’ll schedule actually implementing everything. And then show it to the business stakeholders for review—they are so busy, you know—and I think we’ll be good.”

Painful, huh?

Collaborative and Iterative

These are the two extremes of the spectrum, but hopefully something stands out.

As with many things, instrumentation is best treated as collaborative and iterative. To work this way you need room to learn with a diverse group of people. You need the freedom to start small and start together. You need to think of measurement less as a project, with a start and an end, and more as a habit. You need to get the people with the right domain knowledge (the customer, business, interface, and “code”) in the same room. You need to, as Stripe PM Shreyas Doshi describes it, treat analytics “as a product”…with your team(s), and through them your customers out in the world, as the “customer.”

“Picking the wrong KPI is part of the process. Just get them out there and tune it. Most of them aren’t permanent anyway.” – Jacob Matson, Director of Digital Transformation at Transforming Age

In an uncomfortably high percentage of organizations, this flexibility does not exist. Engineers and designers are treated as cogs in a feature factory, their time filled like Tetris blocks. Product managers are nervous to stop the factory line, and engineering leaders incentivize output and high utilization rates. The teams aren’t even empowered to change course based on insights (analytics serve more as a control mechanism than a learning mechanism). This leaves developers highly skeptical about trying anything new … “come back when you’ve figured this all out, we’re slammed, and we doubt your resolve to actually do anything with this data!”

Empowered teams feel a sense of ownership when it comes to what they measure. Measurement itself is a path to engaging teams. “Making the measurement visible,” writes Harrison Lynch, Director of Product Management at Target, Connected Commerce, “drives interest, conversation & engagement. And that engagement creates ownership!”

Or organizations will argue that one group needs to “decide and agree on the KPIs and what they need to see on dashboards,” and THEN some other group goes off and does the work. First, this assumes they’ll be able to do that (refer back to the earlier posts in this series), and second, it suggests teams leave out one of the most valuable parts of the process.

Quantitative UX researcher Randy Au made this incredible observation:

TL;DR: Cleaning data is considered by some people [citation needed] to be menial work that’s somehow “beneath” the sexy “real” data science work. I call BS. The act of cleaning data imposes values/judgments/interpretations upon data intended to allow downstream analysis algorithms to function and give results. That’s exactly the same as doing data analysis. In fact, “cleaning” is just a spectrum of reusable data transformations on the path towards doing a full data analysis.

Randy is talking about “cleaning data”, but I would propose that this act of sensemaking, exploring, deciding “what to track”, and instrumentation (adding code) is itself analysis. It is valuable—not a menial plumbing task, or a “non-customer facing feature.” These discussions, activities, and choices are at the heart of making sense of our products.

Alberto Brandolini, the creator of EventStorming (a helpful complement to all this), describes these types of activities as “an act of deliberate collective learning.”

When teams see these things as valuable, and apply the appropriate level of “rigor” and creativity to the problem, great things can happen. When great things happen, that triggers a virtuous cycle. And this all becomes a habit.

In Closing

I hope you’ve enjoyed this long post. I want to leave you with some actionable things you can try next week.

  1. Experiment with
  2. Try the activity mentioned in this post about
  3. Try a customer journey mapping (or similar) exercise to explore the customer narrative
  4. Think about the key promise in your product and the moment that promise is kept. You can try a free demo of Amplitude and try measuring that moment in a test environment.
  5. Read our short book on the North Star Framework. This is a good starting point.
  6. Figure out those 30 Events! My team is testing out a new workshop on that if you’d like to give it a shot (see the ).

Go forth. Get your team together. Have a conversation. And instrument some events.



About the Author
John Cutler is a former product evangelist and coach at Amplitude. Follow him on Twitter: @johncutlefish