When asked, just about any professional in the field would say that product analytics is a team sport. The breadth of responsibilities and work required, from setting up data infrastructure to defining metrics to reporting on performance to delivering insights, is far more than one person could ever manage on their own.
Except somehow, they often do. Many data analysts toil under the radar, knowing that they’re a part of a team but feeling like a data team of one.
For the first twelve months in my role as a product data scientist at WeWork, I was one person supporting two products and five product managers, while also serving as a resource for the design research team and for my data counterparts across the business. On a typical day, I might switch between dozens of tasks, skills, and contexts: querying the data warehouse, building analytics dashboards, gathering data to define a metric, or doing pre-experiment analysis.
So, while product analytics sounds like a team sport, it doesn’t always feel that way. I think that the problem starts with the phrase “data-driven product development.”
The issue with data-driven product development
Depending on an organization’s data culture and how data-savvy its leadership is, there can be a hidden assumption that the average data professional is an independent machine. People expect analysts to continuously surface insights and hand them off to the product team, who then incorporate them into the roadmap.
But product data and insights don’t materialize out of thin air—it’s always someone’s job to put the pieces together. At WeWork, it looks a little something like this:
- Instrumentation: Someone decides what data we want to collect and documents it, which gives us the groundwork to get started. (In my case, it’s usually front-end user activity.)
- Implementation: Someone writes the code to implement this activity tracking. (A sketch of what that can look like follows this list.)
- QA: Someone (truly an unsung hero) validates that the implementation is generating the data as expected.
- Governance: Someone manages data governance, ensuring that we’re sending the cleanest possible data, and handling it appropriately.
- Modeling: Someone generates our data warehouse data model, from data ingestion to architecting a scalable structure that will meet the needs of the business.
- Performance monitoring: Someone uses the data collected to monitor the performance of the product, answer essential questions, and give concrete numbers to stakeholders—while putting it into a context that crystallizes what’s meaningful and what’s not.
- Hypothesis testing: Someone identifies meaningful hypotheses and performs experiments and analysis to (hopefully) drive product decisions and shape the future of our product.
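To make the instrumentation and implementation steps more concrete, here’s a minimal sketch of front-end activity tracking using the Amplitude Browser SDK. The event name, user ID, and properties here are hypothetical examples, not our actual tracking plan:

```typescript
// Minimal, illustrative sketch of front-end event tracking with the
// Amplitude Browser SDK. Event names and properties are hypothetical.
import * as amplitude from '@amplitude/analytics-browser';

// Initialize once, early in the app's lifecycle.
amplitude.init('YOUR_API_KEY');

// Identify the user so their events stitch together into one stream.
amplitude.setUserId('user-123');

// Log an event defined in the instrumentation doc, with context
// attached as event properties.
amplitude.track('Booking Created', {
  roomType: 'conference',
  source: 'search-results',
});
```

From there, the QA step is largely a matter of confirming that these events arrive with the names and properties the instrumentation doc promised.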
Good data is the byproduct of a systematic process that requires multiple disciplines and team members. It takes a village to surface data insights, and a team of one just won’t cut it. The smaller the data team, the higher the risk of sloppy data, missed performance issues, and an analyst who’s spread too thin to deliver value.
Overcoming data intimidation with 1:1 training
Our dream at WeWork was to bring more people onto the data ‘team’, but we struggled to drive adoption of analytics tools like Looker and Tableau. It’s a canonical problem: I think most data professionals have delivered at least one dashboard that went completely unused or ignored. Self-service analytics isn’t new, but we never got much traction; now, with Amplitude Analytics, we’re getting there.
As a data professional, it’s easy to overlook how intimidating data can be. For people who don’t operate in this space, it’s still a mysterious entity that only experts can make sense of. Overcoming this intimidation barrier is critical to driving data literacy. Once people realize that data is just information, information that can tell a story about products and customers, their natural curiosity takes over.
My mission was to create the ‘light bulb’ moments where people discover how satisfying it is to ask a question and answer it quickly using self-service data tools. I knew that intimidation is a form of fear, so I started by meeting the bogeyman.
In the beginning, this meant a lot of one-on-one coaching covering the fundamentals: how tracking works, how we identify users, and what an event stream is. Then we walked through the platform and discussed how to think about our data and how to ask questions that could be answered in Amplitude. I taught my users (my product stakeholders) how to be curious about the data, and how to make charts to satisfy that curiosity.
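To give a sense of what ‘an event stream’ meant in those sessions, here’s a simplified sketch of the kind of event records we’d walk through together. The field names and events are illustrative, not Amplitude’s exact schema:

```typescript
// A simplified, illustrative event record; real analytics schemas
// (Amplitude's included) carry many more fields than this.
interface AnalyticsEvent {
  userId: string;    // how we identify the user across sessions
  eventType: string; // the action the user took
  timestamp: string; // when it happened (ISO 8601)
  properties: Record<string, unknown>; // context about the action
}

// An event stream is just these records in chronological order:
const stream: AnalyticsEvent[] = [
  { userId: 'u_123', eventType: 'App Opened', timestamp: '2021-06-01T09:00:00Z', properties: {} },
  { userId: 'u_123', eventType: 'Room Searched', timestamp: '2021-06-01T09:01:12Z', properties: { query: 'conference' } },
  { userId: 'u_123', eventType: 'Booking Created', timestamp: '2021-06-01T09:02:45Z', properties: { roomId: 'r_42' } },
];
```

Once stakeholders could read a stream like this, a question such as ‘how many users who searched went on to book?’ stopped feeling like expert-only territory.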
Now, whenever my product team has a question they can’t answer, I ask them to put time on my calendar so that we can work through it together. If anything notable—positive or negative—comes up in our dashboard review, I encourage the team to dig into the data. I empower each team member to step into the driver’s seat independently, but I’m always willing to look into problems as a team.
One of the biggest benefits of this training process was familiarity. The more comfortable my team became with Analytics, the less intimidating ‘data’, as a whole, became. They also became more comfortable with me, and that trust has led to better collaboration.
The slow path to changing data culture
Learning and habit-building take time and repetition. Many of us tech workers have been taught that success comes when we ‘move fast and break things’, but if you’re trying to change data culture, you’ll need to temper your expectations.
I had to do this myself. I assumed that once people were familiar with Analytics, they would develop their own rituals around viewing and using dashboards in the platform. And yet I kept fielding questions that were already clearly answered on existing charts and dashboards. It was evident that my team wasn’t using the platform as often as I’d hoped.
So, I knew I needed to help build the habit muscle. To that end, I set up a weekly dashboard review where the PMs and I scrutinize our core metrics using Amplitude dashboards. These regular reviews inevitably surface other questions that we can investigate together in Analytics. Not only did we make it a habit to start the week by aligning on metrics, but by doing so, we set ourselves up for more ‘light bulb’ moments.
My efforts got us somewhere, but what also helped was a clear message, and some accountability, from product leadership to their product teams. When leadership made it clear that they expected product teams, not just their data partners, to own their metrics, we began to see more and more people not just viewing dashboards but doing some exploration on their own. That was a huge win.
Since I started working on Analytics evangelization, we’ve seen decent growth in active users. I’m proud of that, but I’m even more proud of our growth in learning users, people who aren’t just viewing dashboards for their own knowledge but actually creating and sharing content with others.
The goal: data-fueled product development
Forget being data-driven. We’re aiming for data-fueled product development: product development that’s driven by the product team and fueled by a partnership with the data team. It just doesn’t make sense for all of the data exploration, insight-seeking, and analysis to be limited to people with the word ‘data’ in their title. Amplitude Analytics is built to enable the whole product team to explore their data; in some sense, for anyone on the product team to be a member of the ‘data team’. And the bigger the ‘data team’, the more ‘data fuel’ you add to product development.