At Amplitude, we’re constantly working to evolve our product offerings so that we can continue providing best-in-class analytics and help our customers build better products. But product and feature launches are full of unknowns.
How will customers respond to the product? How will it interact with our current offerings? Will technical issues arise as we scale? These uncertainties make decision-making challenging, and it’s easy to become overwhelmed. The key is to gather as much information as possible to inform your decisions while maintaining momentum.
As a director of product management at Amplitude, I collaborate with customers and colleagues to deliver features and products that solve our customers' problems. In this post, I’ll walk you through the process we used to launch our new product—and show how we used Amplitude to guide our decisions.
- The Amplitude team uses the Amplitude platform to support decision-making during our product launches.
- We use market and customer research to define the product vision, ensuring our new products solve real customer problems and make business sense.
- Our product development lifecycle includes alpha and beta phases, where we gather user feedback and iterate on the product.
- Preparing for launch is a whole-organization effort: We educate the team and other stakeholders about the new product, update demo environments and internal systems, create marketing materials, and more.
- We use Amplitude to track key metrics such as activation and retention rates and a North Star Metric throughout testing, launch, and post-launch.
- The Amplitude platform enables us to quickly identify and address any issues. We use Session Replay to gain qualitative insights into customer behavior—in addition to our core analytics offering for quantitative insights—which helps us know how to improve the product.
1. Define the product vision
The first step in our process is defining the product vision by identifying the problem the product will solve and its market opportunity. We use market and customer research to pinpoint the best way to solve that customer problem and validate our ideas.
Amplitude’s Session Replay helps teams build better digital experiences with qualitative insights by showing a visual rendering, or “replay,” of customer behavior in a product. Combined with Amplitude’s best-in-class analytics, Session Replay helps teams resolve friction points to improve conversion rates and drive growth.
Launching Session Replay was slightly different from rolling out a standard new feature because it represented a new revenue-generating product for us. That meant we had to build a strong business case for the product by creating a financial forecast based on market research.
2. Build and test
Once we had the vision, our designers and engineers worked together to create the first version of the product. We moved it through different stages of the development lifecycle, starting with an alpha phase (a test with a small group of customers) followed by a closed beta (where we had more features available, supported more use cases, and included more customers with different requirements).
Gather feedback from users
During the alpha and closed beta phases, we focused on gathering feedback via live calls, emails, and other channels so we could address any issues. Engaging with customers and prospects helped us further validate the feature set, solve any problems, and iterate on the product.
We also tracked user behavior in the product to understand how customers engaged with it by looking at aggregate and individual usage patterns. For example, customers can use Session Replay from different parts of the Amplitude platform, so we analyzed where they were accessing it using some of our quantitative analytics charts.
Session Replay enables you to watch replays of the steps a user takes in a product. We actually used the Session Replay product ourselves during these phases to analyze how people were using Session Replay. Pairing the quantitative analyses with the qualitative replays gave us a whole new level of insight into our customers’ product experience. We could look at “macro” behavior with charts and then drill deeper into individual user “micro” experiences with Session Replay.
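As an illustration of what that kind of instrumentation can look like (this is a sketch, not Amplitude’s internal tracking code; the event name, properties, and sample rate are hypothetical), here is how a team might tag where a replay is opened from using the Amplitude Browser SDK and its Session Replay plugin:

```typescript
import * as amplitude from '@amplitude/analytics-browser';
import { sessionReplayPlugin } from '@amplitude/plugin-session-replay-browser';

// Capture replays for a sample of sessions alongside regular analytics events.
// (Package names and options reflect Amplitude's public Browser SDK docs at the
// time of writing; adjust to your own setup.)
amplitude.add(sessionReplayPlugin({ sampleRate: 0.5 }));
amplitude.init('YOUR_API_KEY');

// Hypothetical event: fired whenever a user opens a replay, tagged with the part
// of the platform they came from, so entry points can be compared in a chart.
function trackReplayOpened(entryPoint: 'funnel-chart' | 'user-lookup' | 'home') {
  amplitude.track('Replay Opened', { entry_point: entryPoint });
}

trackReplayOpened('funnel-chart');
```

Charting a hypothetical property like `entry_point` is what lets you see, in aggregate, which parts of the platform drive replay viewing, and then jump into individual replays for the “micro” view.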
Define product metrics
During the testing phase, we established success criteria for the product and its business performance. For new products, it’s important to track the activation rate to see whether customers find value and the retention rate to see whether they continue to find value over time.
In addition to tracking activation and retention, we set a North Star Metric for Session Replay: the number of replays viewed. Given the lack of baseline metrics for the new product, we used proxies from the existing Amplitude platform to set our targets. We hypothesized that viewing a replay is similar to viewing an Amplitude chart, so we based our segment targets on how often similar segments view charts.
We also defined other metrics to track, which ladder up to that North Star. For example, customers can’t view a replay if they haven’t piped their data into Amplitude, so we track the number of customers who’ve ingested replays, too.
We also gave customers a taste of Session Replay at no extra cost: existing customers can watch a limited number of replays depending on their current plan, with the option to purchase more. On the business side, we set our North Star as the attachment rate: the percentage of customers purchasing additional Session Replay capabilities.
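To make those definitions concrete, here is a minimal sketch of how activation, retention, and attachment rate could be computed. The data shape is a made-up example for illustration, not how Amplitude stores or reports these metrics:

```typescript
// Hypothetical data shape for illustration only.
interface CustomerUsage {
  customerId: string;
  viewedReplayInWeek: boolean[];       // index 0 = launch week, 1 = the week after, ...
  purchasedAdditionalReplays: boolean; // feeds the attachment-rate business metric
}

// Activation: share of customers who have viewed at least one replay.
function activationRate(customers: CustomerUsage[]): number {
  const activated = customers.filter((c) => c.viewedReplayInWeek.some(Boolean));
  return activated.length / customers.length;
}

// Week-N retention: of customers active in the launch week, the share still viewing replays in week N.
function retentionRate(customers: CustomerUsage[], week: number): number {
  const activated = customers.filter((c) => c.viewedReplayInWeek[0]);
  const retained = activated.filter((c) => c.viewedReplayInWeek[week]);
  return retained.length / activated.length;
}

// Attachment rate: percentage of customers purchasing additional Session Replay capabilities.
function attachmentRate(customers: CustomerUsage[]): number {
  return customers.filter((c) => c.purchasedAdditionalReplays).length / customers.length;
}
```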
3. Prepare the entire organization for the launch
A product launch, especially for a new revenue-generating product, involves much more than just the product team. To successfully launch our new Session Replay product, we needed to ensure every part of the organization was prepared and aligned.
Educate and inform
To make sure everyone was well-versed in the product’s features and benefits and the support team could assist customers, we established a group of Session Replay subject matter experts who educated the organization on the new product.
At Amplitude, we have an internal demo environment our team uses to present the product to customers and prospects. We updated this demo to include Session Replay and wrote new demo scripts for our go-to-market team.
Our marketing efforts included various activities leading up to and following launch day. The marketing team created content, social media campaigns, press releases, and other assets to engage customers across different time zones.
Session Replay has helped us grow our platform, so it was important to work with the analyst community, partners, investors, and other stakeholders before the launch to educate them about the product.
We also updated our internal systems to support the launch: we made sure our sales team could track new opportunities in Salesforce, our customer success team had resources to support users, and our reporting and pricing systems were configured to handle the new product.
Set pricing and packaging strategy
To decide on pricing for Session Replay, we used feedback from the closed beta and did competitive research to determine the optimal price while also considering our offering’s unique value proposition. We also collaborated closely with the marketing team to develop the product’s positioning strategy, identifying the most effective ways to present the product to our target audience.
4. Launch and refine
Our work doesn’t stop when the product launches. We closely monitor performance and take action if our metrics aren’t where we want them to be.
Quickly identify and respond to issues
Months after launch, the first thing I do every morning is check an Amplitude dashboard to see how our main metrics are trending. Every week, we also report on how metrics are trending relative to our quarterly targets. If any numbers are unusually low or high, we analyze the data to identify what’s causing that change.
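As a rough illustration of that kind of check (not our actual reporting setup), a simple rule might flag any metric that drifts more than a set percentage from its target:

```typescript
// Hypothetical sketch: flag metrics trending unusually far from their quarterly target.
interface MetricSnapshot {
  name: string;
  current: number; // value observed this week
  target: number;  // where we expect to be at this point in the quarter
}

// Flag anything more than `tolerance` (20% by default) above or below target so
// someone digs into the underlying data to find the cause.
function flagOutliers(metrics: MetricSnapshot[], tolerance = 0.2): MetricSnapshot[] {
  return metrics.filter((m) => Math.abs(m.current - m.target) / m.target > tolerance);
}

const flagged = flagOutliers([
  { name: 'Replays viewed', current: 7200, target: 10000 },
  { name: 'Customers ingesting replays', current: 410, target: 400 },
]);
console.log(flagged.map((m) => m.name)); // ["Replays viewed"]
```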
Naturally, as more customers started to use Session Replay, we saw more bugs and issues. Our Amplitude dashboards have enabled us to quickly identify those problems and work to resolve them.
Combine quantitative and qualitative data to better understand users
When quantitative metrics indicate there’s a problem, we use qualitative data to uncover what’s happening and determine how to improve it. After launch, for example, I saw the retention rate for a certain segment of users was low. I watched replays of their experience in Amplitude to see if anything unique was going on, and we introduced a thumbs-up and thumbs-down button so users could give us direct feedback on their experience. This enables us to easily catch bugs and identify incremental experience improvements that drive retention.
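Those feedback buttons feed the same quantitative view. A minimal sketch of that kind of instrumentation, with a hypothetical event name and properties rather than our actual schema, might look like this:

```typescript
import * as amplitude from '@amplitude/analytics-browser';

amplitude.init('YOUR_API_KEY');

// Hypothetical event for the in-product feedback buttons. Ratings can then be
// sliced by segment alongside the retention charts to spot problem areas.
function trackReplayFeedback(rating: 'thumbs_up' | 'thumbs_down', replayId: string) {
  amplitude.track('Replay Feedback Submitted', { rating, replay_id: replayId });
}

trackReplayFeedback('thumbs_down', 'replay-123');
```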
Collaborate cross-functionally
Collaboration across the organization is crucial in the lead-up to a product launch and post-launch. I had never worked as closely with the marketing team on any previous product launch. With Session Replay, we quickly noticed that our activation levels in the first few weeks were not meeting our expectations.
From a product perspective, we identified a few quick enhancements to remove friction and encourage customers to start viewing replays. But it wasn’t just the product team that responded. The growth marketing team ramped up external communications using emails and in-app prompts to remind customers about the new product. We also improved our support and developer documentation to make it easier for customers to get set up. The ability for cross-functional teams to access the same real-time data streamlined that collaboration.
That “all hands on deck” approach, along with real-time behavioral data from the Amplitude platform, enabled us to quickly identify the areas where we needed to make adjustments or add support—ultimately helping us boost our activation rate.
Constant learning
Session Replay is now live, but we’ll keep tracking how customers behave within the product and speaking with them directly so we can learn what they need and continuously improve. The data we gather from Session Replay and our experiences during the launch will help us make our next product launch even more successful.
Learn more about Session Replay or the Amplitude platform.