How I Amplitude Series

Running and Validating Activation Experiments

How do you identify growth opportunities, run experiments, and validate results using Amplitude? Join Weston Clarke, Director of Product Management at Amplitude, as he takes us through a real experiment to move the needle on activation rates.


“It’s usually better to measure success within a shorter time window, like a day or a week, rather than waiting three weeks to get only incremental insights. This balance between speed and accuracy is key when setting up product metrics.”

Weston Clarke
Amplitude
Director, Product Management

In this How I Amplitude session, Weston Clarke, Director of Product Management at Amplitude, walks us through his workflow for identifying growth opportunities, running experiments, and validating results using Amplitude.

Watch how he pairs both quantitative and qualitative data as he takes us through a real experiment to optimize the Amplitude homepage to drive more sign-ups.

Driving Product Growth Through Experimentation

I’d like to share some examples of activation experiments we’ve run recently.

One experiment focused on changes we made to our homepage to drive more sign-ups and get more users into the product.

If you’ve visited our homepage, you know there’s a “Get Started” call-to-action (CTA), a public demo, and ways to contact sales or explore product information. We wanted to improve the visibility of these features, especially by making them more engaging. One of the changes we tested was adding GIFs and animations to better showcase product features.

The feedback we’d been getting was that our homepage was too abstract and marketing-heavy. So, with these changes, we aimed to highlight specific product capabilities. If you’re familiar with our product, you can access these updates in the navigation.


We ran this particular experiment over a month, and the goal was to increase clicks on the “Get Started” CTA. We had about 33,000 exposures and 1,500 conversions, with a 2% lift in the treatment group.
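Amplitude computes these experiment results for you, but if you want to sanity-check a lift by hand, a minimal two-proportion z-test sketch in TypeScript looks like the following. The exposure and conversion split in the usage example is a hypothetical placeholder, not the actual experiment data.

```typescript
// Illustrative two-proportion z-test for an A/B experiment readout.
function twoProportionZTest(
  controlConversions: number,
  controlExposures: number,
  treatmentConversions: number,
  treatmentExposures: number
): { relativeLift: number; zScore: number } {
  const pControl = controlConversions / controlExposures;
  const pTreatment = treatmentConversions / treatmentExposures;

  // Pooled conversion rate under the null hypothesis of no difference.
  const pPooled =
    (controlConversions + treatmentConversions) /
    (controlExposures + treatmentExposures);
  const standardError = Math.sqrt(
    pPooled * (1 - pPooled) * (1 / controlExposures + 1 / treatmentExposures)
  );

  return {
    relativeLift: (pTreatment - pControl) / pControl,
    zScore: (pTreatment - pControl) / standardError,
  };
}

// Hypothetical 50/50 split of ~33,000 exposures (placeholder numbers).
console.log(twoProportionZTest(735, 16_500, 765, 16_500));
```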

The experiment tracked users from the CTA click to further actions, like actually signing up and sending in data. By viewing this data in Amplitude, we were able to create funnels and see how actions, such as clicks on the homepage, translated into users doing things in the product.

For instance, we could track the user journey from a page view on our website to clicking the CTA, and then to events like sending in data or saving their first chart. All of this data is easily accessible because of how we instrument events across the site and the product.

Setting up dashboards is also a best practice. They help us track top-level metrics, like CTA clicks or sales inquiries, and ensure everything is ready before launching experiments. It’s important to set clear expectations and success metrics upfront, rather than running experiments indefinitely and continuously re-evaluating the goals.

Exploring Events to Uncover Optimization Insights

Now, let’s dive into some examples of how we diagnose traffic and sign-ups.

Each week, our growth team looks at several factors: traffic to our site, CTA clicks, and sign-up conversions.

These are all separate levers we can optimize.

For example, we might focus on SEO to drive traffic, or on improving CTA placement across content like blogs. We also work on removing friction in the sign-up process, such as optimizing SSO (single sign-on).

One key tool we use is Autocapture, which allows us to track:

  • clicks
  • page views, and
  • form updates across the product.

Let’s take a look at page views on our site and how we analyze funnel conversion rates over time.

For example, we can build a funnel from a page view on amplitude.com to a CTA click. The ‘CTA Clicked’ event fires when any CTA on the page is clicked, so it shows engagement with the page. We can filter it by factors like where on the page it was clicked, what the text is, etc.
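If you were instrumenting an event like ‘CTA Clicked’ manually (rather than relying on autocapture), a minimal sketch with the Amplitude Browser SDK might look like this. The property names (cta_location, cta_text, page_path) and the data-cta selector are illustrative assumptions, not our actual schema.

```typescript
import * as amplitude from '@amplitude/analytics-browser';

amplitude.init('YOUR_API_KEY');

// Attach a tracker to every element marked as a CTA (hypothetical markup).
document.querySelectorAll<HTMLElement>('[data-cta]').forEach((cta) => {
  cta.addEventListener('click', () => {
    amplitude.track('CTA Clicked', {
      cta_location: cta.dataset.ctaLocation, // e.g. "hero", "footer"
      cta_text: cta.textContent?.trim(),     // e.g. "Get Started"
      page_path: window.location.pathname,
    });
  });
});
```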

Using the Chrome extension, you can see the events firing on the page, and then dig into their properties.


In this dataset, we see a conversion rate of around 4%. We can also look at how this conversion rate changes over time by adjusting the view to weekly or daily.

This can help us identify trends and understand the average time it takes users to complete the funnel stages, e.g. converting from page views to CTA clicks.
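Funnel charts in Amplitude do this for you, but as a conceptual sketch, a two-step conversion rate over exported event rows can be computed like this. The row shape, field names, and event names are assumptions for illustration, not Amplitude’s export schema.

```typescript
// Minimal two-step funnel: share of users who did stepA and later did stepB.
interface EventRow {
  userId: string;
  eventType: string;
  time: number; // epoch milliseconds
}

function funnelConversion(events: EventRow[], stepA: string, stepB: string): number {
  // First time each user entered the funnel (did stepA).
  const firstStepA = new Map<string, number>();
  for (const e of events) {
    if (e.eventType === stepA) {
      const prev = firstStepA.get(e.userId);
      if (prev === undefined || e.time < prev) firstStepA.set(e.userId, e.time);
    }
  }

  // Users who did stepB at or after their first stepA.
  const converted = new Set<string>();
  for (const e of events) {
    const entered = firstStepA.get(e.userId);
    if (e.eventType === stepB && entered !== undefined && e.time >= entered) {
      converted.add(e.userId);
    }
  }

  return firstStepA.size === 0 ? 0 : converted.size / firstStepA.size;
}

// Example: page view -> CTA click (illustrative event names).
// funnelConversion(rows, 'Page Viewed', 'CTA Clicked');
```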

We can further break down this data by attributes like device type, which allows us to see if certain devices convert at a higher rate. This kind of segmentation helps us fine-tune our growth strategies.

We have an example event here for account plan creation during sign-up, which fires when someone creates a plan in our product. Over the last twelve weeks, we’ve had about 170,000 visitors in this demo data. Out of that, we’ve had about 2,000 sign-ups, which gives us a sign-up conversion rate of around 1%.

One interesting thing we can do is dig deeper into the data and see if there are specific pages or device types driving this conversion. For example, we can look at the device family—whether users are on Windows, Mac, Android, etc.—to understand where these sign-ups are coming from.

In this dataset, it turns out that Windows and Mac OS X account for the majority of sign-ups. Desktop sign-ups, in particular, are driving most of the conversions, with Android, Chrome OS, and Linux contributing smaller, noisier numbers. This difference in volume is important to keep in mind, as percentage-based analyses can sometimes mask the significance of absolute numbers.

This is a valuable insight because it tells us that most of our sign-ups come from desktop users. For things like paid marketing campaigns, we might want to focus more on optimizing for desktop platforms, given the higher conversion rates we’re seeing there. This feedback loop helps refine our marketing strategies.

We might also find similar patterns if we go deeper into the product, looking at actions like saving the first chart. If certain platforms show higher user intent, that could be valuable to know as well. For example, macOS users might exhibit similar behaviors, while Linux users, despite high percentages, still show very low actual numbers.
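To make the percentages-versus-absolute-numbers point concrete, here is a tiny sketch with made-up per-platform numbers (not our actual data):

```typescript
// Hypothetical per-platform numbers, purely for illustration.
const platforms = [
  { name: 'Mac OS X', visitors: 80_000, signUps: 900 },
  { name: 'Windows', visitors: 70_000, signUps: 850 },
  { name: 'Linux', visitors: 2_000, signUps: 40 },
];

for (const p of platforms) {
  const rate = ((p.signUps / p.visitors) * 100).toFixed(2);
  // Linux shows the highest rate (2.00%) but contributes the fewest sign-ups,
  // so absolute counts matter as much as the percentage.
  console.log(`${p.name}: ${rate}% conversion, ${p.signUps} sign-ups`);
}
```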

Product-Led Growth (PLG) funnel

After someone signs up, there are multiple paths they can take.

On our team, we track various post-sign-up behaviors like saving a chart, watching a session replay, or syncing a cohort.

These actions vary by user, and just like with platform differences, we see different behaviors based on the actions users take or the data sources they set up.

So our setup page is where users get started with Amplitude. Previously, we had a lengthy onboarding checklist that asked users about their use cases, available data sources, and coding abilities. This helped people find the right data sources but also hurt our onboarding rates, as many users dropped out before completing the checklist.

Now, we’ve streamlined the process with a recommended onboarding experience that uses a single line of code to install Amplitude, offering session replay and auto-capture along with the full browser SDK. Over the past year, we’ve also introduced new data sources like the WordPress plugin and a Shopify app, as well as more traditional options like the HTTP API and mobile SDKs.
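As a rough sketch of what that single-snippet style of install can look like with the Browser SDK, assuming the session replay plugin package and an illustrative sample rate:

```typescript
import * as amplitude from '@amplitude/analytics-browser';
import { sessionReplayPlugin } from '@amplitude/plugin-session-replay-browser';

// Add session replay before initializing; the 10% sample rate is an
// illustrative choice, not a recommendation.
amplitude.add(sessionReplayPlugin({ sampleRate: 0.1 }));

// Autocapture picks up page views, clicks, and form interactions
// without per-event instrumentation.
amplitude.init('YOUR_API_KEY', { autocapture: true });
```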

Now, looking at our event data, we can see how users are engaging with the setup page. For example, by using the event explorer, we can track which data options users are clicking on the most. In this demo dataset, “View All” seems to be a popular option, which suggests users want to explore more than the default tiles we’re showing. Google Analytics also ranks high, followed by options like the HTTP API and CSV file upload.

Using these insights, we can adjust the tiles we display to better match user preferences. By tracking user interactions on this page, we can better understand which data sources users find most valuable, and adjust our strategy accordingly.


Matching Experiment Length to Behavior Type

Now let’s dive into another important action: saving the first chart. This is a key event for us at Amplitude, and in this dataset, we can see how long it typically takes users to complete this action after signing up.

We look to answer questions like: How often does it happen? How long does it take?

For example, let’s look at users who save a chart within a week of signing up. Around 5.5% of users do this within seven days, which is up from 4% on day one. This suggests that the majority of users who will save a chart do so within the first week. Waiting longer—like 21 days—only increases that percentage slightly, to 6.5%.
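A quick back-of-the-envelope check on those rounded rates makes the trade-off concrete: treating the 21-day rate as the pool of eventual converters, the 7-day window already captures most of them.

```typescript
// Rounded conversion rates quoted above: 4% (day 1), 5.5% (day 7), 6.5% (day 21).
const conversionByWindow = { day1: 0.04, day7: 0.055, day21: 0.065 };

// Share of "eventual" (21-day) converters captured by each shorter window.
for (const [window, rate] of Object.entries(conversionByWindow)) {
  console.log(window, (rate / conversionByWindow.day21).toFixed(2));
}
// day1 ~0.62, day7 ~0.85, day21 1.00
```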


This insight is crucial because it tells us that most users who are going to take an action will do so quickly. So, when we’re making decisions or running experiments, it’s usually better to measure success within a shorter time window, like a day or a week, rather than waiting three weeks to get only incremental insights. This balance between speed and accuracy is key when setting up product metrics.

That said, it can vary based on the context of the test. For product metrics, like the PLG funnel we’ve been discussing, faster is usually better. You want to get insights quickly so you can iterate on decisions within a quarter. However, there are cases, like sales cycles, where you may need to wait longer to fully understand user behavior.

For example, when we look at users on our free Starter plan, the time it takes for them to upgrade to a paid Plus plan is relatively short—usually within a few weeks. However, the majority of our users are still on an older Starter plan, and they may take longer to convert. In cases like this, where you're analyzing deeper engagement, it makes sense to look at longer timeframes, like 45 to 90 days, to see meaningful behaviors like chart creation, dashboard usage, and team collaboration.

So, the key takeaway is that for most product metrics, you want to optimize for the shortest time period that gives you actionable data. But in cases where you're looking at more long-term behaviors, like retention or engagement over months, it’s okay to extend the window to get a clearer picture.

Join the community!

Connect with Weston and other practitioners in our community. The Cohort Community exists to help people become better analysts. We focus on actionable programs, sharing best practices, and connecting our members with peers and mentors who share in the same struggles.
