Four Stories From the Weird World of Startup Analytics

We love hearing stories about how analytics helped uncover the unexpected cause of some strange phenomenon.

Customer Stories
May 5, 2016
Archana Madhavan
Instructional Designer

In 2012, a data competition on Kaggle tasked participating teams with identifying which factors contributed to a car being a “lemon,” or a bad purchase. The factor most strongly connected to reliability turned out to be a real surprise: orange cars, the Kaggle teams found, were consistently less likely to have after-purchase problems. Kaggle’s founder stepped in to provide analysis: “Orange is an unusual color,” he said. “You’re probably going to be someone who really cares about the car [if you bought orange] and so you looked after it better than somebody who bought a silver car.”

It sounds absurd, but all of it came straight out of a massive dataset analyzed by hundreds of scientists and coders. There are all kinds of strange phenomena just like this out there. There are apps that soar in popularity for a day, then tumble back down into obscurity. There are sudden bursts when your home page converts at a rate of 20%, then goes back down to 15% with no explanation. But the only way to harness these phenomena, and to improve because of them, is to use analytics. When you witness something unexpected and you’re able to go back and understand it with your data, what you’re doing is bumping up against the limits of your idea of the world, and pushing it a little further. You’re learning. You’re making progress.

At Amplitude, helping startups make progress and learn about the mechanics of their own apps is what drives us. We love hearing stories about how analytics helped uncover the unexpected cause of some strange phenomenon. That’s why we recently asked some of our data-driven startup friends for their own personal “orange car” stories.

We asked for the odd, the out-there, the realizations that turned businesses around, or the discoveries that overturned long-held beliefs. Here they are: four plainly surprising stories from four startup data teams.

Paul Berkovic — Cofounder/CMO — ScribblePost

ScribblePost cofounder Paul Berkovic had looked at enough email open rates to know that people were sick and tired of hearing about “email replacements.” He had a feeling that it wasn’t necessarily the _idea_ of an email replacement that turned people off, and through A/B testing, he confirmed this suspicion. People still found email to be a huge pain point; they just didn’t necessarily want to hear about reinventing it anymore.

It’s a core part of the product, so we were working that messaging back into our narrative. Basically we started inventing all sorts of creative marketing language to hide the fact that we are helping make email better. So many products have come out claiming to kill email, we assumed that people would lump us in with those products, and any claims about email would be uninteresting.

Berkovic realized that ScribblePost couldn’t just come out and say that it would make email better. Too many companies had already claimed to have the “next best thing,” and all they’d done was erode people’s trust in that kind of claim.

We started testing messaging that explicitly talks about email as a problem, and how we plan to solve the problem. It turns out that none of those existing products have really delivered on their promise, and people are still really excited to find a better way to manage email!

Looking at the data from their A/B tests of particular subject headers, Berkovic and his team realized that the problem was how they had been framing the idea of an “email replacement.” They were talking about killing email, or email dying off, or other fantastical ideas, and emails like that were getting terrible open rates. But the data also showed that people still considered email a pain point: they just wanted solutions, not rhetoric. Analytics is for far more than A/B testing whether two exclamation points or three generate more opens. It helped Berkovic and his team shift their entire messaging strategy toward what their users actually valued.

Ty Magnin — Director of Content Marketing — Appcues

User experience design is a complex art, one that has a lot to do with your intuition for how people will react to a change, and that’s exactly why you need to incorporate data collection into your process. Even the most subtle modification can have far-reaching and powerful consequences. Ty Magnin, Director of Content Marketing at Appcues, demonstrated this for us with before-and-after pictures of a small UX shift on the Appcues home page.

Our old marketing site converted at a good clip, ~10%. We only had 3 main pages: homepage, features and pricing.

[Screenshot: the original Appcues home page, before the navigation bar]

I thought that if we added a navigation bar it might help people learn more about the product and then sign up.

Here’s what the site looked like when they made that change.

[Screenshot: the Appcues home page after the navigation bar was added]

Turns out, conversions dropped! A lot!

Putting a navigation bar on their homepage dropped their average conversion rate to 4.1%. After about three or four weeks of this, they got rid of it, and their rate immediately jumped back up to 8.2%. It was a highly counter-intuitive discovery: adding more information to the home page, and not even in a particularly obtrusive way, was very harmful to their overall conversion rate.
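A fall from roughly 10% to 4.1% is dramatic enough to trust on sight, but smaller swings can easily be noise. As a minimal sketch, with hypothetical visitor counts since Appcues didn’t publish theirs, a two-proportion z-test is one standard way to check whether a before-and-after difference in conversion rate is real:

```python
from math import sqrt
from statistics import NormalDist

def conversion_change_significant(conv_a, visitors_a,
                                  conv_b, visitors_b, alpha=0.05):
    """Two-proportion z-test: is the gap between two conversion
    rates larger than random noise would plausibly produce?"""
    rate_a = conv_a / visitors_a
    rate_b = conv_b / visitors_b
    # Pool the rates under the null hypothesis of no real difference.
    pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_a - rate_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value, p_value < alpha

# Hypothetical counts -- Appcues didn't share their raw traffic.
z, p, real = conversion_change_significant(
    200, 2_000,  # before the navbar: 10% of 2,000 visitors converted
    82, 2_000,   # after the navbar: 4.1% of 2,000 visitors converted
)
print(f"z = {z:.1f}, p = {p:.2g}, statistically real: {real}")
```

With numbers like these the test screams significance; the point is that the same check will tell you when a scarier-looking wiggle in your dashboard is actually nothing.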

You might expect that kind of thing with a total design overhaul, but all they did was add a navbar. It’s so counter-intuitive that there’s virtually no way you could see something like that coming. If you weren’t paying close attention to your analytics, you would just wind up sitting around and wondering why your conversion rate had dropped—you probably would never think that a handful of extra pixels in the corner of your site was at fault.

Because Ty and his team paid attention to their analytics, they were able to both reverse their mistake and gain useful insights into how people interact with and perceive the Appcues homepage. Adding the options to their navbar was clearly a failed experiment, but no one who monitors their analytics for any meaningful amount of time is going to avoid having a few failures. The key is to embrace those failures, because even a failed experiment is going to teach you things about your users you would never have learned otherwise.

Andrew Capland — Growth Lead — Wistia

When business video host Wistia was working on improving its activation rate through better onboarding, growth lead Andrew Capland decided they were going to get rid of tooltips. These came in the form of pop-ups, which Wistia had long used inside the product to demonstrate features to new users.

Our team had been slowly updating the empty states of tools in the product. Our goal was to eventually remove the pop-ups and have the product explain itself without the additional pop-ups.

They were doing all of this without making the changes public. They wanted to get the empty states of all their different tools ready before launching the new onboarding process. But then something strange happened.

We were 3 months into this project when we accidentally removed the pop-ups before we were ready. We didn’t notice this change for 2 weeks, but dug into the data and realized our activation rates went up!

Without data to look back on, this whole affair would have been regarded as a massive error. Accidentally taking out the pop-ups could have been seen as rolling back three months of progress on the project. Instead, it was seen for what it was: a happy accident. We talk about growth and retention as “experiment-driven” processes so much that we sometimes lose sight of the fact that many discoveries occur by chance. Happy accidents will happen. They can only happen, however, if you’ve been tracking the data you need to work back to the root cause later. Even if taking out the pop-ups had been a mistake, Andrew and his team would still have had something to show for it: data demonstrating that their users weren’t responding to the change. Without analytics, there are no productive mistakes. With analytics, all of your mistakes are productive one way or another.

Patrick Campbell — CEO — ProfitWell/Price Intelligently

When you’re part of a SaaS pricing company, releasing a free SaaS product carries a certain irony, especially when you’ve aggressively argued against the misuse of freemium before.

It’s basically SaaS gospel that you shouldn’t offer anything, even a side project, for free. It devalues your main product, the thinking goes, and you don’t reap any value from the work you put into developing it. No one knew this better than Patrick Campbell, CEO and co-founder of Price Intelligently. His company’s main business was advising other startups on how they should price their products, and they were intensely data-driven in their methods: they tracked churn, conversions, and upgrades over time, and they tracked every change they made to produce the best possible pricing plan for each of their customers.

So when they came up with the idea for a tool that would plug into Stripe and provide users with accurate SaaS financial metrics, they were almost surprised to even be debating price. They were, as Patrick says, “the SaaS pricing people—why in the world would we _not_ charge for something?” As you might expect, they used data to come to a decision.

We are the pricing folks, so we knew how to collect some data that would settle the internal argument. What we found was really interesting. While companies were definitely willing to pay for the product, we found that the scale to which they were willing to pay didn’t dramatically change as the company became bigger. Sure, a HubSpot-sized company may pay four-figure MRR, but the data indicated that the overall average for ProfitWell fell closer to below $100 per month.

In other words, ProfitWell wouldn’t be able to scale just by charging for their core product.

When we saw this data, it only took a little bit of back-of-the-envelope math to make a clearer decision for free or paid. We estimated based off some public data that there were between 10,000 and 20,000 SaaS companies out there, but most of those companies weren’t that large. Even at 50% market share, that would make ProfitWell an awesome company, but not huge.

We did some additional research and thought of where we would make revenue and determined that if we lowered our CAC dramatically by giving the product away for free, we could then make money off premium add-ons that we could put on top of ProfitWell – things like Retain, Recognized, etc. Thankfully, our data paid off 🙂
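Patrick’s back-of-the-envelope math is easy to reproduce. Here’s a rough sketch of it in Python, using the figures from his account and filling in midpoints where he gave ranges:

```python
# A rough reproduction of the back-of-the-envelope math; the exact
# inputs weren't shared, so these are illustrative midpoints.
saas_companies = 15_000   # midpoint of the 10,000-20,000 estimate
market_share = 0.50       # "even at 50% market share"
avg_mrr = 100             # average willingness to pay, roughly $100/month

customers = saas_companies * market_share
arr = customers * avg_mrr * 12
print(f"{customers:,.0f} paying customers -> ${arr:,.0f} ARR")
# 7,500 paying customers -> $9,000,000 ARR
```

Roughly $9 million in ARR at an improbably dominant 50% market share: an awesome company, as Patrick says, but not a huge one. That ceiling is what pushed them toward a free product with paid add-ons.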

Pricing products is complicated. There are tenured behavioral economists who spend much of their time researching the way we respond to different price points. The #1 mistake you can make is pricing your product mindlessly. You have to use data. You have to track conversion, bounce, and click-through rates, and you have to monitor them as you change your variables. You have to track churn and retention, and do the market research to understand how your pricing strategy might ultimately limit your business. Do that, and you’ll be able to turn the pricing of your product into an asset instead of a weak link.

Happy Accidents And Surprise Discoveries

We’ve written in the past about how data-driven teams can leverage their analytics to become more creative. But the discoveries you make in your data can also be surprising. They can catch you totally off guard and make you question previously fundamental aspects of your business. The teams competing in that Kaggle competition made such a discovery when they found that orange cars were the least likely used purchases to be lemons. Analytics can help startup founders discover that a business model they hold in deep esteem is secretly flawed in a particular case. It can help refine messaging, overturn the obvious, and demonstrate a better path forward. Analytics help us craft thoughtful experiments which, after we run them, can be assessed not in terms of right or wrong, but in terms of the information they bring us.

And then there are all the discoveries that occur outside of any kind of experiment: the strange things that just seem to happen, the mysterious spike in your bounce rate, the sudden cart-adding frenzy you’re at a loss to explain. These are all massive learning experiences if you have the right set of tools.

We only learn when we are surprised by what we find and have the data to work back to a root cause. When we spot a pattern that we didn’t think was possible, and then analyze it to figure out why it’s happening, we build up, brick by brick, our understanding of how the world works.

About the Author
Archana Madhavan
Instructional Designer
Archana is an Instructional Designer on the Customer Education team at Amplitude. She develops educational content and courses to help Amplitude users better analyze their customer data to build better products.