4 Ways Bias Sneaks Into Your Mobile Analytics

You know that analytic errors can cost your business time, effort, and — unfortunately — money. That's why it's important to identify where you might go wrong, before you make the mistake.

Best Practices
December 13, 2016
Archana Madhavan
Instructional Designer

In 2013, Helen Turnbull described feeling uneasy when she saw that a woman was piloting her flight instead of an older white man. Dr. Turnbull (who has a PhD in internalized oppression) actually considered getting off to wait for the next plane. Biases are predispositions toward one thing over another, and they affect everything from casual conversations to hiring decisions. If Dr. Turnbull can fall prey to the wiles of bias, so can the rest of us.

Though most of us understand biases in the world around us — does your workplace have left-handed scissors? — it can be tricky to spot them in statistics. Numbers don’t lie, right? Well, numbers might not, but we do. We need to be conscious about eliminating bias no matter what we’re doing — including when we’re interpreting our data.

Let Your Data Speak for Itself

Here’s an example. Let’s say it’s time to figure out the next step to spur your company’s growth. You look at several tactics, but you’ve been hearing a lot about pricing. As you do your research, you conclude that your pricing isn’t right and that you’ll benefit from restructuring it. Welcome to confirmation bias: you find what you’re looking for, because you’re looking for it. Or, at least, you heavily weigh what supports your preconceived conclusion — even if you didn’t realize you had one.

The SaaS scene, like everything else, cycles through popular ideas. Even what we talk about at Amplitude — vanity metrics, retention frameworks — is informed by what people in the mobile analytics world think is, or isn’t, important. It’s not bad to pay attention to other SaaS companies, or to piggyback off their ideas. Where you get into trouble is:

  • Not digging deep into your analysis because you have an inkling that something is right.
  • Adopting metrics because other people use them, even if they aren’t right for you.
  • Jumping to conform to a new standard practice, like being data-driven, without knowing what you’re doing.
There’s a lot to learn from smart people in the industry, but once your brain has processed an idea, it can resurface later in ways you don’t even realize — and that’s how confirmation bias sneaks in. **Avoid confirmation bias by:** maintaining an internal focus and setting your own benchmarks. Absorb the good material out there about mobile analytics, but keep your eyes on the prize: the metrics that best suit your business, even if they go against your “instincts.”
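
One lightweight way to keep that internal focus is to benchmark against your own history instead of a number you picked up elsewhere. Here is a minimal sketch of that idea; the metric, the figures, and the comparison are made up for illustration:

```python
from statistics import median

# Hypothetical history of your own weekly conversion rate (in percent).
# These numbers are invented for illustration.
past_weeks = [3.1, 2.8, 3.4, 3.0, 2.9, 3.3, 3.2]

# Benchmark against your own track record, not an industry figure
# you happened to read about.
internal_benchmark = median(past_weeks)

this_week = 2.7
if this_week < internal_benchmark:
    print(f"Below your own baseline ({this_week} vs {internal_benchmark}) -- worth digging into")
else:
    print(f"At or above your own baseline ({this_week} vs {internal_benchmark})")
```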

Fight Laziness

Say you’re trying to figure out why your app’s users don’t make purchases. They could be dropping out of the checkout sequence, so you track that. But the days go by and people seem to be making it through checkout all right, so you go back to the drawing board until you run out of things you want to look at. Suppose the issue was actually the way your search function ordered the items presented to customers — that’s not nearly as easy to check as a bounce rate during checkout, so you just didn’t look at it.

That’s the streetlight effect: looking for answers where they’re easy to find (“well lit”) instead of where it’s best to look (“dark”). Maybe you’re in a time crunch, so you feel the best choice is to try the easiest solution, or maybe you aren’t comfortable with analytics. Whatever the reason, falling prey to the streetlight effect wastes time and hampers your ability to make decisions.

Consider our exploration of DAU measurements. It seemed like it would be simple to just look at who’s opening your app every day. But we said:

“It’s not about looking at a single kind of action over a period of time. It’s about looking at multiple actions and the people that perform them multiple times over a period of time. It’s about going through your app feature by feature and seeing how often people come back to use them.”

Going through your app feature by feature is a lot harder — a lot more in the dark — than just looking at who is opening your app, but it also offers you real value. **Avoid the streetlight effect by:** taking more time to frame your questions in a way that forces you to dig deep into data. Working through a process as a series of questions is a lot easier to handle than going straight to the numbers, especially if you’re not comfortable with numbers.
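
To make “feature by feature” concrete, here is a minimal sketch of that kind of check, assuming a simple exported event log with user, feature, and day fields (the schema, the data, and the threshold are illustrative, not a specific analytics API):

```python
from collections import defaultdict
from datetime import date

# Hypothetical exported event log: (user_id, feature, day) tuples.
events = [
    ("u1", "search", date(2016, 12, 1)),
    ("u1", "search", date(2016, 12, 2)),
    ("u1", "checkout", date(2016, 12, 1)),
    ("u2", "search", date(2016, 12, 1)),
    ("u2", "checkout", date(2016, 12, 3)),
]

# Collect, per (feature, user), the distinct days that user touched the feature.
days_per_user = defaultdict(set)
for user, feature, day in events:
    days_per_user[(feature, user)].add(day)

# Count users who came back to each feature on at least MIN_DAYS distinct days.
MIN_DAYS = 2
returning_users = {feature: 0 for (feature, _user) in days_per_user}
for (feature, user), days in days_per_user.items():
    if len(days) >= MIN_DAYS:
        returning_users[feature] += 1

for feature, count in sorted(returning_users.items()):
    print(f"{feature}: {count} user(s) returned on {MIN_DAYS}+ days")
```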

Cut Your Losses

You set up a custom metric and, even though it isn’t great, you don’t want to ditch it because of the time and effort you put in. In a nutshell, that’s escalation of commitment: you cling to something that isn’t working and rationalize using it simply because you’ve invested so much in it.

This doesn’t just apply to single metrics, either. Maybe you’re doing stats in-house and it’s no longer productive, but the setup was so arduous that you keep it. Or maybe you need an upgrade from the analytics software you picked at launch, but you keep justifying why it works. No, really, it does! The bottom line is that escalation of commitment keeps you attached to bad data and inefficient systems. Figuring out the best stats for your business can be a consuming process, so don’t prolong it by being unwilling to scrap things simply because of the effort they took to set up.

**Avoid escalation of commitment by:** finding a way to divorce that emotional attachment from your decision making. Try describing the problem to someone outside the team, charting out the objective pros and cons, or shifting your focus from the energy you’ve already spent to the value you’ll gain.

Don’t Hover

Let’s say you start tracking a new stat you’re really excited about. You might be eager to see how things are going and peek at your data as soon as it starts being collected. You think you see early indications of an exciting result! What you’re actually experiencing is the clustering illusion, where you see a pattern in random results, usually because you don’t yet have enough data.

This doesn’t sound like a big problem, right? You made a mistake with early data, and you’ll correct it when you get the full picture. Except on a psychological level, that’s not actually what happens. Once the idea of a pattern or conclusion lodges in your head, it can affect the way you perceive the rest of the data, even after the early “pattern” itself has washed out. **Avoid the clustering illusion by:** setting time frames for checking in on your stats so you don’t look too early. Looking at data can be addictive, but sometimes you have to just let it roll.
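
A simple guardrail is to write the check-in rule down before collection starts, for example as a minimum window plus a minimum sample size, and only review the metric once both are met. The sketch below shows the idea; the thresholds and the function are placeholders, not recommendations:

```python
from datetime import date, timedelta

# Placeholder thresholds -- pick values that fit your own traffic and metric.
MIN_DAYS = 14
MIN_USERS = 1000

def ready_to_review(started: date, users_observed: int, today: date) -> bool:
    """True once the collection window and minimum sample size are both met."""
    enough_time = today - started >= timedelta(days=MIN_DAYS)
    return enough_time and users_observed >= MIN_USERS

# Example: tracking began Dec 1, 2016, and 400 users have shown up so far.
print(ready_to_review(date(2016, 12, 1), 400, today=date(2016, 12, 13)))  # False: too early
```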

Back to Basics

Bias affects everything, and mobile analytics are no exception. You know that analytic errors can cost your business time, effort, and — unfortunately — money. That’s why it’s important to identify where you might go wrong before you make the mistake. A lot of bias is unconscious, and we’re all trying to do the best we can to gather accurate data. Being methodical and critical of your views, data, and conclusions will help push your business forward in the right direction.

About the Author
Archana Madhavan
Instructional Designer
Archana is an Instructional Designer on the Customer Education team at Amplitude. She develops educational content and courses to help Amplitude users better analyze their customer data to build better products.