Amplitude AI Builders: Brian Giori Discusses MCP

Connect Amplitude data to your favorite AI systems so you can use behavioral data in new ways.

Inside Amplitude
October 22, 2025
Adam Bonefeste
Senior Manager, Content Marketing

This post is part of our Amplitude AI Builder series. Each one will feature an Amplitude engineer discussing an AI product that they are building.

A few months ago, I started seeing “MCP” pop up on slide decks and meeting agendas. I had no idea what it was. I knew the team was working on it, but it seemed so technical and complex that I didn’t know where to dive in. Brian Giori, a senior engineering manager at Amplitude, saved me. He walked me through how MCP works and why it’s a valuable part of Amplitude’s long-term vision. He explained it so well that people now come to me asking the same questions I originally asked him.

For this post, I talked to Brian about how MCP took off at Amplitude. We covered how a couple of successful hacks turned into a real product, how Amplitude customers will use MCP, what tools it connects to, and more.

What is MCP?

MCP stands for Model Context Protocol. It’s a fairly new protocol that Anthropic developed to let LLM applications, and even internal agents, access and act on external data in a standardized way.

MCP replaces a lot of manual work in a way that is secure, easy to set up, and easy to use. It dictates how LLMs should access data and which types of data are accessible.

Since everyone agrees on these guardrails, teams only need to build one MCP server, and that can be used in many different places. It’s much better than needing to build ad hoc connectors for every individual data source. That's really important for the AI space, because great AI applications require great context, which means feeding in a lot of data.

I like to think of MCP as the USB-C of AI applications. In the early 2000s, you had a million different connectors for different purposes. For display alone, you had VGA, DVI, HDMI, right? Your laptop ended up looking like a brick covered in all these various connectors. Since then, we’ve standardized. Now, USB-C is the de facto port for everything. It does data transfer, charging, display, etc., all in that one port. MCP is a similar all-in-one connection for AI apps.
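To make the "standardized connector" idea concrete: MCP messages are JSON-RPC 2.0 envelopes, and every MCP server answers the same standard methods, such as `tools/call`. The sketch below builds one of those requests in plain Python. The tool name `query_chart` and its `chart_id` argument are invented for illustration; they are not Amplitude's actual MCP toolset.

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build an MCP `tools/call` request (a JSON-RPC 2.0 envelope)."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# A client asking a server to run a (hypothetical) chart-query tool:
request = make_tool_call(1, "query_chart", {"chart_id": "abc123"})
print(json.dumps(request, indent=2))
```

Because every server speaks this same shape, any MCP-aware client (Claude, Cursor, an internal agent) can talk to any MCP server without a bespoke connector.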

How did the MCP project at Amplitude start?

It happened in three phases that all moved pretty fast. MCP started as part of the product development organization’s AI Week. That was a top-down initiative for everyone within the product development org to focus on building with AI. I had noticed that people on our go-to-market teams were relaying requests from customers about accessing Amplitude via MCP, so I decided I would build it. The goal for my AI Week team was just to get a simple MCP server stood up in production. It was basically just three of us. We worked all week to get it up and running, did internal authentication, and it worked! It was only internal, but it worked.

About a month later, there was another company hackathon. For that hackathon, I pitched that my team would get MCP usable by customers. Our primary goals were to solve authentication and add high-value tools, like query tools, to get data out of our system. We formed a team of three or four engineers. By the end of hack week, we got OAuth working. It could connect to Cursor and Claude and run our initial toolset to search and query basic content in Amplitude. It was really a customer-ready MVP. It wasn’t perfect. It broke. It needed some enhancements, but it was ready. It won the Ship It Award at the hackathon.

And then those sprints turned into a full product build?

Exactly. After the hackathon, the floodgates really opened. Every couple of days, I had someone from leadership Slacking me about the MCP server. I had other work to do, but there was a lot of excitement around MCP, so I ran with it. I worked on it during weekends and after hours. I started reaching out to interested customers to incorporate their use cases and get it into their hands.

Then we started to expand out to more customers. We let them kick the tires on it and listened to them about what else they needed. Two months after the project started, I even did a demo for our board of directors. There was a lot of excitement. The board members saw this as a great opportunity for Amplitude.

How does MCP help Amplitude customers?

I think MCP has two primary points of strength. The first is mixing Amplitude data with other data, especially data that your team has access to but that you don't already give Amplitude access to. For example, you might have really sensitive or personal data that only exists in your own database because of security concerns. Normally, you don’t mix that data across tools. MCP lets you build an internal LLM application that accesses all of the data you own, pulls from third-party tools and your codebase, and adds in your unique context. It opens up a lot of doors.

The second point is using MCP as a protocol within Amplitude to power our own AI applications. Imagine I build a tool that creates a notebook. I build it, verify that it works, check the inputs and outputs, etc. With MCP, our customers can use those notebooks in third-party applications. Our internal applications, like our own Agents tool, have access to that tool as well. So any improvement made to that tool is multiplied.

What do you think are the most common applications customers are going to connect to their Amplitude data with MCP?

I think they’re going to start by answering basic questions. They might want to use Claude or another MCP-compatible client to pull in data that they have in Amplitude. It will be much faster than finding a dashboard, clicking a chart, exporting it to a CSV, pasting the CSV, adding the context about the event data, etc. With MCP, the semantic layers of context in Amplitude connect automatically to an LLM, which can then answer questions for our customers.

For example, if they use Claude, users can ask Claude to analyze a dashboard. MCP will do all those analysis steps to find answers. It will search through their dashboards, query their charts, and summarize the information. It can also mix in data from finance tools, JIRA tickets, Glean, internal documentation, etc., to ask follow-up questions and pinpoint a root cause.

How does MCP interact with our dashboard Agent?

The dashboard summary Agent does a very specific thing: it summarizes Amplitude data and pulls in context to tell an accurate story, performing a thorough analysis of a dashboard to provide periodic summaries and updates.

MCP expands the data and context set that the Agent can access. It brings in external sources so the Agent can do a lot more. Depending on your client, MCP will generally get the dashboard, query some charts, and give you a quick overview. But you need that Agent for the expert analysis.

How do you see MCP and Agents collaborating in the future?

It starts with the foundational level of MCP, which we call L1 MCP. That’s where we are now. L1 MCP is basic search, data access, creation, and query. These MCP tools can read a notebook, analyze a chart, or query an experiment: basic things a user could do in the product UI. This level of MCP provides an outline of the way people work.

Then you could build higher-level tools on top of that: L2 MCP. You could build a tool that runs purpose-built Agents, which then call other tools themselves.
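A minimal sketch of this layering idea, with everything invented for illustration: the L1 "tools" below are stubbed data-access functions, and the L2 tool is simply a higher-level function that composes them into a workflow. The tool names, dashboard names, and data are all hypothetical, not Amplitude's real toolset.

```python
# Registry of L1 tools: basic search/query primitives, keyed by name.
L1_TOOLS = {}

def tool(fn):
    """Register a function as an MCP-style tool under its own name."""
    L1_TOOLS[fn.__name__] = fn
    return fn

@tool
def search_dashboards(keyword):
    # L1: basic search (stubbed with canned data for the sketch).
    dashboards = {"growth": ["signup_funnel", "retention"]}
    return dashboards.get(keyword, [])

@tool
def query_chart(chart_id):
    # L1: basic query (stubbed).
    return {"chart": chart_id, "trend": "up"}

def summarize_dashboard(keyword):
    # L2: a higher-level tool that calls L1 tools and aggregates results.
    charts = L1_TOOLS["search_dashboards"](keyword)
    results = [L1_TOOLS["query_chart"](c) for c in charts]
    return {r["chart"]: r["trend"] for r in results}

print(summarize_dashboard("growth"))
# → {'signup_funnel': 'up', 'retention': 'up'}
```

The design point is that the L2 layer never touches raw data directly; it only composes L1 tools, so any improvement to an L1 tool benefits every workflow built on top of it.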

MCP turns out to be a sandbox for building better Agents and better AI applications on Amplitude, letting teams build custom or bespoke things that can be used by more tools in more places.

About the Author
Adam Bonefeste
Senior Manager, Content Marketing
Adam is a senior content marketing manager at Amplitude. He writes about how data teams can use technology to answer questions about their customers and their products.
