Learn How to Build an MCP-Powered AI Agent
Anthropic's Model Context Protocol (MCP) is the hottest new technology in town. Learn how to build an AI agent with MCP. Approximate implementation time: 1 hour!
Large language models (LLMs) are powerful, but they don’t work in isolation—they need structured access to real-world data. Traditionally, integrating them with tools meant wrestling with REST APIs, GraphQL, or custom SDKs for every data source.
Enter Anthropic’s Model Context Protocol (MCP). Launched in November 2024, MCP is like USB for AI: an open standard that lets LLMs talk to external tools through a unified, structured interface. With growing adoption across the industry, including by OpenAI, it’s quickly becoming foundational.
In this tutorial, we’ll show you how to build MCP servers and connect them to Claude or Ollama. Learning this will help you get up to speed with the buzzy new technology everyone’s talking about.
Let’s start!
MCP Architecture
Let’s pop the hood and see how MCP works. At its core, it’s a client-server setup. Your application connects to one or more MCP servers, each acting as a gateway to different tools or data sources.
When you’re building an AI agent, your system plays the role of the MCP client. It doesn’t need to understand the specifics of Slack, GitHub, or any other API; it just knows how to talk to MCP servers.
Think of the client as the “brain” (the LLM), reaching into a network of tools to fetch the right context, like grabbing the right wrench from a well-organized toolbox.
MCP Server
To connect your AI assistant to tools like Notion, Jira, or even your internal services, you’ll need an MCP server. It acts as a translator, converting LLM queries into API calls and returning structured, context-rich responses that the model can reason over.
You can build your own server or use open-source ones. Either way, the LLM doesn’t need to know how each API works—it just talks to the MCP layer.
Secure, Session-Based Access
When your agent needs context, it opens a session with an MCP server. The server verifies identity, scopes access, and gives the agent just what it needs, whether that's data browsing, targeted queries, or even real-time updates.
Sessions are temporary and revocable, so you're always in control of what the agent can see.
Clean Separation of Concerns
Tool builders create MCP servers. Agent builders simply connect to them. This means your assistant code stays lean, focused on reasoning, not wiring up APIs.
MCP Clients = LLM-powered agents
MCP Servers = Gateways to tools and data
Sessions = Secure, scoped connections
So, to summarize, you don’t need to build custom adapters for every integration, just plug into an MCP server and go. With wide adoption already underway, this is the future of scalable AI integration.
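Under the hood, MCP messages travel as JSON-RPC 2.0. As a rough sketch (field names follow the MCP specification, but exact payloads vary by protocol version), here is approximately what a client sends when the LLM decides to invoke a tool:

```python
import json

# A sketch of a JSON-RPC 2.0 "tools/call" request, roughly what an MCP
# client sends over the wire. Field names follow the MCP specification;
# the tool name and arguments are illustrative.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_news",
        "arguments": {"symbol": "AAPL"},
    },
}

# Serialize as it would appear on the wire (the stdio transport
# exchanges newline-delimited JSON).
wire = json.dumps(request)
print(wire)
```

The server replies with a matching `id` and a `result` payload, which is what lets one client multiplex calls across several servers.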
Building an MCP-Powered AI Agent
Let’s dive in and actually build an agent, because the best way to understand MCP is to see it in action.
To show how the protocol works end-to-end, we’ll build both an MCP client and two servers.
Our example: a financial portfolio news tracker agent. It will demonstrate how to connect to both internal and external data sources using MCP.
Here’s what the agent will do:
Connect to one MCP server to fetch your stock symbols from a database.
Connect to another MCP server that pulls news based on those symbols.
Return a combined response that shows you relevant news across your entire portfolio.
This pattern, composing responses from multiple sources, is exactly what makes MCP so powerful.
Creating the MCP Server to Fetch News
Let’s first create the MCP server that will fetch news from the News API.
Install uv
Start by installing uv, which simplifies Python project and environment setup.
curl -LsSf https://astral.sh/uv/install.sh | sh
You will need to restart your terminal after installation.
Set up the MCP Server Project
Next, set up the project.
# create project directory
uv init financial_news_tracker
cd financial_news_tracker
# initialize Python virtual environment
uv venv
source .venv/bin/activate
# install dependencies
uv add "mcp[cli]" httpx
# create a new server file
touch news.py
Import packages and set up the FastMCP instance
Let’s import the required packages and initialize a FastMCP instance.
from typing import Any
import httpx
from mcp.server.fastmcp import FastMCP
# Initialize FastMCP server
mcp = FastMCP("financial_news")
FastMCP automatically generates tool definitions and makes it easy to create and maintain MCP tools.
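The definitions come from ordinary Python metadata: the function name, type hints, and docstring. The general idea can be illustrated with the standard library alone (a sketch of the principle, not FastMCP’s actual implementation):

```python
import inspect
from typing import get_type_hints

async def get_news(symbol: str) -> str:
    """Get news for a stock symbol."""
    ...

# Derive a tool description from the function itself, the way
# decorator-based frameworks like FastMCP do conceptually: the name,
# docstring, and parameter types become the tool schema.
hints = get_type_hints(get_news)
tool_def = {
    "name": get_news.__name__,
    "description": inspect.getdoc(get_news),
    "parameters": {
        name: hint.__name__
        for name, hint in hints.items()
        if name != "return"
    },
}
print(tool_def)
```

This is why accurate type hints and docstrings matter: they are the only interface the LLM sees.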
Function to query and format the News API
Next, we will write the function to query the News API. Head to newsapi.org and get an API key (free for non-commercial use).
# Constants
NEWS_API_KEY = "<your_news_api_key>"

async def make_news_api_request(symbol: str) -> dict[str, Any] | None:
    """Make a request to the News API with proper error handling."""
    url = f"https://newsapi.org/v2/everything?q={symbol}&apiKey={NEWS_API_KEY}"
    async with httpx.AsyncClient() as client:
        try:
            response = await client.get(url, timeout=30.0)
            response.raise_for_status()
            return response.json()
        except Exception:
            return None
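One caveat: the f-string interpolates the symbol into the URL without escaping. A safer sketch builds the query string with proper URL encoding (`q` and `apiKey` are the parameter names the News API expects; the key below is a placeholder):

```python
from urllib.parse import urlencode

# Placeholder key for illustration only.
NEWS_API_KEY = "demo_key"

def build_news_url(symbol: str) -> str:
    """Build the News API URL with URL-encoded query parameters."""
    params = urlencode({"q": symbol, "apiKey": NEWS_API_KEY})
    return f"https://newsapi.org/v2/everything?{params}"

# Symbols containing special characters are now escaped safely.
print(build_news_url("BRK.B"))
```

With httpx you can skip manual URL building entirely by passing `params={"q": symbol, "apiKey": NEWS_API_KEY}` to `client.get`.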
This function fetches the news for any stock symbol. We now need a function to format the responses.
def format_news(article: dict) -> str:
    """Format a news article into a readable string."""
    # The News API returns "source" as an object, e.g. {"id": ..., "name": ...}.
    source = article.get('source') or {}
    return f"""
Title: {article.get('title', 'Unknown')}
Description: {article.get('description', 'No description available')}
Source: {source.get('name', 'Unknown')}
Link: {article.get('url', 'Unknown')}
"""
Putting together the MCP tool
We can now put together the MCP tool that responds to requests from MCP clients.
@mcp.tool()
async def get_news(symbol: str) -> str:
    """Get news for a stock symbol.

    Args:
        symbol: The symbol to get news for
    """
    data = await make_news_api_request(symbol)
    if not data or "articles" not in data:
        return "Unable to fetch articles or no articles found."
    if not data["articles"]:
        return "No articles found for this symbol."
    # Cap the output so a large result set doesn't flood the model's context.
    articles = [format_news(article) for article in data["articles"][:10]]
    return "\n---\n".join(articles)
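To see the shape of the tool’s output without hitting the network, you can replay its assembly logic against a canned payload. The dict below mirrors the News API response shape, and the helper is repeated inline so the snippet runs standalone:

```python
# Canned payload mirroring the News API response shape.
data = {
    "articles": [
        {"title": "Apple ships new chip", "description": "M-series update",
         "source": {"name": "Example Wire"}, "url": "https://example.com/a"},
        {"title": "AAPL earnings beat", "description": "Quarterly results",
         "source": {"name": "Example Wire"}, "url": "https://example.com/b"},
    ]
}

def format_news(article: dict) -> str:
    """Same helper as in news.py, inlined so this snippet is standalone."""
    source = article.get('source') or {}
    return f"""
Title: {article.get('title', 'Unknown')}
Description: {article.get('description', 'No description available')}
Source: {source.get('name', 'Unknown')}
Link: {article.get('url', 'Unknown')}
"""

# Mirror get_news's guard clauses and joining.
if not data or "articles" not in data or not data["articles"]:
    result = "No articles found for this symbol."
else:
    result = "\n---\n".join(format_news(a) for a in data["articles"])
print(result)
```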
Then, add these lines at the end:
if __name__ == "__main__":
    # Initialize and run the server
    mcp.run(transport='stdio')
That’s it. Our first server is ready. Let’s now build a second server to manipulate the database.
Creating the MCP Server to Connect to the DB
Next, let’s create another MCP server, one that connects to a SQLite database and allows LLMs to manipulate it.
touch symbols.py
Open symbols.py in your favorite code editor, and then add the following functions.
Initialize the MCP Server instance
Similar to how we did before, let’s import FastMCP and create an instance of the MCP server.
import sqlite3
import os
from typing import List, Optional
from mcp.server.fastmcp import FastMCP
# Initialize FastMCP server
mcp = FastMCP("my_stock_symbols")
Connect to the DB
Next, let’s write a function to initialize and connect to the DB.
# Database file path
DB_PATH = os.path.join(os.path.dirname(__file__), "stocks.db")

def init_db() -> None:
    """Initialize the database with a stocks table if it doesn't exist."""
    conn = sqlite3.connect(DB_PATH)
    cursor = conn.cursor()
    # Create the stocks table if it doesn't exist
    cursor.execute('''
        CREATE TABLE IF NOT EXISTS stocks (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            symbol TEXT NOT NULL UNIQUE,
            company_name TEXT,
            sector TEXT,
            created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
        )
    ''')
    conn.commit()
    conn.close()

def get_db_connection():
    """Create a connection to the SQLite database."""
    return sqlite3.connect(DB_PATH)
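You can exercise the same schema in an in-memory database before wiring it into the server. A standalone sketch (`:memory:` replaces the file path so nothing touches disk):

```python
import sqlite3

# Same schema as stocks.db, created in memory for a quick check.
conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute('''
    CREATE TABLE IF NOT EXISTS stocks (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        symbol TEXT NOT NULL UNIQUE,
        company_name TEXT,
        sector TEXT,
        created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    )
''')
cursor.execute(
    "INSERT INTO stocks (symbol, company_name, sector) VALUES (?, ?, ?)",
    ("AAPL", "Apple Inc.", "Technology"),
)
conn.commit()

# The UNIQUE constraint on symbol is what the add function's
# IntegrityError handler relies on.
try:
    cursor.execute("INSERT INTO stocks (symbol) VALUES (?)", ("AAPL",))
except sqlite3.IntegrityError:
    print("duplicate rejected")

rows = cursor.execute(
    "SELECT symbol, company_name, sector FROM stocks ORDER BY symbol"
).fetchall()
print(rows)
conn.close()
```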
Add a Stock
The powerful aspect of using LLMs is that you can use them to manipulate your database without creating UIs or admin systems.
Let’s create a function to add stock symbols.
def add_stock_symbol(symbol: str, company_name: Optional[str] = None, sector: Optional[str] = None) -> str:
    """
    Add a new stock symbol to the database.

    Args:
        symbol: The stock symbol to add.
        company_name: Optional company name.
        sector: Optional sector.

    Returns:
        A human-readable status message.
    """
    try:
        conn = get_db_connection()
        cursor = conn.cursor()
        cursor.execute(
            "INSERT INTO stocks (symbol, company_name, sector) VALUES (?, ?, ?)",
            (symbol, company_name, sector)
        )
        conn.commit()
        conn.close()
        return f"Added stock symbol {symbol} to the database."
    except sqlite3.IntegrityError:
        # Symbol already exists (UNIQUE constraint)
        return f"Stock symbol {symbol} already exists in the database."
    except Exception:
        return f"Failed to add stock symbol {symbol} to the database."
Fetch Stocks
Now, we can add a function to fetch all stock symbols present in the database.
def get_stock_symbols() -> List[tuple]:
    """
    Retrieve all stock symbols and their details from the database.

    Returns:
        List of tuples containing (symbol, company_name, sector).
    """
    conn = get_db_connection()
    cursor = conn.cursor()
    cursor.execute("SELECT symbol, company_name, sector FROM stocks ORDER BY symbol")
    stocks = cursor.fetchall()
    conn.close()
    return stocks
We also need a utility function to format the response.
def format_symbols(symbol: str, company_name: str, sector: str) -> str:
    """Format a stock record into a readable string."""
    return f"""
Symbol: {symbol}
Company Name: {company_name}
Sector: {sector}
"""
Function to Delete a Stock
Let’s also add a function that allows us to delete stock symbols from the database.
def remove_stock_symbol(symbol: str) -> str:
    """
    Remove a stock symbol from the database.

    Args:
        symbol: The stock symbol to remove.

    Returns:
        A human-readable status message.
    """
    try:
        conn = get_db_connection()
        cursor = conn.cursor()
        cursor.execute("DELETE FROM stocks WHERE symbol = ?", (symbol,))
        removed = cursor.rowcount
        conn.commit()
        conn.close()
        # rowcount tells us whether the DELETE actually matched a row.
        if removed == 0:
            return f"Stock symbol {symbol} was not found in the database."
        return f"Removed stock symbol {symbol} from the database."
    except Exception:
        return f"Failed to remove stock symbol {symbol} from the database."
Create MCP Server Tools
Finally, let’s create the MCP Server tools that will be the interface to the LLMs.
@mcp.tool()
def get_my_stock_symbols() -> str:
    """Get information about all stock symbols in the database."""
    stocks = get_stock_symbols()
    if not stocks:
        return "No stock symbols found in the database."
    formatted_stocks = []
    for symbol, company_name, sector in stocks:
        formatted_stocks.append(format_symbols(
            symbol,
            company_name or "Not specified",
            sector or "Not specified"
        ))
    return "\n---\n".join(formatted_stocks)

@mcp.tool()
def add_my_stock_symbol(symbol: str, company_name: Optional[str] = None, sector: Optional[str] = None) -> str:
    """Add a new stock symbol to the database."""
    return add_stock_symbol(symbol, company_name, sector)

@mcp.tool()
def remove_my_stock_symbol(symbol: str) -> str:
    """Remove a stock symbol from the database."""
    return remove_stock_symbol(symbol)
Finalize the Server
Finally, add these lines to initialize the DB and run the server.
init_db()

if __name__ == "__main__":
    # Initialize and run the server
    mcp.run(transport='stdio')
That’s it! Our second MCP server is also ready. Now, let’s connect it to an LLM and see how that works.
Building the MCP Client
We now have two MCP servers - one that fetches news around stock symbols, and another that allows us to manipulate a database of stock symbols.
The easiest next step would be to use Claude Desktop as the LLM. Anthropic has made it really simple to use these servers with Claude.
However, to explain the full potential of MCP, we will also showcase how it can be used with open source LLMs.
Claude Desktop as MCP Client
To use the MCP servers with Claude Desktop, install it first, and then modify the claude_desktop_config.json file.
On macOS
code ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows
code $env:AppData\Claude\claude_desktop_config.json
Then add the MCP servers to it:
{
  "mcpServers": {
    "financial_news": {
      "command": "uv",
      "args": [
        "--directory",
        "/ABSOLUTE/PATH/TO/FOLDER/financial_news_tracker",
        "run",
        "news.py"
      ]
    },
    "my_stock_symbols": {
      "command": "uv",
      "args": [
        "--directory",
        "/ABSOLUTE/PATH/TO/FOLDER/financial_news_tracker",
        "run",
        "symbols.py"
      ]
    }
  }
}
Note: if you encounter errors, try using the full path to the uv executable.
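A malformed config is a common reason servers silently fail to appear, so it’s worth validating the JSON before restarting Claude Desktop. A quick sketch (the config is inlined here as a string; in practice you would read your actual config file):

```python
import json

# Inlined config for illustration; read your real
# claude_desktop_config.json from disk in practice.
config_text = '''
{
  "mcpServers": {
    "financial_news": {
      "command": "uv",
      "args": ["--directory", "/ABSOLUTE/PATH/TO/FOLDER/financial_news_tracker",
               "run", "news.py"]
    }
  }
}
'''

config = json.loads(config_text)  # raises ValueError on malformed JSON
for name, server in config["mcpServers"].items():
    assert "command" in server and "args" in server, f"{name} is incomplete"
print("config OK:", list(config["mcpServers"]))
```

From a shell, `python3 -m json.tool path/to/claude_desktop_config.json` gives the same syntax check.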
Once done, restart Claude Desktop and you will see the MCP servers listed. Click the hammer icon to view the available tools.
Now you can run natural language queries, such as asking Claude to add a stock symbol to your portfolio, or to fetch the latest news on it.
As you can see from the responses, Claude uses the MCP server tools to complete the tasks.
Great — so our setup works. Now, let’s build an MCP Client on an open-source LLM.
Open Source LLMs as MCP Client
Since MCP is a protocol, it can be used with any open-source LLM.
We’ll use Ollama to deploy a Gemma-3 model, and then use it as an MCP client. Let’s start.
Deploy LLM
First, download and install Ollama, if you haven’t already. Then, pull the Gemma 3 4B model.
ollama run gemma3:4b
Install dependencies
Next, in a new directory, set up a virtual environment and install dolphin-mcp.
mkdir mcp_client
cd mcp_client
python3 -m venv .venv
source .venv/bin/activate
pip install dolphin-mcp
Clone the repository and install dependencies
Then, clone the dolphin-mcp repository.
git clone https://github.com/cognitivecomputations/dolphin-mcp.git
cd dolphin-mcp
Then install dependencies.
pip install -e .
pip install lmstudio
Create a .env file
Now, create the .env file and set up whatever API keys you want to use MCP with.
cp .env.example .env
vim .env # use any editor
So, in case you want to use OpenAI’s GPT-4o as the model behind your MCP client, you can set up the API keys here.
Create the MCP config JSON
Next, edit the mcp_config.json. The contents will look like this:
{
  "mcpServers": {
    "financial_news": {
      "command": "/Users/soum/.local/bin/uv",
      "args": [
        "--directory",
        "/ABSOLUTE/PATH/TO/FOLDER/financial_news_tracker",
        "run",
        "news.py"
      ]
    },
    "my_stock_symbols": {
      "command": "/Users/soum/.local/bin/uv",
      "args": [
        "--directory",
        "/ABSOLUTE/PATH/TO/FOLDER/financial_news_tracker",
        "run",
        "symbols.py"
      ]
    }
  },
  "models": [
    {
      "title": "gemma3",
      "provider": "ollama",
      "model": "gemma3:4b"
    }
  ]
}
You can see that we have added Gemma3 as the model in the models section. You can add any other model of your choosing here.
Run the MCP Client
You can now test the client using the CLI.
dolphin-mcp-cli --model gemma3 --config mcp_config.json "which symbols are present in the database"
Write a custom client app
You can now use this MCP client in your app and build agents. Here’s some example code:
import asyncio
from dolphin_mcp import run_interaction

async def main():
    result = await run_interaction(
        user_query="Latest news on AAPL",
        model_name="gemma3",
        config_path="mcp_config.json",
        quiet_mode=False  # Optional, defaults to False
    )
    print(result)

# Run the async function
asyncio.run(main())
That’s it! You now have a flexible MCP Server/Client setup that works.
Applications of Model Context Protocol (MCP)
Now that you’ve seen how MCP works, let’s look at what you can actually build with it. Once your AI agent has structured real-time context from your tools, the possibilities multiply fast.
Here are a few practical, high-impact ideas that you can start building.
AI Analyst for Live Business Data
Skip the dashboards: your agent can pull live metrics from Postgres, Spark, Elasticsearch, or APIs via MCP.
You can build it to compare sales trends, explain anomalies, and deliver insights in plain English.
AI Production Assistant for Manufacturing
Hook into MES and ERP systems using MCP, and let your agent answer questions like:
“Why did Line 4 pause yesterday?” You can use it to analyze machine logs, detect anomalies, and link to related maintenance actions, all through natural language.
AI QA & Insights Bot for Support
Connect to Zendesk, Intercom, or your CRM and let your agent surface ticket trends, flag product bugs, or suggest help docs. Ask: “Are complaints rising after our latest update?” The agent can cluster issues and give you root-cause insights.
Inventory Analyst for Logistics & Supply Chain
MCP makes it easy to connect databases, shipment trackers, and vendor systems.
Your agent can monitor stock levels, predict delays, and answer: “Which suppliers are consistently late?” You can get it to respond with ranked lists and actionable insights.
Final Notes
Hopefully, this walkthrough will get you started with building MCP-powered AI agents. With MCP, you can keep your agent logic clean, your integrations modular, and your system future-proof. Whether you're prototyping something lightweight or designing a production-grade AI system, this protocol gives you the right foundation.
In the next episode of this newsletter, we will return with another exciting project that you can dive into. Happy coding!