Steps to Build a Multi-Tool Agent Using Google’s ADK
Learn how to build a multi-tool agent using Google's ADK. Basic Python programming knowledge required. Implementation time: 2 hours.
Last week, we built an MCP-powered agent. This week, we will tackle an even more powerful framework: Google’s Agent Development Kit (ADK), which was released just a week ago!
Google’s Agent Development Kit (ADK) is a powerful new open-source framework designed to simplify the creation of intelligent, LLM-powered agents. You can even create ‘agent teams’ - or multi-agent systems (MAS) - which team up to solve problems.
This week’s walkthrough will guide you through building an agent using Google’s ADK.
Here’s what you will end up learning in the process:
How Google’s ADK works
How to use LiteLLM to call any LLM in your application
How to use LLM to convert natural language queries into SQL queries
Steps you can take to grow the agent into a powerful application
The implementation will take you a maximum of 2 hours. By the end of it, you will understand why agents will power numerous future interfaces.
We believe that by bringing natural language interfaces to everyday tasks, (Agentic) AI will completely transform human-computer interaction (HCI), much like the iPhone (and its touchscreen) did for mobile phones.
Go through this tutorial, feel free to modify it to build your own agent, and publish it on GitHub. Don’t forget to forward this to your friends, colleagues, or your team if you think they will benefit from it!
Let’s start!
What We Will Build
We will build a personal ‘Siri’- or ‘Alexa’-like agent that can:
News: Get news on any topic, for any date
Weather: Get weather information for any location
Journaling: Save notes that you share with it, for any date
History: Retrieve notes that you have saved
Track health: If your notes contain any health or diet-related information, it will extract it and save it in the database
Health summary: If you have saved health data, you can ask for it again at a later date.
Phew! That’s a lot in just 2 hours – but that’s the power of modern AI. You get superpowers if you learn it! You can grow this easily, as we will show you how to make it modular.
Prerequisites
Let’s start. First, you need to create a Python virtual environment.
1- Create a virtual environment
Open your terminal, create a working directory, and do this:
python3 -m venv .venv
source .venv/bin/activate
Now, create a subdirectory and change directory:
mkdir personal_agent
cd personal_agent
2- Create a .env file
You will need a .env file to store the API keys. You will need keys for:
Google AI Studio (the Gemini key that powers the root agent)
OpenAI (you could use Gemini throughout, but we use OpenAI for the tools to show how easy multi-LLM systems are to build)
NewsAPI (newsapi.org, for the news tool)
OpenWeatherMap (for the weather tool)
Once you have them, save them in a .env file in your working directory (the personal_agent folder):
GOOGLE_GENAI_USE_VERTEXAI=FALSE
GOOGLE_API_KEY=<your_google_ai_studio_key>
OPENAI_API_KEY=<your_openai_api_key>
NEWS_API_KEY=<your_news_api_key>
OPENWEATHER_API_KEY=<your_openweather_api_key>
3- Create an __init__.py
Then, in the same directory (personal_agent), create an __init__.py file with the following:
from . import agent
4- Create a tools folder
Your tools will live in this folder. Create this:
mkdir tools
touch tools/__init__.py
Now we are set to code.
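For reference, here is the directory layout you will end up with by the end of this tutorial:

your_working_directory/
├── .venv/
└── personal_agent/
    ├── .env
    ├── __init__.py
    ├── agent.py          (created later)
    └── tools/
        ├── __init__.py
        ├── web_tool.py   (created later)
        └── notes_tool.py (created later)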
Implementation Steps
Ok - first, we will build the tools for fetching news and weather. This is similar to what you did in the last tutorial - except a bit more advanced.
Web API Tools
We will first create the function to fetch news on any topic, for any date. Create a file called web_tool.py in the tools folder.
1- Import statements
Add these import statements:
"""Tools module for the personal agent."""
import logging
from datetime import datetime, timedelta
import requests
from typing import Dict, Optional
import os
from dotenv import load_dotenv
# Load environment variables
load_dotenv()
logger = logging.getLogger(__name__)
# API Configuration
WEATHER_API_KEY = os.getenv("OPENWEATHER_API_KEY")
NEWS_API_KEY = os.getenv("NEWS_API_KEY")
2- Function to fetch news
We will now create a function that calls the News API and fetches news on a topic.
def search_news(topic: str, from_date: Optional[str] = None, to_date: Optional[str] = None) -> Dict:
"""
Search for news articles about a specific topic within a date range.
This function uses the News API to fetch articles related to the specified topic.
If no date range is provided, it defaults to the last 7 days.
Results are sorted by publication date (newest first).
Args:
topic (str): Topic to search for in news articles
from_date (Optional[str]): Start date in YYYY-MM-DD format (inclusive)
to_date (Optional[str]): End date in YYYY-MM-DD format (inclusive)
Returns:
Dict: A dictionary containing:
- status (str): "success" or "error"
- total_results (int): Total number of articles found (if successful)
- articles (List[Dict]): List of article objects with the following fields:
- title (str): Article title
- description (str): Brief description of the article
- url (str): Link to the full article
- published_at (str): Publication date and time
- source (str): Name of the news source
- author (str): Article author
- message (str): Error message (if status is "error")
Raises:
Exception: Handled internally, returns error status with message
"""
try:
# Set default date range if not provided
if not from_date and not to_date:
to_date = datetime.now().strftime("%Y-%m-%d")
from_date = (datetime.now() - timedelta(days=7)).strftime("%Y-%m-%d")
logger.info(f"No date range provided, using default range: {from_date} to {to_date}")
# Build the query
news_url = "https://newsapi.org/v2/everything"
params = {
"q": topic,
"apiKey": NEWS_API_KEY,
"language": "en",
"sortBy": "publishedAt"
}
if from_date:
params["from"] = from_date
if to_date:
params["to"] = to_date
response = requests.get(news_url, params=params)
data = response.json()
if response.status_code != 200:
return {
"status": "error",
"message": f"News API error: {data.get('message', 'Unknown error')}"
}
# Format the articles
articles = []
for article in data.get("articles", []):
articles.append({
"title": article["title"],
"description": article["description"],
"url": article["url"],
"published_at": article["publishedAt"],
"source": article["source"]["name"],
"author": article["author"]
})
return {
"status": "success",
"total_results": data["totalResults"],
"articles": articles
}
except Exception as e:
logger.error(f"Error searching news: {str(e)}")
return {
"status": "error",
"message": f"Failed to search news: {str(e)}"
}
A few things to note:
Docstring: The long docstring below the function signature is necessary. The agent framework uses it to figure out what this function does.
Dates: If no date range is provided, the function defaults to the last 7 days, ending today.
The return value is a dictionary with the articles in an array.
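Before moving on, you can sanity-check the function on its own. Here is a minimal sketch you can temporarily append to web_tool.py (it assumes your NEWS_API_KEY is set in .env and reuses the file’s existing imports):

# Quick manual test: python tools/web_tool.py (run from the personal_agent folder)
if __name__ == "__main__":
    result = search_news("artificial intelligence")
    if result["status"] == "success":
        print(f"Found {result['total_results']} articles")
        for article in result["articles"][:3]:
            print(f"- {article['published_at']}: {article['title']}")
    else:
        print(result["message"])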
3- Function to fetch weather info
We will also add a weather tool that can get weather for any location, for a date you specify.
def get_weather(date: str, location: str = "New York") -> Dict:
"""
Get weather information for a specific date and location.
This function retrieves historical weather data for the specified date and location
using the OpenWeather API. It first geocodes the location to get coordinates,
then fetches the weather data for those coordinates on the given date.
Args:
date (str): Date in YYYY-MM-DD format for which to retrieve weather data
location (str): City name to get weather for (default: "New York")
Returns:
Dict: Weather information including:
- status: "success" or "error"
- date: The requested date
- location: The requested location
- temperature: Dict containing current, feels_like, min, and max temperatures in Celsius
- conditions: Dict containing weather description, humidity, etc.
Raises:
No exceptions are raised as errors are returned in the response dictionary
with status="error" and an error message.
"""
try:
# Convert date string to datetime
if date:
target_date = datetime.strptime(date, "%Y-%m-%d")
else:
target_date = datetime.now()
# Get coordinates for the location
geocode_url = f"http://api.openweathermap.org/geo/1.0/direct?q={location}&limit=1&appid={WEATHER_API_KEY}"
geocode_response = requests.get(geocode_url)
geocode_data = geocode_response.json()
if not geocode_data:
return {
"status": "error",
"message": f"Location '{location}' not found"
}
lat = geocode_data[0]["lat"]
lon = geocode_data[0]["lon"]
# Get historical weather data
        weather_url = "https://api.openweathermap.org/data/3.0/onecall/timemachine"
params = {
"lat": lat,
"lon": lon,
"dt": int(target_date.timestamp()),
"appid": WEATHER_API_KEY,
"units": "metric" # Use metric units
}
response = requests.get(weather_url, params=params)
data = response.json()
if response.status_code != 200:
return {
"status": "error",
"message": f"Weather API error: {data.get('message', 'Unknown error')}"
}
# Extract relevant weather information
weather_data = data["data"][0] # Get the first (and only) data point
        weather_info = {
            "status": "success",
            "date": date,
            "location": location,
            "temperature": {
                "current": weather_data["temp"],
                "feels_like": weather_data["feels_like"],
                # The "timemachine" endpoint returns a single temperature reading,
                # so fall back to it if min/max are not present in the response
                "min": weather_data.get("temp_min", weather_data["temp"]),
                "max": weather_data.get("temp_max", weather_data["temp"])
            },
"conditions": {
"main": weather_data["weather"][0]["main"],
"description": weather_data["weather"][0]["description"],
"icon": weather_data["weather"][0]["icon"]
},
"wind": {
"speed": weather_data["wind_speed"],
"direction": weather_data["wind_deg"]
},
"humidity": weather_data["humidity"],
"pressure": weather_data["pressure"]
}
return weather_info
except Exception as e:
logger.error(f"Error fetching weather data: {str(e)}")
return {
"status": "error",
"message": f"Failed to fetch weather data: {str(e)}"
}
As you can see, the API calling tools are simple, and you can hook up any API call. Just add another function!
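For example, here is what a hypothetical get_exchange_rate tool could look like. It uses the free, keyless Frankfurter API, and the function itself is our own sketch (it reuses web_tool.py’s existing imports):

def get_exchange_rate(base: str, target: str) -> Dict:
    """
    Get the latest exchange rate between two currencies.

    Args:
        base (str): Base currency code, e.g. "USD"
        target (str): Target currency code, e.g. "EUR"

    Returns:
        Dict: A dictionary with status and rate, or an error message
    """
    try:
        # Frankfurter (frankfurter.app) is a free exchange-rate API with no key
        response = requests.get(
            "https://api.frankfurter.app/latest",
            params={"from": base, "to": target},
        )
        data = response.json()
        if response.status_code != 200:
            return {"status": "error", "message": "Exchange rate API error"}
        return {"status": "success", "base": base, "target": target, "rate": data["rates"][target]}
    except Exception as e:
        return {"status": "error", "message": f"Failed to fetch exchange rate: {str(e)}"}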
LLM Parsing and DB Manipulation Tools
We will now create a tool that allows us to parse notes, save them, retrieve them, and also track health data.
1- Create notes_tool.py and add import statements
Create a new file - notes_tool.py - in the tools folder (this matches the import we will use in agent.py later) and add the following:
import sqlite3
import logging
import json
from datetime import datetime
from typing import Dict, List, Optional, Tuple
from litellm import completion
from pydantic import BaseModel, Field
logger = logging.getLogger(__name__)
We will use SQLite as the database - but you can easily swap in another database if you choose. In a production application, this might be PostgreSQL or MySQL.
2- Create Pydantic structures
Let’s create structures for our data.
# Pydantic models for structured output
# Optional fields default to None so validation doesn't fail if the LLM omits them
class HealthData(BaseModel):
    calories_burnt: Optional[float] = Field(default=None, description="Calories burnt during exercise")
    calories_intake: Optional[float] = Field(default=None, description="Calories consumed")
    steps: Optional[int] = Field(default=None, description="Number of steps taken")
    weight: Optional[float] = Field(default=None, description="Weight in kg")
    sleep_hours: Optional[float] = Field(default=None, description="Hours of sleep")
    water_intake: Optional[float] = Field(default=None, description="Water intake in liters")
    mood: Optional[str] = Field(default=None, description="Mood rating (e.g., happy, stressed, tired)")

class StructuredNote(BaseModel):
    title: str = Field(description="A concise title for the note")
    content: str = Field(description="The main content of the note")
    category: str = Field(description="Category of the note (e.g., Work, Personal, Ideas, Tasks, Health)")
    tags: List[str] = Field(description="Relevant tags for the note")
    date: datetime = Field(description="The date when the note was created")
    health_data: Optional[HealthData] = Field(default=None, description="Health-related data extracted from the note")

class QueryParameters(BaseModel):
    category: Optional[str] = Field(default=None, description="Category to filter by")
    tags: List[str] = Field(default_factory=list, description="Tags to filter by")
    search_text: Optional[str] = Field(default=None, description="Text to search in title and content")
    date_range: Optional[Tuple[datetime, datetime]] = Field(default=None, description="Date range to filter by")
These Pydantic classes validate the LLM output and coerce it into a fixed JSON structure.
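To see what this buys us, here is a small illustrative check (the sample values are made up): if the LLM returns a JSON string matching the schema, Pydantic parses and type-checks it in one call.

sample = (
    '{"title": "Morning run", "content": "Ran 5 km before work", '
    '"category": "Health", "tags": ["running"], "date": "2025-04-20", '
    '"health_data": {"calories_burnt": 300, "mood": "happy"}}'
)
note = StructuredNote.parse_raw(sample)
print(note.title, note.date, note.health_data.calories_burnt)
# -> Morning run 2025-04-20 00:00:00 300.0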
3- Database tables
Let’s create tables where we will store data:
# Database schema
CREATE_NOTES_TABLE = """
CREATE TABLE IF NOT EXISTS notes (
id INTEGER PRIMARY KEY AUTOINCREMENT,
title TEXT NOT NULL,
content TEXT NOT NULL,
category TEXT,
tags TEXT, -- Stored as JSON array
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
date TIMESTAMP DEFAULT CURRENT_TIMESTAMP
)
"""
CREATE_HEALTH_DATA_TABLE = """
CREATE TABLE IF NOT EXISTS health_data (
id INTEGER PRIMARY KEY AUTOINCREMENT,
note_id INTEGER,
calories_burnt REAL,
calories_intake REAL,
steps INTEGER,
weight REAL,
sleep_hours REAL,
water_intake REAL,
mood TEXT,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (note_id) REFERENCES notes(id)
)
"""
And a function to create the tables:
def init_db():
"""Initialize the SQLite database."""
conn = sqlite3.connect('notes.db')
cursor = conn.cursor()
cursor.execute(CREATE_NOTES_TABLE)
cursor.execute(CREATE_HEALTH_DATA_TABLE)
conn.commit()
conn.close()
init_db()
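If you want to confirm the tables exist, here is a quick check that uses only the standard library:

# List the tables in notes.db (SQLite may also show internal tables
# such as sqlite_sequence once AUTOINCREMENT has been used)
conn = sqlite3.connect('notes.db')
print(conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'"
).fetchall())
conn.close()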
4- Prompt to get structured data from any Note
Let’s now create an extensive prompt to fetch structured data.
# Create LLM prompt to return structured output
NOTE_STRUCTURING_PROMPT = """You are a helpful assistant that structures raw notes into a standardized format.
Extract a title, categorize the note, identify relevant tags, and set the date: use the date mentioned in the note (today is {today}), or default to today's date if none is mentioned.
If the note contains health-related information, infer the following data to the best of your ability:
- Calories burnt during exercise
- Calories consumed
- Number of steps taken
- Weight
- Hours of sleep
- Water intake
- Mood
Return the response in the following JSON format:
{{
"title": "string",
"content": "string",
"category": "string",
"tags": ["string"],
"date": "YYYY-MM-DD",
"health_data": {{
"calories_burnt": number or null,
"calories_intake": number or null,
"steps": number or null,
"weight": number or null,
"sleep_hours": number or null,
"water_intake": number or null,
"mood": "string" or null
}}
}}
Raw note:
{raw_note}"""
A better approach would have been to use the Structured Outputs API from OpenAI or another LLM provider. But since we want the ability to switch LLMs, we have kept the response structure in the prompt itself. This method works across a wider range of LLMs.
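For reference, the Structured Outputs route would look roughly like this. Treat it as a sketch: it assumes a recent LiteLLM version that accepts a Pydantic model as response_format, and a model that supports structured outputs.

response = completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
    response_format=StructuredNote,  # forwarded to the provider as a JSON schema
    temperature=0
)
structured_note = StructuredNote.parse_raw(response.choices[0].message.content)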
5- Function to convert user note into structured data
We can now write a function that converts the user’s notes into structured data:
def structure_note_with_llm(raw_note: str) -> Dict:
"""
Use LLM to structure the raw note into a standardized format.
Args:
raw_note (str): The raw text of the note
Returns:
Dict: Structured note with title, content, category, and tags
"""
try:
# Format the prompt
prompt = NOTE_STRUCTURING_PROMPT.format(
today=datetime.now().strftime("%Y-%m-%d"),
raw_note=raw_note
)
# Get LLM response
response = completion(
model="gpt-4o-mini",
messages=[{"role": "user", "content": prompt}],
temperature=0
)
        # Parse the response (the model may wrap the JSON in markdown fences,
        # so extract just the JSON object before validating)
        raw = response.choices[0].message.content
        raw = raw[raw.find("{"):raw.rfind("}") + 1]
        structured_note = StructuredNote.parse_raw(raw)
logger.info(f"Raw note: {raw_note}")
logger.info(f"Structured note: {structured_note}")
return {
"title": structured_note.title,
"content": structured_note.content,
"category": structured_note.category,
"tags": structured_note.tags,
"date": structured_note.date,
"health_data": structured_note.health_data
}
except Exception as e:
logger.error(f"Error structuring note with LLM: {str(e)}")
# Fallback to basic structure if LLM fails
return {
"title": "Untitled Note",
"content": raw_note,
"category": "General",
"tags": [],
"date": datetime.now(),
"health_data": None
}
We are using the completion function provided by LiteLLM. This means that if you want to switch the model, you simply change the model string - you are not tied to a specific LLM.
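For instance, under LiteLLM’s provider-prefix convention, switching providers is just a different model string (assuming the matching API key is set in your environment):

# Same completion() call, different providers - just swap the model string:
#   "gpt-4o-mini"                          -> OpenAI
#   "gemini/gemini-2.0-flash"              -> Google
#   "anthropic/claude-3-5-sonnet-20240620" -> Anthropic
response = completion(
    model="gemini/gemini-2.0-flash",
    messages=[{"role": "user", "content": prompt}],
    temperature=0
)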
6- Function to save a note
Let’s now write a function to save the note:
def save_note(raw_note: str) -> Dict:
"""
Save a note to the database after structuring it with LLM.
Args:
raw_note (str): The raw text of the note
Returns:
Dict: Status and message of the operation
"""
try:
# Structure the note using LLM
structured_note = structure_note_with_llm(raw_note)
# Connect to database
conn = sqlite3.connect('notes.db')
cursor = conn.cursor()
# Insert the note
cursor.execute("""
INSERT INTO notes (title, content, category, tags, date)
VALUES (?, ?, ?, ?, ?)
""", (
structured_note['title'],
structured_note['content'],
structured_note['category'],
json.dumps(structured_note['tags']),
structured_note['date']
))
note_id = cursor.lastrowid
# If health data exists, save it
if structured_note['health_data']:
health_data = structured_note['health_data']
cursor.execute("""
INSERT INTO health_data (
note_id, calories_burnt, calories_intake, steps,
weight, sleep_hours, water_intake, mood
)
VALUES (?, ?, ?, ?, ?, ?, ?, ?)
""", (
note_id,
health_data.calories_burnt,
health_data.calories_intake,
health_data.steps,
health_data.weight,
health_data.sleep_hours,
health_data.water_intake,
health_data.mood
))
conn.commit()
conn.close()
return {
"status": "success",
"message": "Note saved successfully",
"note_id": note_id
}
except Exception as e:
logger.error(f"Error saving note: {str(e)}")
return {
"status": "error",
"message": f"Failed to save note: {str(e)}"
}
In the function above, we are checking if health data exists in the note, and if so, then saving that separately.
For instance, if the user says in their note:
“Yesterday was a sunny day, so I ran 5 km.”
In this note, health data is already present. So we go a step beyond normal note-taking tools: we parse it out and save it separately.
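For that note, the structured output might look roughly like this (the date, tags, and calorie estimate are illustrative - the LLM infers them):

{
  "title": "Sunny Day Run",
  "content": "Yesterday was a sunny day, so I ran 5 km.",
  "category": "Health",
  "tags": ["running", "exercise"],
  "date": "2025-04-19",
  "health_data": {
    "calories_burnt": 300,
    "calories_intake": null,
    "steps": null,
    "weight": null,
    "sleep_hours": null,
    "water_intake": null,
    "mood": null
  }
}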
That’s it with the note saving! Now on to query handling.
7- Prompt to convert the user query into a structured query
The user will be typing natural language text, so we need to also parse the user’s query into a structured format, so we can run SQL queries.
Note - we could have used Vector Databases here – but with LLMs becoming increasingly powerful with text to SQL, we can simply stick to good-old SQL for this tutorial. You can easily change this to use a vector DB (or use pgvector with PostgreSQL).
QUERY_PROCESSING_PROMPT = """You are a helpful assistant that converts natural language queries into structured search parameters.
Extract categories, tags, search text, and date ranges from the query. If the query is about a specific date, use the date as the date range.
Today is {today}.
Return the response in the following JSON format:
{{
"category": "string or null",
"tags": ["string"],
"search_text": "string or null",
"date_range": ["YYYY-MM-DD", "YYYY-MM-DD"] or null
}}
Query:
{query}"""
8- Function to parse user query
We will now create a utility function to parse the user query:
def process_natural_language_query(query: str) -> Dict:
"""
Use LLM to convert natural language query into SQL query parameters.
Args:
query (str): Natural language query
Returns:
Dict: Structured query parameters
"""
try:
# Format the prompt
prompt = QUERY_PROCESSING_PROMPT.format(
today=datetime.now().strftime("%Y-%m-%d"),
query=query
)
# Get LLM response
response = completion(
model="gpt-4o-mini",
messages=[{"role": "user", "content": prompt}],
temperature=0
)
        # Parse the response (again stripping any markdown fences around the JSON)
        raw = response.choices[0].message.content
        raw = raw[raw.find("{"):raw.rfind("}") + 1]
        query_params = QueryParameters.parse_raw(raw)
logger.info(f"Query params: {query_params}")
return {
"category": query_params.category,
"tags": query_params.tags,
"search_text": query_params.search_text,
"date_range": query_params.date_range
}
except Exception as e:
logger.error(f"Error processing query with LLM: {str(e)}")
# Fallback to basic search if LLM fails
return {
"category": None,
"tags": [],
"search_text": query,
"date_range": None
}
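As a rough illustration of what the parser produces, a query like “Show me my health notes from last week” might come back as follows (exact values depend on the model):

params = process_natural_language_query("Show me my health notes from last week")
# e.g. {"category": "Health", "tags": [], "search_text": None,
#       "date_range": (datetime(2025, 4, 14, 0, 0), datetime(2025, 4, 20, 0, 0))}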
9- Query DB to retrieve notes
We can now write a function to query the DB and retrieve notes:
def retrieve_notes(query: str) -> Dict:
"""
Retrieve notes based on natural language query.
Args:
query (str): Natural language query
Returns:
Dict: Status and retrieved notes
"""
try:
# Process the natural language query
query_params = process_natural_language_query(query)
# Connect to database
conn = sqlite3.connect('notes.db')
cursor = conn.cursor()
# Build the query
sql_query = """
SELECT n.*, h.calories_burnt, h.calories_intake, h.steps,
h.weight, h.sleep_hours, h.water_intake, h.mood
FROM notes n
LEFT JOIN health_data h ON n.id = h.note_id
WHERE 1=1
"""
params = []
if query_params['category']:
sql_query += " AND n.category = ?"
params.append(query_params['category'])
if query_params['tags']:
sql_query += " AND n.tags LIKE ?"
params.append(f"%{query_params['tags'][0]}%")
if query_params['search_text']:
sql_query += " AND (n.title LIKE ? OR n.content LIKE ?)"
params.extend([f"%{query_params['search_text']}%"] * 2)
if query_params['date_range']:
sql_query += " AND n.date BETWEEN ? AND ?"
params.extend(query_params['date_range'])
# Execute query
cursor.execute(sql_query, params)
notes = cursor.fetchall()
# Format results
formatted_notes = []
for note in notes:
health_data = None
            if any(note[8:15]):  # If any of the joined health columns is populated
health_data = {
"calories_burnt": note[8],
"calories_intake": note[9],
"steps": note[10],
"weight": note[11],
"sleep_hours": note[12],
"water_intake": note[13],
"mood": note[14]
}
formatted_notes.append({
"id": note[0],
"title": note[1],
"content": note[2],
"category": note[3],
"tags": json.loads(note[4]),
"created_at": note[5],
"updated_at": note[6],
"date": note[7],
"health_data": health_data
})
conn.close()
return {
"status": "success",
"notes": formatted_notes
}
except Exception as e:
logger.error(f"Error retrieving notes: {str(e)}")
return {
"status": "error",
"message": f"Failed to retrieve notes: {str(e)}"
}
As you can see, the function parses the user query, and also fetches health data if available.
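As a quick end-to-end check of the notes tools, you can run something like this sketch (it assumes your OPENAI_API_KEY is set, and it will write to notes.db):

if __name__ == "__main__":
    print(save_note("Slept 8 hours and drank 2 liters of water today."))
    print(retrieve_notes("What did I note about sleep today?"))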
That’s it! We are done with all the tools we need.
Root Agent
Now, you can create the root agent that will bring it all together.
In the personal_agent folder, create a file agent.py.
1- Import statements
import os
import dotenv
from google.adk.agents import Agent
from google.adk.models.lite_llm import LiteLlm # For multi-model support
from .tools.notes_tool import save_note, retrieve_notes
from .tools.web_tool import get_weather, search_news
As you can see, we are importing the tools we created above. We will use them now.
2- Define the model and set keys
dotenv.load_dotenv()
os.environ["GEMINI_API_KEY"] = os.getenv('GOOGLE_API_KEY')
os.environ["OPENAI_API_KEY"] = os.getenv('OPENAI_API_KEY')
AGENT_MODEL = 'gemini/gemini-2.0-flash'
We are going to use the Gemini 2.0 Flash model as the root agent's model. Note - since we are using LiteLLM, you can easily switch this to another LLM.
We also set the API keys for the two LLM providers we are using.
3- Agent
Now we can set the final agent code:
root_agent = Agent(
name="personal_agent",
model=LiteLlm(model=AGENT_MODEL),
description=(
"Personal agent to save and retrieve notes, fetch news and weather."
),
    instruction=(
        "You are a helpful agent who can save and retrieve notes, "
        "fetch the news, and report the weather."
    ),
tools=[save_note, retrieve_notes, get_weather, search_news],
)
The more tools you add, the more powerful this agent becomes!
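For example, if you added the hypothetical get_exchange_rate tool sketched earlier to tools/web_tool.py, registering it takes one extra import and one list entry:

from .tools.web_tool import get_weather, search_news, get_exchange_rate

root_agent = Agent(
    name="personal_agent",
    model=LiteLlm(model=AGENT_MODEL),
    description=(
        "Personal agent to save and retrieve notes, fetch news, weather, "
        "and exchange rates."
    ),
    instruction=(
        "You are a helpful agent who can save and retrieve notes, fetch the "
        "news, report the weather, and look up exchange rates."
    ),
    tools=[save_note, retrieve_notes, get_weather, search_news, get_exchange_rate],
)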
Now, we can run the agent.
Running the Agent
With ADK, you get a great interface to run your agent and see what it is doing. We will use that. Once you have tested it, you can easily swap in your own UI (future tutorial).
Go to the terminal, make sure you are in the directory that contains the personal_agent folder (its parent), and type this:
adk web
You will get an output like this:
INFO: Started server process [24561]
INFO: Waiting for application startup.
+----------------------------------------------------------------------+
| ADK Web Server started |
| |
| For local testing, access at http://localhost:8000. |
+----------------------------------------------------------------------+
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
The ADK web dashboard is built on FastAPI. You can now head over to
http://localhost:8000
to launch your agent UI.
This will come up:
Now you can query it. Let’s ask for the latest news on a president who is in the news daily:
You will be able to see the tools the Agent is invoking. Let’s try saving a note:
Now let’s see if we can retrieve the notes:
As you can see, all our tools are working, and we have a functioning agent to work with.
ADK is incredibly powerful, and if you want to get a taste of what it takes to build agents, it’s the one to go with.
Future Notes
In less than 2 hours, we built a functioning agent with multiple tools that does several things:
Uses LLMs to transform text into SQL queries
Calls web APIs
Manipulates the database
Understands our language and responds
Works with any LLM you want
Now it's your turn! We are excited to see what you build with this. Start today, and forward this tutorial to any friend who might benefit from learning AI. Send us a message and show us what you have built!