MCP Agents & TOON Integration
How the TOON format optimizes communication for Model Context Protocol agents
What is Model Context Protocol (MCP)?
Model Context Protocol (MCP) is an open standard for connecting AI assistants to external data sources and tools. MCP agents are autonomous systems that use LLMs to make decisions, call tools, and interact with their environment.
Key MCP Components
- Hosts and clients (such as Claude Desktop or Cursor) embed the LLM and connect to servers.
- Servers expose capabilities over the protocol: tools (callable functions), resources (read-only data), and prompts (reusable templates).

MCP is gaining adoption, with implementations from Anthropic (Claude Desktop), Cursor, and open-source communities building servers for databases, APIs, and file systems.
Why TOON for MCP Agents?
1. Reduced Token Usage in Tool Arguments
MCP agents frequently call tools with structured data. TOON reduces the token overhead of passing arguments, allowing more tool calls per context window.
Before (JSON):

{
  "tool": "query_database",
  "arguments": {
    "records": [
      {"id": 1, "name": "Alice", "age": 30},
      {"id": 2, "name": "Bob", "age": 25},
      {"id": 3, "name": "Charlie", "age": 35}
    ]
  }
}

After (TOON):

tool: query_database
arguments:
  records[3]{id,name,age}:
    1,Alice,30
    2,Bob,25
    3,Charlie,35
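A minimal sketch of producing the TOON payload above programmatically, assuming a Python `toon` package that exposes `encode` with a `delimiter` option (the same call used in the server example later on this page):

```python
# Minimal sketch: encoding tool arguments as TOON before a tool call.
# Assumes a Python `toon` package exposing `encode`, as in the server example below.
from toon import encode

records = [
    {"id": 1, "name": "Alice", "age": 30},
    {"id": 2, "name": "Bob", "age": 25},
    {"id": 3, "name": "Charlie", "age": 35},
]

# Uniform arrays of objects collapse to a header row plus one comma-delimited line per record.
toon_records = encode(records, delimiter=",")
print(toon_records)
```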
2. Faster Tool Response Parsing
When MCP servers return large datasets (database queries, API responses), TOON's compact format reduces parsing time and LLM processing overhead.
# MCP Server returns TOON
results[150]{user_id,email,signup_date,status}:
1001,alice@example.com,2024-01-15,active
1002,bob@example.com,2024-01-16,active
... (148 more rows)
The LLM can quickly scan the tabular structure instead of parsing a nested JSON array in which the same keys are repeated 150 times.
3. Multi-Agent Communication Efficiency
In multi-agent systems, agents pass messages and state between each other. TOON reduces the communication overhead, allowing more agents to collaborate within the same context window.
# Agent A → Agent B: Share discovered opportunities
agent: data_analyzer
to: decision_maker
timestamp: 2025-01-15T14:30:00Z
findings[4]{opportunity,confidence,expected_roi,risk_level}:
expand_asia,0.87,2.4x,medium
automate_ops,0.92,3.1x,low
new_product,0.73,4.2x,high
partnership,0.81,1.8x,low
Implementation Example: MCP Server with TOON
Python MCP Server Example
from collections import Counter

from mcp.server.fastmcp import FastMCP   # high-level server API from the Python MCP SDK
from toon import encode, decode           # TOON encoder/decoder

# Initialize the MCP server
mcp = FastMCP("database-query-server")

@mcp.tool()
async def query_users(filters: str) -> str:
    """Query users from the database; filters arrive and results return as TOON."""
    # Decode the TOON-encoded filters sent by the client (see the TypeScript example below)
    filter_list = decode(filters)
    # Fetch from the database (`db` is an application-specific client, shown as a placeholder)
    users = await db.query("SELECT * FROM users WHERE ...", filter_list)
    # Convert to TOON for efficient transmission back to the agent
    return encode(users, delimiter=",")

@mcp.tool()
async def analyze_data(toon_data: str) -> dict:
    """Receive TOON data from the agent and analyze it."""
    # Decode TOON back to Python objects (a list of dicts)
    records = decode(toon_data)
    # Perform a simple aggregate analysis
    return {
        "total": len(records),
        "avg_age": sum(r["age"] for r in records) / len(records),
        "top_city": Counter(r["city"] for r in records).most_common(1)[0][0],
    }

if __name__ == "__main__":
    mcp.run()

TypeScript MCP Client Example
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
import { encode, decode } from "toon-format";

const client = new Client({
  name: "agent-orchestrator",
  version: "1.0.0"
});

// Connect to the MCP server (here over stdio; the command launches the Python server above)
const transport = new StdioClientTransport({ command: "python", args: ["server.py"] });
await client.connect(transport);

// The agent calls a tool with TOON-encoded data
const filters = [
  { field: "status", operator: "=", value: "active" },
  { field: "age", operator: ">", value: 25 }
];

// Encode the filters as TOON
const toonFilters = encode(filters);

// Call the MCP tool
const result = await client.callTool({
  name: "query_users",
  arguments: {
    filters: toonFilters // Pass TOON instead of JSON
  }
});

// The result comes back in TOON format
const users = decode(result.content[0].text);
console.log(`Retrieved ${users.length} users with 40% fewer tokens`);

Real-World MCP + TOON Use Cases
🤖 Autonomous Database Agent
An MCP agent monitors database metrics, detects anomalies, and takes corrective actions. TOON reduces the token cost of streaming real-time metrics by 45%.
metrics[12]{timestamp,cpu_pct,mem_pct,disk_io,query_time}:
14:00,45.3,62.1,1234,89
14:01,47.8,63.4,1456,92
... (10 more samples)
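A minimal sketch of the agent-side check, assuming the same hypothetical `toon` package; the threshold and field names are illustrative:

```python
# Sketch: decode streamed TOON metrics and flag simple threshold anomalies.
# Assumes the hypothetical `toon` package; field names match the sample above.
from toon import decode

def find_cpu_spikes(toon_metrics: str, cpu_threshold: float = 80.0) -> list[dict]:
    samples = decode(toon_metrics)  # list of dicts with timestamp, cpu_pct, mem_pct, ...
    return [s for s in samples if float(s["cpu_pct"]) > cpu_threshold]
```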
💬 Multi-Agent Research Team
Three specialized agents (Researcher, Analyzer, Writer) collaborate to produce reports. TOON enables efficient state sharing between agents within a single context window.
- Researcher → Analyzer: Sources and data (TOON encoded)
- Analyzer → Writer: Insights and statistics (TOON encoded)
- Writer → User: Final report (Markdown)
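A minimal sketch of the first handoff (Researcher to Analyzer), again assuming the hypothetical `toon` package; the message fields mirror the agent-to-agent example above and are illustrative:

```python
# Sketch: the Researcher agent packages its findings as TOON before handing off.
# Assumes the hypothetical `toon` package; fields and values are illustrative.
from toon import encode

sources = [
    {"title": "Q4 market report", "url": "https://example.com/q4", "relevance": 0.91},
    {"title": "Competitor filing", "url": "https://example.com/10k", "relevance": 0.84},
]

message = {
    "agent": "researcher",
    "to": "analyzer",
    "sources": sources,
}

# The encoded message is placed in the Analyzer agent's context window.
handoff = encode(message, delimiter=",")
```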
🔧 Development Tool Agent
MCP agent analyzes codebases, suggests improvements, and executes refactoring. TOON efficiently represents file structures, dependencies, and code metrics.
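A minimal sketch of encoding per-file code metrics for such an agent, assuming the hypothetical `toon` package; the fields are illustrative:

```python
# Sketch: represent per-file code metrics compactly for a code-analysis agent.
# Assumes the hypothetical `toon` package; metric fields are illustrative.
from toon import encode

file_metrics = [
    {"path": "src/app.py", "loc": 412, "complexity": 18, "coverage_pct": 87.5},
    {"path": "src/db.py", "loc": 230, "complexity": 9, "coverage_pct": 92.1},
    {"path": "src/utils.py", "loc": 98, "complexity": 4, "coverage_pct": 100.0},
]

# Uniform records collapse to one header row plus one line per file.
print(encode(file_metrics, delimiter=","))
```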
Performance Benchmarks
| Scenario | JSON Tokens | TOON Tokens | Savings |
|---|---|---|---|
| Tool call with 50 records | 843 tokens | 412 tokens | 51.1% |
| Agent state transfer | 1,234 tokens | 687 tokens | 44.3% |
| Database query response (200 rows) | 3,456 tokens | 1,823 tokens | 47.2% |
| Multi-agent message batch (10 msgs) | 2,108 tokens | 1,134 tokens | 46.2% |
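To reproduce figures like these on your own payloads, here is a minimal sketch assuming the hypothetical `toon` package and the tiktoken tokenizer; actual savings depend on the tokenizer and the shape of your data:

```python
# Sketch: compare JSON vs. TOON token counts for a sample payload.
# Assumes the hypothetical `toon` package and the tiktoken library.
import json
import tiktoken
from toon import encode

def token_savings(records: list[dict], model: str = "gpt-4o") -> float:
    enc = tiktoken.encoding_for_model(model)
    json_tokens = len(enc.encode(json.dumps(records)))
    toon_tokens = len(enc.encode(encode(records, delimiter=",")))
    return 1 - toon_tokens / json_tokens

records = [{"id": i, "name": f"user{i}", "age": 20 + i % 40} for i in range(50)]
print(f"TOON saves {token_savings(records):.1%} of tokens on this payload")
```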
Getting Started with MCP + TOON
Install the MCP SDK:

# Python
pip install mcp

# JavaScript/TypeScript
npm install @modelcontextprotocol/sdk
Install the TOON library:

# Python
pip install toon-format

# JavaScript/TypeScript
npm install toon-format
Update your MCP server tools to return TOON-encoded data instead of JSON:
# Before
return json.dumps(records)

# After
return toon.encode(records)
Inform your agent that tool responses are in TOON format and provide parsing guidance in the system prompt.
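An illustrative snippet of that guidance, based on the TOON examples shown on this page (the wording is an assumption, not a fixed prompt):

```python
# Illustrative system-prompt snippet telling the agent how to read TOON tool output.
TOON_GUIDANCE = """\
Tool responses are encoded in TOON format.
A line like `results[150]{user_id,email,status}:` declares an array of 150 rows
with those fields; each following indented line is one row, comma-delimited.
Map values to fields positionally: the Nth value belongs to the Nth field in the header.
"""
```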
Convert Your Data to TOON Format
Try the online converter to see token savings for your MCP agent workflows