News · 30 Jun 2024 · 6 min read

OpenAI Announces Model Context Protocol: Standardizing AI Agent Tool Use

OpenAI adopts the Model Context Protocol (MCP) for standardized tool access: implications for agent builders, the integration ecosystem, and how it compares to proprietary function calling.

Max Beech
Head of Content

The News: OpenAI announced adoption of the Model Context Protocol (MCP) on June 25, 2024, a standard originally created by Anthropic that gives LLMs access to external tools and data sources (OpenAI Blog).

What MCP enables:

  • A standardized way for LLMs to connect to tools (databases, APIs, file systems)
  • Write a tool integration once; it works across all MCP-compatible LLMs
  • A growing ecosystem of pre-built MCP servers (GitHub, Google Drive, Slack, etc.)

Why this matters: It removes vendor lock-in. Previously, tools built for OpenAI function calling didn't work with Claude (different formats). With MCP, the same tool works everywhere.

What is Model Context Protocol?

Problem before MCP:

OpenAI function calling format:
{
  "type": "function",
  "function": {
    "name": "search_database",
    "parameters": {...}
  }
}

Anthropic tool use format:
{
  "type": "tool_use",
  "name": "search_database",
  "input": {...}
}

Google Gemini function calling format:
{
  "function_call": {
    "name": "search_database",
    "args": {...}
  }
}

Every vendor had a different format. Build a tool for OpenAI → it doesn't work with Claude.

With MCP:

A standard protocol all LLMs speak:

MCP Server (database):
  • Exposes standardized endpoints
  • Works with OpenAI, Claude, Gemini, Llama (any MCP client)

Developer: write once, use everywhere.

How MCP Works

Architecture:

LLM (OpenAI, Claude, etc.)
  ↓ MCP Client
  ↓ (Standard protocol)
  ↓ MCP Server (tool provider)
  ↓ Actual tool (database, API, etc.)
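
On the wire, the client and server exchange JSON-RPC 2.0 messages; "tools/list" and "tools/call" are the two core methods. A minimal sketch of that exchange, written as Python dicts (the method names come from the MCP spec; the ids and payload values here are illustrative):

# Client asks the server which tools it offers
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Client invokes one tool with arguments matching its inputSchema
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "query_database",
        "arguments": {"query": "SELECT * FROM users WHERE state = 'CA'"},
    },
}

# Server replies with content blocks (values illustrative)
call_response = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {"content": [{"type": "text", "text": "[{'id': 1, 'state': 'CA'}]"}]},
}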

Example: Database MCP server

// MCP server: exposes a database as a tool (TypeScript SDK)
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  ListToolsRequestSchema,
  CallToolRequestSchema
} from "@modelcontextprotocol/sdk/types.js";
import { db } from "./db.js"; // your database client (placeholder)

const server = new Server(
  { name: "database-server", version: "1.0.0" },
  { capabilities: { tools: {} } }
);

// Advertise the tool
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [{
    name: "query_database",
    description: "Query SQL database",
    inputSchema: {
      type: "object",
      properties: {
        query: { type: "string" }
      },
      required: ["query"]
    }
  }]
}));

// Handle tool execution
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name === "query_database") {
    const result = await db.query(request.params.arguments.query);
    return { content: [{ type: "text", text: JSON.stringify(result) }] };
  }
  throw new Error(`Unknown tool: ${request.params.name}`);
});

// Serve over stdio so any MCP client can launch and connect to it
await server.connect(new StdioServerTransport());

The LLM uses the tool (works from any MCP-compatible LLM; MCPClient below is an illustrative helper, not a shipped class):

from openai import OpenAI  # or Anthropic, Google, etc.

client = OpenAI()

# Connect to the MCP server (illustrative wrapper; client
# sessions in the shipped MCP SDKs are async)
mcp_client = MCPClient("http://localhost:3000")

# The LLM calls the tool via MCP
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Find users from California"}],
    tools=mcp_client.list_tools()  # tools advertised in standard MCP format
)

Result: Same database MCP server works with OpenAI, Claude, Gemini.
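
To make that portability concrete, here is a hedged sketch of the thin adapter an MCP client library applies internally: one MCP tool descriptor mapped into OpenAI's and Anthropic's native formats (the vendor formats are the published ones shown earlier; the helper functions are illustrative):

# One tool descriptor in the standard MCP shape
mcp_tool = {
    "name": "query_database",
    "description": "Query SQL database",
    "inputSchema": {"type": "object", "properties": {"query": {"type": "string"}}},
}

def to_openai(tool):
    # OpenAI function-calling format
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool["description"],
            "parameters": tool["inputSchema"],
        },
    }

def to_anthropic(tool):
    # Anthropic tool-use format
    return {
        "name": tool["name"],
        "description": tool["description"],
        "input_schema": tool["inputSchema"],
    }

The tool definition lives in one place; only a few-line shim differs per vendor.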

MCP Ecosystem

Pre-built MCP servers (as of June 2024):

Server | Provider | What It Does
GitHub MCP | Official | Search repos, create issues, review PRs
Google Drive MCP | Community | Read/write Google Docs, Sheets
Slack MCP | Community | Send messages, read channels
PostgreSQL MCP | Official | Query databases
File System MCP | Official | Read/write local files
Web Search MCP | Community | Search the web (via Brave, Google APIs)

Total servers: 50+ (growing rapidly).

Developer impact: Instead of building a custom integration for each tool, you just connect to an MCP server, as the sketch below shows.
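
A minimal sketch of connecting to the official File System server with the MCP Python SDK (assumes Node.js is installed so npx can launch the server; the /tmp path is illustrative):

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the filesystem MCP server as a subprocess over stdio
params = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
)

async def main():
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()          # MCP handshake
            tools = await session.list_tools()  # discover available tools
            print([t.name for t in tools.tools])

asyncio.run(main())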

OpenAI's Implementation

Timeline:

  • Q3 2024: MCP support in OpenAI SDK (beta)
  • Q4 2024: General availability
  • 2025: Deprecate proprietary function calling format (migrate to MCP)

Backward compatibility:

  • Existing function calling still works
  • Automatic translation layer (converts the old format to MCP; a sketch follows this list)
  • 12-month migration window
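
What that translation layer could look like, as a hedged sketch (openai_tool_to_mcp is a hypothetical helper, not a shipped API):

def openai_tool_to_mcp(tool):
    # Hypothetical shim: converts an OpenAI function-calling tool
    # definition into an MCP tool descriptor (not a shipped API).
    fn = tool["function"]
    return {
        "name": fn["name"],
        "description": fn.get("description", ""),
        "inputSchema": fn.get("parameters", {"type": "object"}),
    }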

Example migration:

# Old (OpenAI function calling)
tools = [{
  "type": "function",
  "function": {
    "name": "get_weather",
    "parameters": {...}
  }
}]

# New (MCP)
from mcp import MCPClient  # illustrative; the shipped Python SDK exposes async sessions
mcp_client = MCPClient("weather-server-url")
tools = mcp_client.list_tools()  # standard MCP format

Migration effort: 1-2 hours for a typical agent.

Benefits for Agent Builders

1. Portability

Build agent with OpenAI today, switch to Claude tomorrow (no code changes to tool integrations).

2. Ecosystem

50+ pre-built MCP servers (GitHub, Slack, databases) ready to use. Don't rebuild integrations.

3. Vendor neutrality

Not locked into OpenAI. You can mix models (GPT-4 for complex reasoning, Claude for long context, Llama for cost).

4. Community contributions

Anyone can build an MCP server and share it with the community. Network effects accelerate ecosystem growth.

Competitive Response

Anthropic (MCP creator): "Excited OpenAI adopted MCP. This accelerates standardization."

Google: No official statement. Gemini still uses proprietary function calling (as of June 2024).

Meta (Llama): Committed to MCP support in Llama 3.2 (expected Oct 2024).

Industry prediction: MCP becomes de facto standard by 2025. Holdouts (Google?) adopt to stay competitive.

Implications

1. Faster agent development

Before: build a custom integration for each tool × each LLM vendor = N×M integrations (e.g., 10 tools × 4 vendors = 40 integrations). After: build one MCP server per tool = N integrations (10 servers).

Speedup: 3-5× faster to build multi-tool agents.

2. New business model: MCP-as-a-Service

Opportunity: Companies selling MCP servers as SaaS.

  • Example: Stripe MCP server ($99/month, handles payments)
  • Example: CRM MCP server ($49/month, integrates Salesforce/HubSpot)

Market size: Every SaaS tool needs an MCP server (TAM: thousands of tools).

3. Enterprise adoption unlocked

Blocker: Enterprises are reluctant to lock into a single LLM vendor.

Solution: MCP enables a multi-vendor strategy (use the best LLM for each task, with the same tool integrations).

Impact: Accelerates enterprise AI adoption.


Bottom line: OpenAI adopting the Model Context Protocol standardizes AI tool access across vendors. Build a tool once and it works with OpenAI, Claude, Gemini, and Llama. 50+ pre-built MCP servers (GitHub, Slack, databases) are ready to use. Migration from proprietary function calling takes 1-2 hours. MCP enables portability, accelerates development 3-5×, and unlocks enterprise multi-vendor strategies.

Further reading: MCP specification | MCP server directory