Model Context Protocol (MCP): The Missing Link That Makes AI Actually Useful
Three months ago, I was juggling five different AI tools for my daily work. ChatGPT for writing, Claude for coding, a specialized AI for data analysis, another for email management, and yet another for calendar scheduling. Each lived in its own silo, unable to talk to the others or access my actual data.
Then I discovered Model Context Protocol (MCP), and everything changed.
Today, my self-hosted Ollama setup can read my emails, check my calendar, analyze spreadsheets, fetch live data from APIs, and even control my smart home devices. All through a single interface, using models that run entirely on my laptop.
If you've ever felt frustrated by AI tools that can't access your real data or work together, this post is for you. I'll explain MCP in simple terms, show you why it matters, and walk you through building your first MCP server that works with self-hosted AI models.
What Is MCP and Why Should You Care?
Think of MCP as a universal translator for AI systems. Just as USB standardized how devices connect to computers, MCP standardizes how AI models connect to external tools, data sources, and services.
The Problem MCP Solves
Before MCP, every AI application was like a locked room:
- ChatGPT: brilliant at conversation, but can't read your files
- Claude: excellent at coding, but can't access your database
- Local AI: runs on your machine, but can't connect to anything
- Specialized AI tools: great at specific tasks, but isolated from everything else
The Restaurant Analogy
Imagine you're running a restaurant with multiple chefs:
Without MCP: Each chef works in isolation
- Italian chef can only make pasta (but can't access the tomato supply)
- Indian chef knows spices (but can't check inventory)
- Pastry chef makes desserts (but doesn't know customer dietary restrictions)
With MCP: All chefs share a common communication system
- Any chef can check ingredients, inventory, and customer preferences
- They can coordinate complex meals together
- New chefs can plug into the same system immediately
That's exactly what MCP does for AI models and tools.
Real-World Examples That Make Sense
Let me show you practical scenarios where MCP transforms how AI works:
Example 1: The Productive Morning Routine
Before MCP:
- Check weather on phone
- Open calendar app to see meetings
- Open email to read overnight messages
- Switch to AI chat to ask for day planning help
- Manually type out all the information the AI needs
With MCP:
- Ask AI: "Plan my morning based on weather, calendar, and urgent emails"
- AI automatically:
  - Fetches weather from an API
  - Reads your calendar
  - Scans emails for urgency
  - Suggests an optimized schedule
  - Books follow-up meetings if needed
Example 2: Content Creation Workflow
Before MCP:
- Research topic manually in browser
- Export data to spreadsheet
- Copy-paste information to AI
- Get response, manually format it
- Upload to content management system
With MCP:
- AI researches topic using web search tools
- Analyzes data using spreadsheet tools
- Generates content using writing tools
- Formats using document tools
- Publishes using CMS tools
- All in one conversation flow
Example 3: Business Intelligence
Before MCP:
- Sales data trapped in CRM
- Marketing data in different platform
- Financial data in accounting software
- Manual export and analysis required
With MCP:
- AI connects to all data sources
- Generates comprehensive reports automatically
- Identifies trends across platforms
- Suggests actionable insights
How MCP Actually Works
The Technical Foundation
MCP is built on three core concepts:
- Servers: Provide capabilities (like reading files, accessing APIs)
- Clients: AI applications that use those capabilities
- Protocol: Standard way they communicate
Think of it like electrical outlets in your house:
- Servers = Power outlets (provide electricity)
- Clients = Devices (use electricity)
- Protocol = Standard plug design (ensures compatibility)
Communication Flow
Here's what happens when you ask AI to "check my calendar":
- AI receives request: "Check my calendar for today"
- AI identifies needed tool: Calendar MCP server
- AI sends MCP request: "Get events for 2025-08-10"
- MCP server processes: Connects to Google Calendar API
- Server returns data: List of today's meetings
- AI incorporates response: "You have 3 meetings today..."
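Under the hood, step 3 is a JSON-RPC 2.0 message. A minimal sketch of the request and response frames for a `tools/call` exchange — the tool name `get_events` and its argument are illustrative, not part of any real server:

```python
import json

# What the client (AI application) sends for step 3, "AI sends MCP request".
# Field names follow MCP's tools/call method; the tool itself is hypothetical.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_events",                    # hypothetical calendar tool
        "arguments": {"date": "2025-08-10"},
    },
}

# What the server sends back (step 5): a result whose content the model
# then folds into its answer (step 6).
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [{"type": "text", "text": "3 events found"}],
    },
}

wire = json.dumps(request)
print(wire)
```

Because every tool call uses this same envelope, a server written once works with any MCP client — which is the standardization point made below.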
Why This Matters
- Standardization: any AI can use any MCP server
- Security: servers control access and permissions
- Flexibility: add new capabilities without changing the AI
- Interoperability: tools work together seamlessly
Available MCP Servers (The Ecosystem)
The MCP ecosystem is growing rapidly. Here are categories of servers you can use today:
File and Document Management
- File System: Read/write local files
- Google Drive: Access cloud documents
- SharePoint: Corporate document management
- Notion: Knowledge base integration
Communication and Collaboration
- Gmail: Email reading and sending
- Slack: Team communication
- Discord: Community management
- Calendars: Google, Outlook, Apple
Development and Technical
- GitHub: Repository management
- Database: SQL query execution
- APIs: Web service integration
- Docker: Container management
Business and Productivity
- CRM: Customer data access
- Analytics: Business intelligence
- E-commerce: Sales platform integration
- Project Management: Task and timeline tools
Smart Home and IoT
- Home Assistant: Smart device control
- Weather: Real-time weather data
- Transportation: Travel and logistics
- Health: Fitness and medical data
Building Your First MCP Server
Let's build a simple but useful MCP server that provides weather information. This server will fetch real weather data and make it available to any MCP-compatible AI.
Prerequisites
You'll need:
- Python 3.8+ installed
- Basic familiarity with command line
- Text editor (VS Code, Sublime, or even Notepad)
- OpenWeatherMap API key (free at openweathermap.org)
Step 1: Project Setup
Create a new directory and set up the project:
# Create project directory
mkdir weather-mcp-server
cd weather-mcp-server
# Create virtual environment
python3 -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
# Create project structure
mkdir src
touch src/__init__.py
touch src/weather_server.py
touch requirements.txt
touch config.yaml
Step 2: Install Dependencies
Add these to requirements.txt:
mcp==0.9.0
requests==2.31.0
pydantic==2.5.0
python-dotenv==1.0.0
pyyaml==6.0.1
Install dependencies:
pip install -r requirements.txt
Step 3: Configuration Setup
Create config.yaml:
weather:
  api_key: "your_openweathermap_api_key_here"
  base_url: "http://api.openweathermap.org/data/2.5"
  default_units: "metric"  # metric, imperial, or standard (standard = Kelvin)

server:
  name: "Weather MCP Server"
  version: "1.0.0"
  description: "Provides current weather and forecast data"
Create a .env file for sensitive data:
OPENWEATHER_API_KEY=your_actual_api_key_here
Step 4: Core Server Implementation
Create the main server in src/weather_server.py:
import asyncio
import json
import logging
import os
from typing import Any, Dict, List
from urllib.parse import urlencode

import requests
import yaml
from dotenv import load_dotenv
from mcp.server import NotificationOptions, Server
from mcp.server.models import InitializationOptions
from mcp.server.stdio import stdio_server
from mcp.types import Resource, TextContent, Tool
from pydantic import BaseModel

# Load environment variables
load_dotenv()

# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


class WeatherConfig(BaseModel):
    api_key: str
    base_url: str = "http://api.openweathermap.org/data/2.5"
    default_units: str = "metric"


class WeatherMCPServer:
    def __init__(self, config_path: str = "config.yaml"):
        """Initialize the Weather MCP Server"""
        self.server = Server("weather-mcp-server")
        self.config = self._load_config(config_path)
        self._setup_tools()
        self._setup_resources()

    def _load_config(self, config_path: str) -> WeatherConfig:
        """Load configuration from file and environment"""
        try:
            with open(config_path, 'r') as f:
                config_data = yaml.safe_load(f)
            # Override with environment variable if available
            api_key = os.getenv('OPENWEATHER_API_KEY') or config_data['weather']['api_key']
            return WeatherConfig(
                api_key=api_key,
                base_url=config_data['weather']['base_url'],
                default_units=config_data['weather']['default_units']
            )
        except Exception as e:
            logger.error(f"Failed to load config: {e}")
            raise

    def _setup_tools(self):
        """Register available tools with the MCP server"""

        @self.server.list_tools()
        async def handle_list_tools() -> List[Tool]:
            """Return list of available tools"""
            return [
                Tool(
                    name="get_current_weather",
                    description="Get current weather for a specific city",
                    inputSchema={
                        "type": "object",
                        "properties": {
                            "city": {
                                "type": "string",
                                "description": "City name (e.g., 'London', 'New York, NY')"
                            },
                            "units": {
                                "type": "string",
                                "enum": ["metric", "imperial", "standard"],
                                "description": "Temperature units ('standard' returns Kelvin)",
                                "default": self.config.default_units
                            }
                        },
                        "required": ["city"]
                    }
                ),
                Tool(
                    name="get_weather_forecast",
                    description="Get 5-day weather forecast for a specific city",
                    inputSchema={
                        "type": "object",
                        "properties": {
                            "city": {
                                "type": "string",
                                "description": "City name (e.g., 'London', 'New York, NY')"
                            },
                            "units": {
                                "type": "string",
                                "enum": ["metric", "imperial", "standard"],
                                "description": "Temperature units ('standard' returns Kelvin)",
                                "default": self.config.default_units
                            }
                        },
                        "required": ["city"]
                    }
                )
            ]

        @self.server.call_tool()
        async def handle_call_tool(name: str, arguments: Dict[str, Any]) -> List[TextContent]:
            """Handle tool execution"""
            try:
                if name == "get_current_weather":
                    result = await self._get_current_weather(
                        arguments.get("city"),
                        arguments.get("units", self.config.default_units)
                    )
                elif name == "get_weather_forecast":
                    result = await self._get_weather_forecast(
                        arguments.get("city"),
                        arguments.get("units", self.config.default_units)
                    )
                else:
                    return [TextContent(
                        type="text",
                        text=f"Unknown tool: {name}"
                    )]
                return [TextContent(
                    type="text",
                    text=json.dumps(result, indent=2)
                )]
            except Exception as e:
                logger.error(f"Tool execution failed: {e}")
                return [TextContent(
                    type="text",
                    text=f"Error: {str(e)}"
                )]

    def _setup_resources(self):
        """Setup MCP resources"""

        @self.server.list_resources()
        async def handle_list_resources() -> List[Resource]:
            """Return list of available resources"""
            return [
                Resource(
                    uri="weather://current",
                    name="Current Weather",
                    description="Get current weather for any city",
                    mimeType="application/json"
                ),
                Resource(
                    uri="weather://forecast",
                    name="Weather Forecast",
                    description="Get 5-day weather forecast for any city",
                    mimeType="application/json"
                )
            ]

    async def _get_current_weather(self, city: str, units: str = "metric") -> Dict[str, Any]:
        """Fetch current weather data from OpenWeatherMap"""
        if not city:
            raise ValueError("City name is required")

        params = {
            "q": city,
            "appid": self.config.api_key,
            "units": units
        }
        url = f"{self.config.base_url}/weather?{urlencode(params)}"

        try:
            response = requests.get(url, timeout=10)
            response.raise_for_status()
            data = response.json()

            # Format the response for better readability
            formatted_data = {
                "city": data["name"],
                "country": data["sys"]["country"],
                "temperature": {
                    "current": data["main"]["temp"],
                    "feels_like": data["main"]["feels_like"],
                    "min": data["main"]["temp_min"],
                    "max": data["main"]["temp_max"],
                    "units": "°C" if units == "metric" else ("°F" if units == "imperial" else "K")
                },
                "weather": {
                    "main": data["weather"][0]["main"],
                    "description": data["weather"][0]["description"]
                },
                "humidity": f"{data['main']['humidity']}%",
                "pressure": f"{data['main']['pressure']} hPa",
                "visibility": f"{data.get('visibility', 'N/A')} meters",
                "wind": {
                    "speed": data["wind"]["speed"],
                    "direction": data["wind"].get("deg", "N/A")
                }
            }
            return formatted_data

        except requests.exceptions.RequestException as e:
            raise Exception(f"Failed to fetch weather data: {e}")
        except KeyError as e:
            raise Exception(f"Unexpected weather data format: {e}")

    async def _get_weather_forecast(self, city: str, units: str = "metric") -> Dict[str, Any]:
        """Fetch 5-day weather forecast from OpenWeatherMap"""
        if not city:
            raise ValueError("City name is required")

        params = {
            "q": city,
            "appid": self.config.api_key,
            "units": units
        }
        url = f"{self.config.base_url}/forecast?{urlencode(params)}"

        try:
            response = requests.get(url, timeout=10)
            response.raise_for_status()
            data = response.json()

            # Format forecast data
            forecasts = []
            for item in data["list"][:8]:  # Next 24 hours (8 x 3-hour intervals)
                forecasts.append({
                    "datetime": item["dt_txt"],
                    "temperature": {
                        "temp": item["main"]["temp"],
                        "feels_like": item["main"]["feels_like"],
                        "units": "°C" if units == "metric" else ("°F" if units == "imperial" else "K")
                    },
                    "weather": {
                        "main": item["weather"][0]["main"],
                        "description": item["weather"][0]["description"]
                    },
                    "humidity": f"{item['main']['humidity']}%",
                    "wind_speed": item["wind"]["speed"]
                })

            formatted_data = {
                "city": data["city"]["name"],
                "country": data["city"]["country"],
                "forecast_count": len(forecasts),
                "forecasts": forecasts
            }
            return formatted_data

        except requests.exceptions.RequestException as e:
            raise Exception(f"Failed to fetch forecast data: {e}")
        except KeyError as e:
            raise Exception(f"Unexpected forecast data format: {e}")

    async def run(self):
        """Run the MCP server"""
        logger.info("Starting Weather MCP Server...")
        async with stdio_server() as (read_stream, write_stream):
            await self.server.run(
                read_stream,
                write_stream,
                InitializationOptions(
                    server_name="weather-mcp-server",
                    server_version="1.0.0",
                    capabilities=self.server.get_capabilities(
                        notification_options=NotificationOptions(),
                        experimental_capabilities={}
                    )
                )
            )


def main():
    """Main entry point"""
    try:
        server = WeatherMCPServer()
        asyncio.run(server.run())
    except KeyboardInterrupt:
        logger.info("Server stopped by user")
    except Exception as e:
        logger.error(f"Server error: {e}")
        raise


if __name__ == "__main__":
    main()
Step 5: Testing the Server
Create a test script test_server.py:
import asyncio


async def test_weather_server():
    """Test the weather MCP server functionality"""
    print("Testing Weather MCP Server...")
    print("=" * 50)

    # Test current weather and forecast requests
    test_requests = [
        {
            "type": "current",
            "city": "London",
            "description": "Current weather in London"
        },
        {
            "type": "current",
            "city": "New York",
            "description": "Current weather in New York"
        },
        {
            "type": "forecast",
            "city": "Tokyo",
            "description": "Weather forecast for Tokyo"
        }
    ]

    for test in test_requests:
        print(f"\n🌤️ {test['description']}")
        print("-" * 30)

        # Simulate MCP tool calls; a real client would send these over the protocol
        if test["type"] == "current":
            print("Tool: get_current_weather")
        else:
            print("Tool: get_weather_forecast")
        print(f"Arguments: {{'city': '{test['city']}'}}")
        print("✅ Tool call would be sent to server")


if __name__ == "__main__":
    asyncio.run(test_weather_server())
Run the test:
python test_server.py
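The script above only simulates tool calls. To exercise the server for real, you can speak its wire protocol directly: the stdio transport exchanges newline-delimited JSON-RPC 2.0 messages over the child process's stdin/stdout, and an `initialize` handshake must precede any tool call. A sketch — the `frame` helper is ours, and the `protocolVersion` value is an assumption:

```python
import json

def frame(method: str, params: dict, msg_id: int) -> str:
    """Encode one JSON-RPC 2.0 request as a newline-terminated line,
    matching the stdio transport's line-based framing."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": msg_id,
        "method": method,
        "params": params,
    }) + "\n"

# The handshake and a discovery call you would write to the server's stdin:
init = frame("initialize", {
    "protocolVersion": "2024-11-05",   # assumed protocol revision
    "capabilities": {},
    "clientInfo": {"name": "test-client", "version": "0.1"},
}, 1)
list_tools = frame("tools/list", {}, 2)

# To run against the real server (requires the Step 4 code and an API key):
# proc = subprocess.Popen(["python", "src/weather_server.py"],
#                         stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
# proc.stdin.write(init); proc.stdin.flush()
# print(proc.stdout.readline())
```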
Step 6: Integrating with Ollama
Now let's connect our MCP server to Ollama. Create ollama_integration.py:
import asyncio
import json
import re
import subprocess
import sys
from typing import Any, Dict


class OllamaMCPIntegration:
    def __init__(self, mcp_server_path: str):
        self.mcp_server_path = mcp_server_path
        self.mcp_process = None

    async def start_mcp_server(self):
        """Start the MCP server as a subprocess"""
        try:
            self.mcp_process = subprocess.Popen(
                [sys.executable, self.mcp_server_path],
                stdin=subprocess.PIPE,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE,
                text=True
            )
            print("✅ MCP Weather Server started")
        except Exception as e:
            print(f"❌ Failed to start MCP server: {e}")

    async def send_mcp_request(self, tool_name: str, arguments: Dict[str, Any]) -> str:
        """Send a request to the MCP server"""
        if not self.mcp_process:
            return "Error: MCP server not running"

        # Format MCP request (simplified for demo)
        request = {
            "jsonrpc": "2.0",
            "id": 1,
            "method": "tools/call",
            "params": {
                "name": tool_name,
                "arguments": arguments
            }
        }

        try:
            # Send request to MCP server
            self.mcp_process.stdin.write(json.dumps(request) + "\n")
            self.mcp_process.stdin.flush()

            # Read response (simplified)
            response = self.mcp_process.stdout.readline()
            return response.strip()
        except Exception as e:
            return f"Error communicating with MCP server: {e}"

    def stop_mcp_server(self):
        """Stop the MCP server"""
        if self.mcp_process:
            self.mcp_process.terminate()
            print("🛑 MCP Weather Server stopped")


class OllamaWithMCP:
    def __init__(self):
        self.mcp_integration = OllamaMCPIntegration("src/weather_server.py")

    async def setup(self):
        """Setup Ollama with MCP integration"""
        await self.mcp_integration.start_mcp_server()

    async def enhanced_chat(self, user_message: str) -> str:
        """Chat with Ollama, using MCP tools when needed"""
        # Simple keyword detection for weather requests
        weather_keywords = ["weather", "temperature", "forecast", "rain", "sunny", "cloudy"]

        if any(keyword in user_message.lower() for keyword in weather_keywords):
            # Extract city from message (simplified)
            city = self._extract_city(user_message)

            if city:
                # Determine if user wants current weather or forecast
                if "forecast" in user_message.lower():
                    weather_data = await self.mcp_integration.send_mcp_request(
                        "get_weather_forecast",
                        {"city": city}
                    )
                else:
                    weather_data = await self.mcp_integration.send_mcp_request(
                        "get_current_weather",
                        {"city": city}
                    )

                # Combine weather data with Ollama response
                enhanced_prompt = f"""
User asked: {user_message}

I have retrieved the following weather data:
{weather_data}

Please provide a helpful response based on this weather information.
"""
                # Call Ollama with enhanced prompt
                ollama_response = await self._call_ollama(enhanced_prompt)
                return ollama_response

        # Regular chat without MCP tools
        return await self._call_ollama(user_message)

    def _extract_city(self, message: str) -> str:
        """Extract city name from user message (simplified implementation)"""
        # In a real implementation, you'd use NLP or more sophisticated parsing
        common_cities = [
            "london", "new york", "tokyo", "paris", "berlin", "sydney",
            "mumbai", "delhi", "bangalore", "chennai", "hyderabad",
            "san francisco", "los angeles", "chicago", "toronto", "vancouver"
        ]

        message_lower = message.lower()
        for city in common_cities:
            if city in message_lower:
                return city.title()

        # Try to extract "in [city]" pattern
        match = re.search(r'\bin\s+([A-Za-z\s]+)', message)
        if match:
            return match.group(1).strip().title()

        return "London"  # Default fallback

    async def _call_ollama(self, prompt: str) -> str:
        """Call Ollama with the given prompt"""
        try:
            # Use Ollama API or subprocess call
            process = subprocess.run([
                "ollama", "run", "llama3.1:8b"
            ], input=prompt, text=True, capture_output=True, timeout=30)

            if process.returncode == 0:
                return process.stdout.strip()
            else:
                return f"Error calling Ollama: {process.stderr}"
        except subprocess.TimeoutExpired:
            return "Error: Ollama request timed out"
        except Exception as e:
            return f"Error: {e}"

    async def cleanup(self):
        """Clean up resources"""
        self.mcp_integration.stop_mcp_server()


# Example usage
async def main():
    chat = OllamaWithMCP()
    await chat.setup()

    print("🤖 Ollama + MCP Weather Server Integration Ready!")
    print("Ask me about weather in different cities...")
    print("Type 'quit' to exit\n")

    try:
        while True:
            user_input = input("You: ")
            if user_input.lower() in ['quit', 'exit', 'bye']:
                break
            response = await chat.enhanced_chat(user_input)
            print(f"AI: {response}\n")
    except KeyboardInterrupt:
        print("\nGoodbye!")
    finally:
        await chat.cleanup()


if __name__ == "__main__":
    asyncio.run(main())
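`_call_ollama` above shells out to the CLI on every request. An alternative is Ollama's local HTTP API, which avoids per-call process startup; a sketch assuming Ollama is serving on its default port 11434 (the helper names are ours):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_generate_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's /api/generate endpoint, non-streaming."""
    return {"model": model, "prompt": prompt, "stream": False}

def call_ollama_http(prompt: str, model: str = "llama3.1:8b", timeout: int = 60) -> str:
    """POST the prompt to a locally running Ollama and return its reply text."""
    body = json.dumps(build_generate_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())["response"]
```

Swapping this in for `_call_ollama` also removes the 30-second subprocess timeout ceiling, which larger models can easily exceed.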
Step 7: Advanced Features
Let's add some advanced features to make our server more robust:
Create src/advanced_features.py:
import json
import logging
import sqlite3
import time
from datetime import datetime, timedelta
from typing import Any, Dict, List, Optional

from weather_server import WeatherMCPServer  # the server from Step 4 (same src/ directory)

logger = logging.getLogger(__name__)


class WeatherCache:
    """Simple caching system for weather data"""

    def __init__(self, cache_duration: int = 600):  # 10 minutes default
        self.cache_duration = cache_duration
        self.cache = {}

    def _cache_key(self, city: str, data_type: str) -> str:
        return f"{city.lower()}:{data_type}"

    def get(self, city: str, data_type: str) -> Optional[Dict[str, Any]]:
        """Get cached data if it exists and is not expired"""
        key = self._cache_key(city, data_type)
        if key in self.cache:
            data, timestamp = self.cache[key]
            if time.time() - timestamp < self.cache_duration:
                return data
            else:
                # Remove expired data
                del self.cache[key]
        return None

    def set(self, city: str, data_type: str, data: Dict[str, Any]):
        """Cache weather data"""
        key = self._cache_key(city, data_type)
        self.cache[key] = (data, time.time())

    def clear(self):
        """Clear all cached data"""
        self.cache.clear()


class WeatherHistory:
    """Store weather data history in SQLite database"""

    def __init__(self, db_path: str = "weather_history.db"):
        self.db_path = db_path
        self._init_database()

    def _init_database(self):
        """Initialize SQLite database"""
        conn = sqlite3.connect(self.db_path)
        cursor = conn.cursor()
        cursor.execute("""
            CREATE TABLE IF NOT EXISTS weather_requests (
                id INTEGER PRIMARY KEY AUTOINCREMENT,
                city TEXT NOT NULL,
                request_type TEXT NOT NULL,
                data TEXT NOT NULL,
                timestamp DATETIME DEFAULT CURRENT_TIMESTAMP
            )
        """)
        conn.commit()
        conn.close()

    def store_request(self, city: str, request_type: str, data: Dict[str, Any]):
        """Store weather request and response"""
        conn = sqlite3.connect(self.db_path)
        cursor = conn.cursor()
        cursor.execute("""
            INSERT INTO weather_requests (city, request_type, data)
            VALUES (?, ?, ?)
        """, (city, request_type, json.dumps(data)))
        conn.commit()
        conn.close()

    def get_recent_requests(self, city: str, hours: int = 24) -> List[Dict[str, Any]]:
        """Get recent weather requests for a city"""
        conn = sqlite3.connect(self.db_path)
        cursor = conn.cursor()
        since_time = datetime.now() - timedelta(hours=hours)
        cursor.execute("""
            SELECT request_type, data, timestamp
            FROM weather_requests
            WHERE city = ? AND timestamp > ?
            ORDER BY timestamp DESC
        """, (city, since_time))

        results = []
        for row in cursor.fetchall():
            results.append({
                "request_type": row[0],
                "data": json.loads(row[1]),
                "timestamp": row[2]
            })
        conn.close()
        return results


class EnhancedWeatherServer(WeatherMCPServer):
    """Enhanced weather server with caching and history"""

    def __init__(self, config_path: str = "config.yaml"):
        super().__init__(config_path)
        self.cache = WeatherCache()
        self.history = WeatherHistory()

    async def _get_current_weather(self, city: str, units: str = "metric") -> Dict[str, Any]:
        """Enhanced current weather with caching"""
        # Check cache first
        cached_data = self.cache.get(city, "current")
        if cached_data:
            logger.info(f"Returning cached weather data for {city}")
            return cached_data

        # Fetch fresh data
        data = await super()._get_current_weather(city, units)

        # Cache the result and store in history
        self.cache.set(city, "current", data)
        self.history.store_request(city, "current", data)
        return data

    async def _get_weather_forecast(self, city: str, units: str = "metric") -> Dict[str, Any]:
        """Enhanced forecast with caching"""
        # Check cache first
        cached_data = self.cache.get(city, "forecast")
        if cached_data:
            logger.info(f"Returning cached forecast data for {city}")
            return cached_data

        # Fetch fresh data
        data = await super()._get_weather_forecast(city, units)

        # Cache the result and store in history
        self.cache.set(city, "forecast", data)
        self.history.store_request(city, "forecast", data)
        return data
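The cache's behaviour is worth seeing in isolation: a fresh entry is returned, an expired one is evicted on read. A pared-down sketch of the same logic with an injectable clock, so expiry can be demonstrated without sleeping for ten minutes:

```python
import time

class TTLCache:
    """Mini version of the WeatherCache idea; the injectable `clock`
    parameter is our addition, used here only for the demonstration."""

    def __init__(self, ttl: float, clock=time.time):
        self.ttl = ttl
        self.clock = clock
        self._store = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, stamp = entry
        if self.clock() - stamp >= self.ttl:
            del self._store[key]   # evict expired data on read
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, self.clock())

# Drive the cache with a fake clock: a hit while fresh, a miss after expiry.
now = [0.0]
cache = TTLCache(ttl=600, clock=lambda: now[0])
cache.set("london:current", {"temp": 18})
hit = cache.get("london:current")    # fresh entry is returned
now[0] += 601                        # advance past the 10-minute TTL
miss = cache.get("london:current")   # expired entry is evicted, None
```

With a 10-minute TTL, repeated weather questions in one conversation cost a single OpenWeatherMap call, which matters on the free tier's rate limits.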
Step 8: Production Configuration
Create production_config.yaml:
weather:
  api_key: "${OPENWEATHER_API_KEY}"
  base_url: "http://api.openweathermap.org/data/2.5"
  default_units: "metric"
  cache_duration: 600  # 10 minutes
  rate_limit: 60       # requests per minute

server:
  name: "Weather MCP Server"
  version: "1.0.0"
  description: "Production weather server with caching and rate limiting"
  log_level: "INFO"
  max_concurrent_requests: 10

database:
  path: "weather_
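The config declares `rate_limit: 60`, but nothing shown so far enforces it. One common way to honour such a setting is a token bucket; a minimal sketch (class and method names are ours, and the injectable clock exists only so the behaviour can be demonstrated instantly):

```python
import time

class TokenBucket:
    """Minimal token-bucket limiter for a requests-per-minute setting.
    The bucket holds at most `rate_per_minute` tokens and refills
    continuously at that rate."""

    def __init__(self, rate_per_minute: int, clock=time.monotonic):
        self.capacity = float(rate_per_minute)
        self.refill_per_sec = rate_per_minute / 60.0
        self.tokens = self.capacity
        self.clock = clock
        self.last = clock()

    def allow(self) -> bool:
        # Refill proportionally to elapsed time, capped at capacity.
        now = self.clock()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# With a frozen clock, exactly rate_per_minute calls pass before throttling.
t = [0.0]
bucket = TokenBucket(60, clock=lambda: t[0])
allowed = sum(bucket.allow() for _ in range(100))  # 60 allowed, 40 rejected
```

The server's tool handlers would check `bucket.allow()` before each OpenWeatherMap request and return a throttling error otherwise.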