Model Context Protocol

The open standard connecting AI assistants to your data and tools

What is MCP?

Model Context Protocol (MCP) is an open protocol that enables seamless integration between AI assistants and external data sources and tools, giving every developer the ability to build powerful AI integrations.

🔌

Universal Protocol

MCP provides a standardized way for AI models to interact with external tools, APIs, and data sources. No more custom integrations for each AI platform.

🔒

Secure by Design

Built with security in mind, MCP uses authentication, authorization, and sandboxing to ensure your data stays protected while enabling powerful AI capabilities.

Real-time Context

Give AI models access to live data and real-time information from your systems, databases, and APIs, enabling more accurate and contextual responses.

🛠️

Tool Integration

Expose your tools and services as MCP servers, allowing AI assistants to perform actions, execute commands, and interact with your infrastructure.
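
As a sketch of what that looks like in practice, the official MCP Python SDK lets a server declare a tool with a decorator. The example below is illustrative only: the server name and the search_recipes tool are hypothetical stand-ins, not part of the Mealie server.

# Minimal MCP server exposing one tool over stdio (official MCP Python SDK).
# Hypothetical example: the tool is a stub, not a real Mealie tool.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-tools")

@mcp.tool()
def search_recipes(query: str, limit: int = 5) -> list[str]:
    """Return recipe names matching the query (stubbed for illustration)."""
    return [f"{query} recipe #{i}" for i in range(1, limit + 1)]

if __name__ == "__main__":
    # Defaults to the stdio transport, so an MCP client can launch this server as a subprocess.
    mcp.run()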

🌐

Open Standard

MCP is an open protocol supported by major AI platforms including Anthropic's Claude, OpenAI, and others, ensuring wide compatibility and future-proof integrations.

📊

Rich Data Access

Connect AI to databases, file systems, APIs, and more. MCP servers can expose structured data, search capabilities, and complex queries to AI models.
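
Alongside tools, the same SDK can expose read-only data as resources. Again a hedged sketch: the pantry:// URI and its contents are made up for illustration.

# Exposing read-only data as an MCP resource (official MCP Python SDK).
# The pantry:// URI and its contents are hypothetical example data.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-data")

@mcp.resource("pantry://items")
def pantry_items() -> str:
    """Return the current pantry contents as text the model can read."""
    return "flour, eggs, olive oil, canned tomatoes"

if __name__ == "__main__":
    mcp.run()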

Why Use MCP?

MCP transforms how AI assistants interact with your tools and data

🚀

Faster Development

Build once, deploy everywhere. Create MCP servers that work with any compatible AI assistant without platform-specific code.

🎯

Better AI Responses

Give AI models access to your specific data and context, resulting in more accurate, relevant, and actionable responses.

🔄

Bidirectional Communication

Enable AI to not just read data but also perform actions, update systems, and integrate deeply with your workflows.

📈

Scalable Architecture

MCP servers can be deployed as standalone services, supporting multiple clients and scaling independently from your AI applications.

🔐

Enterprise Ready

Built-in support for authentication, rate limiting, and access control makes MCP suitable for enterprise deployments.

🌟

Future Proof

As an open standard backed by industry leaders, MCP ensures your integrations remain compatible with evolving AI technologies.

Getting Started

Connect your MCP servers to Claude Code or OpenAI's API in minutes

Claude Code

Use MCP servers with Claude Desktop and Claude Code CLI

1

Configure Claude Desktop

Add your MCP server to the Claude Desktop configuration file

// ~/Library/Application Support/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "mealie": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "MEALIE_URL=https://mealie.example.com",
        "-e", "MEALIE_USER_TOKEN=your_token",
        "registry.teyssedre.ca/mealie-mcp:latest"
      ]
    }
  }
}
2

Restart Claude Desktop

The MCP server will connect automatically on startup

3

Start Using Tools

Claude can now access all 247 Mealie tools via MCP
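
To check the server outside of Claude, the official MCP Python SDK can launch the same Docker command over stdio and list whatever tools it exposes. A sketch, reusing the placeholder URL and token from step 1:

# List the tools exposed by the Mealie MCP server over stdio.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

params = StdioServerParameters(
    command="docker",
    args=["run", "-i", "--rm",
          "-e", "MEALIE_URL=https://mealie.example.com",
          "-e", "MEALIE_USER_TOKEN=your_token",
          "registry.teyssedre.ca/mealie-mcp:latest"],
)

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print(f"{len(tools.tools)} tools available")

asyncio.run(main())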

OpenAI API

Connect MCP servers via HTTP/SSE transport

1

Start MCP HTTP Server

Deploy your MCP server with HTTP transport enabled

# docker-compose.http.yml
docker-compose -f docker-compose.http.yml up -d

# Server available at:
# http://localhost:8000/sse
2

Configure OpenAI Client

Point your OpenAI integration to the MCP endpoint

MCP_ENDPOINT="http://localhost:8000/sse"
MCP_AUTH_TOKEN="your_bearer_token"
3

Access Tools via API

OpenAI can now discover and use MCP tools via the HTTP endpoint
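
For example, with the OpenAI Python SDK's Responses API and its hosted MCP tool type, the endpoint and token from step 2 can be passed directly. This is a hedged sketch: OpenAI's servers must be able to reach the MCP endpoint, so a publicly reachable URL stands in for localhost in practice, and the model name and prompt are placeholders.

# Call the Mealie MCP server through OpenAI's hosted MCP tool (Responses API).
import os
from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="gpt-4.1",  # placeholder model name
    tools=[{
        "type": "mcp",
        "server_label": "mealie",
        "server_url": os.environ["MCP_ENDPOINT"],  # must be reachable by OpenAI (public URL in practice)
        "headers": {"Authorization": f"Bearer {os.environ['MCP_AUTH_TOKEN']}"},
        "require_approval": "never",
    }],
    input="List the recipes tagged 'dinner' in my Mealie instance.",
)
print(response.output_text)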