Knowledge Graph MCP Server
The Graphiti MCP Server is an experimental implementation that exposes Graphiti’s key functionality through the Model Context Protocol (MCP). This enables AI assistants like Claude Desktop, Cursor, and VS Code with Copilot to interact with Graphiti’s knowledge graph capabilities, providing persistent memory and contextual awareness.
The Graphiti MCP Server bridges AI assistants with Graphiti’s temporally-aware knowledge graphs, allowing assistants to maintain persistent memory across conversations and sessions. Unlike traditional RAG methods, it continuously integrates user interactions, structured and unstructured data, and external information into a coherent, queryable graph.
Key Features
The MCP server exposes Graphiti’s core capabilities:
- Episode Management: Add, retrieve, and delete episodes (text, messages, or JSON data)
- Entity Management: Search and manage entity nodes and relationships
- Search Capabilities: Semantic and hybrid search for facts and node summaries
- Group Management: Organize data with group_id filtering for multi-user scenarios
- Graph Maintenance: Clear graphs and rebuild indices as needed
- Pre-configured Entity Types: Structured entity extraction for domain-specific use cases
- Multiple Database Support: FalkorDB (Redis-based, default) and Neo4j
- Flexible LLM Providers: OpenAI, Anthropic, Gemini, Groq, and Azure OpenAI
- Multiple Embedding Options: OpenAI, Voyage, Sentence Transformers, and Gemini
Quick Start
This quick start uses OpenAI and FalkorDB (default). The server supports multiple LLM providers (OpenAI, Anthropic, Gemini, Groq, Azure OpenAI) and databases (FalkorDB, Neo4j). For detailed configuration options, see the MCP Server README.
Prerequisites
Before getting started, ensure you have:
- Python 3.10+ installed on your system
- Database - Either FalkorDB (default, Redis-based) or Neo4j (5.26+) running locally or accessible remotely
- LLM API key - For OpenAI, Anthropic, Gemini, Groq, or Azure OpenAI
Installation
- Clone the Graphiti repository:
- Navigate to the MCP server directory and install dependencies:
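The two steps above look roughly like the following. The repository URL is the public Graphiti repo; the `mcp_server` directory name and the use of `uv` for dependency management are assumptions based on the project layout, so check the README if your checkout differs:

```shell
# Step 1: clone the Graphiti repository
git clone https://github.com/getzep/graphiti.git

# Step 2: enter the MCP server directory and install dependencies
cd graphiti/mcp_server
uv sync
```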
Configuration
Configuration follows a precedence hierarchy: command-line arguments override environment variables, which override config.yaml settings.
Set up your environment variables in a .env file:
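A minimal `.env` might look like this. The variable names below are illustrative and follow common Graphiti conventions; confirm the exact names against the MCP Server README for your release:

```
# .env — example values only
OPENAI_API_KEY=your-openai-api-key     # LLM and embeddings (OpenAI is the default provider)

# FalkorDB (default database, Redis-based) — assumed variable name
FALKORDB_URI=redis://localhost:6379

# Or, for Neo4j instead:
# NEO4J_URI=bolt://localhost:7687
# NEO4J_USER=neo4j
# NEO4J_PASSWORD=your-password
```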
Running the Server
Start the MCP server:
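Assuming the entry point is `graphiti_mcp_server.py` in the `mcp_server` directory (check the README for the exact script name in your release):

```shell
uv run graphiti_mcp_server.py
```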
For development with custom options:
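For example, the flag names below (`--transport`, `--group-id`, `--model`) are assumptions based on the options the README documents; verify them with `--help`:

```shell
uv run graphiti_mcp_server.py --transport sse --group-id my-project --model gpt-4.1-mini
```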
MCP Client Integration
The MCP server supports integration with multiple AI assistants through different transport protocols.
Claude Desktop
Configure Claude Desktop to connect via the stdio transport:
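A sketch of a `claude_desktop_config.json` entry, assuming a `uv`-based checkout; the path and environment variable are placeholders you should replace with your own values:

```json
{
  "mcpServers": {
    "graphiti": {
      "command": "uv",
      "args": [
        "run",
        "/path/to/graphiti/mcp_server/graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```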
Cursor IDE
For Cursor, use the SSE transport configuration:
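A minimal Cursor `mcp.json` entry might look like the following; the port and `/sse` path are assumptions about the server's default SSE endpoint, so match them to the address your server actually prints on startup:

```json
{
  "mcpServers": {
    "graphiti": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```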
VS Code with Copilot
VS Code with Copilot can connect to the MCP server using HTTP endpoints. Configure your VS Code settings to point to the running MCP server.
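As a sketch, a workspace-level `.vscode/mcp.json` could register the server over HTTP; the endpoint URL and transport type here are assumptions to adapt to your running server:

```json
{
  "servers": {
    "graphiti": {
      "type": "http",
      "url": "http://localhost:8000/mcp"
    }
  }
}
```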
Available Tools
Once connected, AI assistants have access to these Graphiti tools:
- `add_memory` - Store episodes and interactions in the knowledge graph
- `search_facts` - Find relevant facts and relationships
- `search_nodes` - Search for entity summaries and information
- `get_episodes` - Retrieve recent episodes for context
- `delete_episode` - Remove episodes from the graph
- `clear_graph` - Reset the knowledge graph entirely
Docker Deployment
For containerized deployment, use the provided Docker Compose setup:
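From the `mcp_server` directory (assuming the repository ships its compose file there; file names may differ by release):

```shell
docker compose up -d
```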
This starts both the database (FalkorDB or Neo4j) and the MCP server with SSE transport enabled. The Compose setup can run the database and server together in one stack or as separate services, with sensible defaults so it works out of the box.
Performance and Privacy
Performance Tuning
Episode processing uses asynchronous queuing with concurrency controlled by SEMAPHORE_LIMIT. The MCP server README provides tier-specific guidelines for major LLM providers to prevent rate-limiting while maximizing throughput.
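For example, `SEMAPHORE_LIMIT` can be set in your `.env`; the value below is illustrative, and the README's tier-specific guidance should drive the number you choose:

```
# Concurrent episode-processing workers.
# Lower this if you hit provider rate limits; raise it on higher-tier API plans.
SEMAPHORE_LIMIT=10
```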
Telemetry
The framework includes optional anonymous telemetry collection that captures only system information. Telemetry never exposes API keys or graph content. Disable telemetry by setting:
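Assuming the standard Graphiti telemetry variable, add this to your environment or `.env`:

```
GRAPHITI_TELEMETRY_ENABLED=false
```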
Next Steps
For comprehensive configuration options, advanced features, and troubleshooting:
- Full Documentation: See the complete MCP Server README
- Integration Examples: Explore client-specific setup guides for Claude Desktop, Cursor, and VS Code
- Custom Entity Types: Use the pre-configured entity types, or define your own, for domain-specific extraction
- Multi-tenant Setup: Use group IDs for organizing data across different contexts
- Alternative LLM Providers: Configure Anthropic, Gemini, Groq, or Azure OpenAI
- Database Options: Switch between FalkorDB and Neo4j based on your needs
The MCP server is experimental and under active development. Features and APIs may change between releases.