The Memgraph MCP Server acts as a bridge between Memgraph and LLMs, enabling AI-driven database interactions through a lightweight implementation. With this server, you can:
- Run queries: Execute Cypher queries against Memgraph using the `run_query()` tool
- Access schema information: Retrieve database structure details with the `get_schema()` resource (requires `--schema-info-enabled=True`)
- Interact conversationally: Connect LLMs like Claude to interact with your database directly from a chat interface
Important
This repository has been merged into the Memgraph AI Toolkit monorepo to avoid duplicating tools.
It will be deleted in one month, so please follow the MCP integration there for all future development, and feel free to open issues or PRs in that repo.
🚀 Memgraph MCP Server
Memgraph MCP Server is a lightweight server implementation of the Model Context Protocol (MCP) designed to connect Memgraph with LLMs.
⚡ Quick start
1. Run Memgraph MCP Server
- Install `uv` and create a virtual environment with `uv venv`. Activate it with `source .venv/bin/activate` (macOS/Linux) or `.venv\Scripts\activate` (Windows).
- Install dependencies: `uv add "mcp[cli]" httpx`
- Run the Memgraph MCP server: `uv run server.py` (the full command sequence is sketched below).
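Taken together, the setup looks roughly like this on macOS/Linux; this is a sketch, and the `uv` install command shown is just one of its documented install options (Windows uses different shell syntax):

```bash
# Install uv (one of several documented install methods)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Create and activate a virtual environment
uv venv
source .venv/bin/activate

# Install dependencies and start the MCP server
uv add "mcp[cli]" httpx
uv run server.py
```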
2. Run MCP Client
- Install Claude for Desktop.
- Add the Memgraph server to the Claude Desktop config file:
  - MacOS/Linux: `~/Library/Application Support/Claude/claude_desktop_config.json`
  - Windows: `%APPDATA%\Claude\claude_desktop_config.json`
Example config:
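A minimal sketch, assuming the repository was cloned to `/path/to/mcp-memgraph` and that `uv` is on your PATH (both placeholders; the server name `mcp-memgraph` is just an arbitrary label):

```json
{
  "mcpServers": {
    "mcp-memgraph": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-memgraph",
        "run",
        "server.py"
      ]
    }
  }
}
```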
Note
You may need to put the full path to the `uv` executable in the `command` field. You can get this by running `which uv` on MacOS/Linux or `where uv` on Windows. Make sure you pass in the absolute path to your server.
3. Chat with the database
- Run Memgraph MAGE (see the Docker sketch after this list). The `--schema-info-enabled` configuration setting is set to `True` to allow the LLM to run the `SHOW SCHEMA INFO` query.
- Open Claude Desktop and see the Memgraph tools and resources listed. Try it out! (You can load dummy data from Memgraph Lab Datasets.)
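One way to start Memgraph MAGE with the flag enabled is via Docker; the port mapping and image tag below are assumptions to adjust for your setup:

```bash
docker run -p 7687:7687 memgraph/memgraph-mage --schema-info-enabled=True
```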
🔧 Tools
run_query()
Run a Cypher query against Memgraph.
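As a rough illustration of how such a tool can be exposed, here is a minimal sketch using the Python MCP SDK's FastMCP helper and the Bolt-compatible `neo4j` driver; the connection URI, empty credentials, and return format are assumptions, not necessarily the repository's exact implementation:

```python
from mcp.server.fastmcp import FastMCP
from neo4j import GraphDatabase  # Memgraph speaks the Bolt protocol, so the neo4j driver works

mcp = FastMCP("mcp-memgraph")

# Connection details are assumptions; adjust the URI and credentials to your deployment.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("", ""))

@mcp.tool()
def run_query(query: str) -> list[dict]:
    """Run a Cypher query against Memgraph and return the records as dictionaries."""
    with driver.session() as session:
        result = session.run(query)
        return [record.data() for record in result]

if __name__ == "__main__":
    mcp.run(transport="stdio")
```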
🗃️ Resources
get_schema()
Get Memgraph schema information (prerequisite: `--schema-info-enabled=True`).
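Continuing the sketch above (it reuses the `mcp` and `driver` objects), a resource along these lines could expose the schema; the resource URI is an assumption rather than the repository's actual choice:

```python
@mcp.resource("memgraph://schema")
def get_schema() -> str:
    """Return schema information from Memgraph (requires --schema-info-enabled=True)."""
    with driver.session() as session:
        result = session.run("SHOW SCHEMA INFO")
        # Stringify the returned rows so the LLM can read them as plain text.
        return "\n".join(str(record.data()) for record in result)
```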
🗺️ Roadmap
The Memgraph MCP Server is just at its beginnings. We're actively working on expanding its capabilities and making it even easier to integrate Memgraph into modern AI workflows. In the near future, we'll be releasing a TypeScript version of the server to better support JavaScript-based environments. Additionally, we plan to migrate this project into our central AI Toolkit repository, where it will live alongside other tools and integrations for LangChain, LlamaIndex, and MCP. Our goal is to provide a unified, open-source toolkit that makes it seamless to build graph-powered applications and intelligent agents with Memgraph at the core.