Quick Start
Get started with Zep Community Edition
Looking for a managed Zep service? Check out Zep Cloud.
- No need to run Neo4j, Postgres, or other dependencies.
- Additional features for startups and enterprises alike.
- Fast and scalable.
Learn more about the differences between Zep Community Edition and Zep Cloud.
Starting a Zep server locally is simple.
- Clone the Zep repo
- Configure your Zep server by editing the `zep.yaml` file.
If you’d like to use an environment variable as the value for any of the configuration options, you can use a template string to insert the value. For example, if you wanted to use an environment variable to set the Postgres password, you could do the following:
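A sketch of what this might look like (the exact key path in your `zep.yaml` may differ by version, and `ZEP_POSTGRES_PASSWORD` is a hypothetical variable name):

```yaml
postgres:
  user: postgres
  password: ${ZEP_POSTGRES_PASSWORD}  # substituted from the environment at startup
  host: localhost
  port: 5432
```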
You can name your environment variable whatever you want.
- Start the Zep server:
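From the repository root, this is typically done with Docker Compose (the file name `docker-compose.ce.yaml` is referenced later in this guide; confirm it matches your checkout):

```shell
docker compose -f docker-compose.ce.yaml up
```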
Make sure to set the `secret` value in the `zep.yaml` configuration file.
Additionally, make sure that you expose an `OPENAI_API_KEY` environment variable, either in a local `.env` file or by running:
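A shell command along these lines (substitute your own key; the placeholder is illustrative):

```shell
export OPENAI_API_KEY=<your-openai-api-key>
```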
This will start a Zep API server on port `8000` and the Graphiti service on port `8003`.
- Get started with the Zep Community Edition SDKs!
The Zep Community Edition SDKs are API compatible with the Zep Cloud SDKs. The Zep Guides and API reference note where functionality may differ.
Next Steps:
- Install the Zep Community Edition SDKs
- Read the Zep Service Guides
- Explore the Zep API Reference
Using LLM Providers other than OpenAI
Zep Community Edition can be used with any LLM provider that implements the OpenAI API. There are two approaches to configure this:
1. Direct Configuration (Recommended)
Set the following environment variables for the Graphiti service in your `docker-compose.ce.yaml` file:
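As a sketch, pointing Graphiti at a local Ollama instance might look like this (the service name, variable names, base URL, and model are assumptions; check your compose file and provider docs for the exact values):

```yaml
services:
  graphiti:
    environment:
      - OPENAI_API_KEY=ollama            # placeholder; many local servers ignore the key
      - OPENAI_BASE_URL=http://host.docker.internal:11434/v1
      - MODEL_NAME=llama3.1
```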
This approach works with any OpenAI API-compatible provider and local LLM inference servers, such as Ollama.
2. Using a Proxy (Alternative)
You can also use a proxy service like LiteLLM that provides an OpenAI compatible API for non-OpenAI compatible LLM providers. LiteLLM supports proxying both LLM and Embedding requests.
Set the `OPENAI_API_KEY`, `MODEL_NAME`, and `OPENAI_BASE_URL` environment variables to point to your LLM provider. This may be done in a `.env` file or directly in the `docker-compose.ce.yaml` file.
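For instance, a `.env` file pointing at a LiteLLM proxy might look like the following (the port, key, and model name are hypothetical; use the values from your own proxy configuration):

```shell
# Hypothetical .env for a LiteLLM proxy listening on localhost:4000
OPENAI_API_KEY=<your-litellm-proxy-key>
OPENAI_BASE_URL=http://localhost:4000/v1
MODEL_NAME=gpt-4o-mini
```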