Configure Zep server settings, authentication, and LLM options

You are viewing the Zep Open Source v0.x documentation. This version is no longer supported, and documentation is provided as a reference only.

The current documentation for Zep Community Edition is available here.

Zep is configured via a YAML configuration file and/or environment variables. The Zep server accepts a `--config` CLI argument to specify the location of the config file. If no config file is specified, the server looks for a `config.yaml` file in the current working directory.

```bash
zep --config /path/to/config.yaml
```

Warning: Your OpenAI/Anthropic API key and Auth Secret should not be set in the config file. Instead, set the environment variables described below. These can also be configured in a `.env` file in the current working directory.
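
For example, a minimal `.env` file in the working directory might look like the sketch below. `ZEP_OPENAI_API_KEY` is an assumed variable name for the OpenAI key and is not listed in the tables on this page; substitute the variable appropriate to your LLM provider.

```bash
# .env — loaded from the current working directory at startup
# ZEP_OPENAI_API_KEY is an assumed name; use the variable for your provider.
ZEP_OPENAI_API_KEY=your-openai-api-key
ZEP_AUTH_SECRET=a-long-random-secret
```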

Zep Server

The Zep server can be configured via environment variables, a `.env` file, or the `config.yaml` file. The following tables list the available configuration options.

Server Config

| Config Key | Environment Variable | Default |
| --- | --- | --- |
| `store.type` | `ZEP_STORE_TYPE` | `postgres` |
| `store.postgres.dsn` | `ZEP_STORE_POSTGRES_DSN` | Installation dependent. |
| `server.host` | `ZEP_SERVER_HOST` | `0.0.0.0` |
| `server.port` | `ZEP_SERVER_PORT` | `8000` |
| `server.web_enabled` | `ZEP_SERVER_WEB_ENABLED` | `true` |
| `server.max_request_size` | `ZEP_SERVER_MAX_REQUEST_SIZE` | `5242880` |
| `nlp.server_url` | `ZEP_NLP_SERVER_URL` | Installation dependent. |
| `opentelemetry.enabled` | `ZEP_OPENTELEMETRY_ENABLED` | `false` |
| `development` | `ZEP_DEVELOPMENT` | `false` |
| `log.level` | `ZEP_LOG_LEVEL` | `info` |
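
As a minimal sketch, the dotted config keys above nest as YAML sections in `config.yaml`. The DSN shown is an illustrative placeholder, since the default is installation dependent:

```yaml
# config.yaml — illustrative values only
store:
  type: postgres
  postgres:
    dsn: postgres://postgres:postgres@localhost:5432/postgres?sslmode=disable
server:
  host: 0.0.0.0
  port: 8000
  web_enabled: true
log:
  level: info
```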

Authentication Config

Please see the Authentication documentation for more information on configuring authentication.

| Config Key | Environment Variable | Default |
| --- | --- | --- |
| `auth.required` | `ZEP_AUTH_REQUIRED` | `false` |
| `auth.secret` | `ZEP_AUTH_SECRET` | `do-not-use-this-secret-in-production` |
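
A minimal sketch of enabling authentication in `config.yaml`, assuming the same nesting convention as above (in production, set the secret via `ZEP_AUTH_SECRET` rather than in the file, per the warning at the top of this page):

```yaml
# config.yaml — enable authentication (placeholder secret; do not use as-is)
auth:
  required: true
  secret: replace-with-a-long-random-string
```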

LLMs

See the LLM Configuration documentation for more information on configuring LLMs.

Note: Anthropic does not support embeddings. If configuring Zep to use the Anthropic LLM service, you must configure Zep to use the local embeddings service.

| Config Key | Environment Variable | Default |
| --- | --- | --- |
| `llm.service` | `ZEP_LLM_SERVICE` | `openai` |
| `llm.model` | `ZEP_LLM_MODEL` | `gpt-3.5-turbo` |
| `llm.azure_openai_endpoint` | `ZEP_LLM_AZURE_OPENAI_ENDPOINT` | undefined |
| `llm.openai_endpoint` | `ZEP_LLM_OPENAI_ENDPOINT` | undefined |
| `llm.openai_org_id` | `ZEP_LLM_OPENAI_ORG_ID` | undefined |
| `llm.azure_openai` | `ZEP_LLM_AZURE_OPENAI` | undefined |
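
As an illustration of the note above, a `config.yaml` sketch pairing the Anthropic LLM service with the local embeddings service might look like the following; the key layout is assumed from the tables on this page:

```yaml
# config.yaml — Anthropic LLM with local embeddings (illustrative)
llm:
  service: anthropic
  model: claude-2
extractors:
  messages:
    embeddings:
      service: local
  documents:
    embeddings:
      service: local
```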

Enrichment and Extraction

| Config Key | Environment Variable | Default |
| --- | --- | --- |
| `memory.message_window` | `ZEP_MEMORY_MESSAGE_WINDOW` | `12` |
| `extractors.documents.embeddings.enabled` | `ZEP_EXTRACTORS_DOCUMENTS_EMBEDDINGS_ENABLED` | `true` |
| `extractors.documents.embeddings.dimensions` | `ZEP_EXTRACTORS_DOCUMENTS_EMBEDDINGS_DIMENSIONS` | `384` |
| `extractors.documents.embeddings.service` | `ZEP_EXTRACTORS_DOCUMENTS_EMBEDDINGS_SERVICE` | `local` |
| `extractors.documents.embeddings.chunk_size` | `ZEP_EXTRACTORS_DOCUMENTS_EMBEDDINGS_CHUNK_SIZE` | `1000` |
| `extractors.messages.summarizer.enabled` | `ZEP_EXTRACTORS_MESSAGES_SUMMARIZER_ENABLED` | `true` |
| `extractors.messages.entities.enabled` | `ZEP_EXTRACTORS_MESSAGES_ENTITIES_ENABLED` | `true` |
| `extractors.messages.intent.enabled` | `ZEP_EXTRACTORS_MESSAGES_INTENT_ENABLED` | `false` |
| `extractors.messages.embeddings.enabled` | `ZEP_EXTRACTORS_MESSAGES_EMBEDDINGS_ENABLED` | `true` |
| `extractors.messages.embeddings.dimensions` | `ZEP_EXTRACTORS_MESSAGES_EMBEDDINGS_DIMENSIONS` | `384` |
| `extractors.messages.embeddings.service` | `ZEP_EXTRACTORS_MESSAGES_EMBEDDINGS_SERVICE` | `local` |
| `extractors.messages.summarizer.embeddings.enabled` | `ZEP_EXTRACTORS_MESSAGES_SUMMARIZER_EMBEDDINGS_ENABLED` | `true` |
| `extractors.messages.summarizer.embeddings.dimensions` | `ZEP_EXTRACTORS_MESSAGES_SUMMARIZER_EMBEDDINGS_DIMENSIONS` | `384` |
| `extractors.messages.summarizer.embeddings.service` | `ZEP_EXTRACTORS_MESSAGES_SUMMARIZER_EMBEDDINGS_SERVICE` | `local` |
| `custom_prompts.summarizer_prompts.openai` | `ZEP_CUSTOM_PROMPTS_SUMMARIZER_PROMPTS_OPENAI` | See Zep's source code for details |
| `custom_prompts.summarizer_prompts.anthropic` | `ZEP_CUSTOM_PROMPTS_SUMMARIZER_PROMPTS_ANTHROPIC` | See Zep's source code for details |
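
A sketch of tuning enrichment and extraction in `config.yaml`, again assuming the dotted keys above nest as YAML sections; values shown are the documented defaults except the message window:

```yaml
# config.yaml — enrichment and extraction settings (illustrative)
memory:
  message_window: 24
extractors:
  messages:
    summarizer:
      enabled: true
    entities:
      enabled: true
    intent:
      enabled: false
  documents:
    embeddings:
      enabled: true
      chunk_size: 1000
```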

Data Management

See the Data Management documentation for more information on configuring data management.

| Config Key | Environment Variable | Default |
| --- | --- | --- |
| `data.purge_every` | `ZEP_DATA_PURGE_EVERY` | `60` |
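
For example, a `config.yaml` sketch that purges soft-deleted data less frequently; the value is assumed here to be in minutes, so confirm the unit against the Data Management documentation:

```yaml
# config.yaml — purge interval (unit assumed to be minutes)
data:
  purge_every: 360
```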

Valid LLM Models

The following table lists the valid LLM models for the `llm.model` configuration option.

| Provider | Model |
| --- | --- |
| OpenAI | `gpt-3.5-turbo` |
| OpenAI | `gpt-3.5-turbo-16k` |
| OpenAI | `gpt-4` |
| OpenAI | `gpt-4-32k` |
| Anthropic | `claude-instant-1` |
| Anthropic | `claude-2` |

Zep NLP Server

The Zep NLP Server may be configured via a .env file or environment variables. The following table lists the available configuration options. Note that the NLP server’s container is not shipped with CUDA nor configured to use GPU acceleration.

| Config Key | Environment Variable | Default |
| --- | --- | --- |
| `log_level` | `ZEP_LOG_LEVEL` | `info` |
| `server.port` | `ZEP_SERVER_PORT` | `5557` |
| `embeddings.device` | `ZEP_EMBEDDINGS_DEVICE` | `cpu` |
| `embeddings.messages.enabled` | `ZEP_EMBEDDINGS_MESSAGES_ENABLED` | `true` |
| `embeddings.messages.model` | `ZEP_EMBEDDINGS_MESSAGES_MODEL` | `all-MiniLM-L6-v2` |
| `embeddings.documents.enabled` | `ZEP_EMBEDDINGS_DOCUMENTS_ENABLED` | `true` |
| `embeddings.documents.model` | `ZEP_EMBEDDINGS_DOCUMENTS_MODEL` | `all-MiniLM-L6-v2` |
| `nlp.spacy_model` | `ZEP_NLP_SPACY_MODEL` | `en_core_web_sm` |
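
A sketch of a `.env` file for the NLP server, setting a few of the variables above (only the log level differs from the documented defaults):

```bash
# .env for the Zep NLP server — illustrative
ZEP_LOG_LEVEL=debug
ZEP_SERVER_PORT=5557
ZEP_EMBEDDINGS_DEVICE=cpu
ZEP_NLP_SPACY_MODEL=en_core_web_sm
```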