Ecosystem

Chainlit

Chainlit is an open-source, async Python framework that allows developers to build scalable conversational AI and agentic applications.

You can follow the Chainlit installation steps on their Getting Started page.
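As a quick setup sketch (the package names `zep-cloud` and `openai` and the entry-point file `app.py` are assumptions, not prescribed by this guide), installing the dependencies and launching the app might look like:

```shell
# Install Chainlit together with the Zep and OpenAI SDKs (package names assumed).
pip install chainlit zep-cloud openai

# Export the API keys that the code below reads from the environment.
export ZEP_API_KEY="your-zep-project-api-key"
export OPENAI_API_KEY="your-openai-api-key"

# Run the Chainlit app (assuming the code lives in app.py).
chainlit run app.py
```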

By integrating Zep into your Chainlit LLM application, you elevate your conversational agent with powerful features like long-term memory and context fusion.

In this guide, we’ll walk you through the steps to build a simple question-and-answer agent using Chainlit, OpenAI, and Zep.

Steps to Use Zep Cloud with Chainlit

  1. Setup Zep Client: Initialize the Zep client within your Chainlit application using your Zep Project API key.

```python
# Import necessary modules from the Zep Python SDK and Chainlit.
import os
import uuid

import chainlit as cl
from openai import AsyncOpenAI
from zep_cloud import ZepClient
from zep_cloud.memory import Memory, Session
from zep_cloud.message import Message
from zep_cloud.user import CreateUserRequest

# Retrieve API keys from environment variables.
ZEP_API_KEY = os.environ.get("ZEP_API_KEY")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY")

# Initialize the OpenAI and Zep clients with their respective API keys.
openai_client = AsyncOpenAI(api_key=OPENAI_API_KEY)
zep = ZepClient(api_key=ZEP_API_KEY)
```
  2. User and Session Management: Leverage the CreateUserRequest and Session models to manage your application’s users and sessions effectively.

```python
@cl.on_chat_start
async def start_chat():
    """Handles the event triggered at the start of a new chat in Chainlit."""
    # Generate unique identifiers for the user and session.
    user_id = str(uuid.uuid4())
    session_id = str(uuid.uuid4())

    # Save the user and session identifiers in the current session context.
    cl.user_session.set("user_id", user_id)
    cl.user_session.set("session_id", session_id)

    # Register a new user in Zep using the generated user ID.
    await zep.user.aadd(CreateUserRequest(user_id=user_id))

    # Start a new session for the user in Zep.
    await zep.memory.aadd_session(Session(user_id=user_id, session_id=session_id))
```
  3. Zep Dialog Tools: Elevate agent knowledge with Chainlit Steps and Zep dialog tools.
    Discover more about Zep’s dialog tools on the Zep Documentation page.

```python
@cl.step(name="session classification", type="tool")
async def classify_session(session_id: str):
    """Classify the dialog with custom instructions."""
    # Define categories for classification.
    classes = [
        "General",
        "Travel",
        "Shopping",
        "Cars",
    ]
    # Use Zep's async dialog classification with a custom instruction.
    classification = await zep.memory.aclassify_session(
        session_id,
        "session_classification",
        classes,
        persist=True,
        instruction=(
            "You are a helpful assistant. Classify the conversation: "
            "0 for General topics, 1 for Travel-related discussions, "
            "2 for Shopping conversations, and 3 for talks about Cars. "
            "For example, a chat about visiting Paris for landmarks and "
            "cuisine should be classified as 1."
        ),
    )
    return classification
```
  4. Message Handling: Store and fetch your Chainlit application’s chat history in the Zep memory store, enriching your LLM’s conversational context.
    Discover more about Zep’s memory store capabilities on the Zep Documentation page.

```python
@cl.step(name="OpenAI", type="llm")
async def call_openai(session_id):
    """Invokes the OpenAI API to generate a response based on the session message history."""
    # Fetch session messages from Zep.
    memory = await zep.message.aget_session_messages(session_id)
    memory_history = [m.to_dict() for m in memory]

    # Prepare the data, excluding fields the chat completion API doesn't accept.
    cleaned_data = [
        {k: v for k, v in item.items() if k not in ("created_at", "role_type", "token_count", "uuid")}
        for item in memory_history
    ]

    # Generate a response from OpenAI using the cleaned session history.
    response = await openai_client.chat.completions.create(
        model="gpt-4",
        temperature=0.1,
        messages=cleaned_data,
    )
    return response.choices[0].message


@cl.on_message
async def on_message(message: cl.Message):
    """Processes each incoming message, calls OpenAI for a response, and updates Zep memory."""
    session_id = cl.user_session.get("session_id")

    # Classify the user message to give the LLM semantic insight into what the request is about.
    classify_sess = await classify_session(session_id)

    # Store the incoming message in Zep's session memory, appending the classified topic.
    await zep.memory.aadd_memory(
        session_id,
        Memory(
            messages=[
                Message(
                    role_type="user",
                    role="user",
                    content=message.content + "\nconversation_topic: " + classify_sess.class_,
                )
            ]
        ),
    )

    # Retrieve a response from the OpenAI model.
    response_message = await call_openai(session_id)

    # Send the generated response back through Chainlit.
    msg = cl.Message(author="Answer", content=response_message.content)
    await msg.send()

    # Update Zep's session memory with the assistant's response for continuity.
    await zep.memory.aadd_memory(
        session_id,
        Memory(
            messages=[
                Message(
                    role_type="assistant",
                    role="assistant",
                    content=response_message.content,
                )
            ]
        ),
    )
```
  5. To access your LLM session data, navigate to the Zep Cloud Console, select a session, and review the associated session data and logs.
Zep Cloud session console example

In conclusion, integrating Zep Cloud with Chainlit empowers developers to create conversational AI applications that are more intelligent, context-aware, and efficient.

For additional examples, check out more use cases.