Chainlit

Chainlit is an open-source, async Python framework that lets developers build scalable conversational AI and agentic applications.

You can follow the Chainlit installation steps on their Getting Started Page.

In this guide, we’ll walk you through the steps to build a simple question-and-answer agent using Chainlit, OpenAI, and Zep.

Steps to Use Zep Cloud with Chainlit

  1. Set Up the Zep Client: Initialize the Zep client within your Chainlit application using your Zep Project API key.
# Import necessary modules from the Zep Python SDK and Chainlit.
from zep_cloud.client import AsyncZep
from zep_cloud.types import Message
import chainlit as cl
import uuid
import os
from openai import AsyncOpenAI

# Retrieve API keys from environment variables.
ZEP_API_KEY = os.environ.get("ZEP_API_KEY")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY")

# Initialize clients for OpenAI and Zep with their respective API keys.
openai_client = AsyncOpenAI(api_key=OPENAI_API_KEY)
zep = AsyncZep(api_key=ZEP_API_KEY)
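Both keys need to be set in the environment before the app starts; as an optional, minimal sketch (not part of the original example), you can fail fast when either is missing:

# Optional guard (illustrative, not in the original example): fail fast if a key is missing.
if not ZEP_API_KEY or not OPENAI_API_KEY:
    raise RuntimeError("Set ZEP_API_KEY and OPENAI_API_KEY in your environment before running the app.")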
  2. User and Session Management:
@cl.on_chat_start
async def start_chat():
    """Handles the event triggered at the start of a new chat through Chainlit."""
    # Generate unique identifiers for the user and session.
    user_id = str(uuid.uuid4())
    session_id = str(uuid.uuid4())

    # Save user and session identifiers in the current session context.
    cl.user_session.set("user_id", user_id)
    cl.user_session.set("session_id", session_id)

    # Register a new user in Zep's system using the generated user ID.
    await zep.user.add(
        user_id=user_id,
        email="user@example.com",  # Optional: add an email or other user details
        first_name="Jane",  # Optional: add a first name
        last_name="Doe",  # Optional: add a last name
        metadata={"foo": "bar"},  # Optional: add metadata
    )

    # Start a new session for the user in Zep.
    await zep.memory.add_session(
        session_id=session_id,
        user_id=user_id,  # Associate this session with the user
        metadata={"foo": "bar"},  # Optional: add session metadata
    )
  3. Zep Dialog Classification Tools
    Read more about Zep’s dialog classification on the Zep Dialog Classification Page.
@cl.step(name="session classification", type="tool")
async def classify_session(session_id: str):
    """Classify the dialog with custom instructions."""
    # Define categories for classification.
    classes = [
        "General",
        "Travel",
        "Shopping",
        "Cars",
    ]
    # Use Zep's async dialog classification with a custom instruction for this session.
    classification = await zep.memory.classify_session(
        session_id=session_id,
        name="session_classification",
        classes=classes,
        last_n=4,  # Optional: specify the number of previous messages to consider
        persist=True,
        instruction="What is the topic of this conversation? Classify it into one of the categories",
    )
    return classification
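For reference, the returned classification object exposes the selected category on its class_ field (the message handler below reads it the same way); a minimal usage sketch, with an illustrative printed value:

# Minimal usage sketch: classify the current session and read the result.
classification = await classify_session(session_id)
print(classification.class_)  # e.g. "Travel" (illustrative)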
  4. Message Handling: Store and fetch your Chainlit application's chat history in Zep's memory store to enrich the conversational context you pass to the LLM.
Discover more about Zep’s memory store capabilities on the Zep Documentation Page.
@cl.step(name="OpenAI", type="llm")
async def call_openai(session_id):
    """Invokes the OpenAI API to generate a response based on the session message history."""
    # Fetch the session's message history from Zep.
    memory = await zep.memory.get(session_id=session_id)

    # Map Zep messages to the OpenAI chat format, keeping only the role and content fields.
    chat_history = [
        {"role": m.role_type, "content": m.content} for m in memory.messages
    ]

    # Generate a response from OpenAI using the session history.
    response = await openai_client.chat.completions.create(
        model="gpt-4",
        temperature=0.1,
        messages=chat_history,
    )
    return response.choices[0].message


@cl.on_message
async def on_message(message: cl.Message):
    """Processes each incoming message, integrates with OpenAI for a response, and updates Zep memory."""
    session_id = cl.user_session.get("session_id")

    # Classify the user message to give the LLM semantic insight into what the request is about.
    classify_sess = await classify_session(session_id)

    # Store the incoming message in Zep's session memory, appending the classified topic.
    await zep.memory.add(
        session_id,
        messages=[
            Message(
                role="user",
                role_type="user",
                content=message.content + "\nconversation_topic: " + classify_sess.class_,
            )
        ],
    )

    # Retrieve a response from the OpenAI model.
    response_message = await call_openai(session_id)

    # Send the generated response back through Chainlit.
    msg = cl.Message(author="Answer", content=response_message.content)
    await msg.send()

    # Update Zep's session memory with the assistant's response for continuity.
    await zep.memory.add(
        session_id,
        messages=[
            Message(
                role="assistant",
                role_type="assistant",
                content=response_message.content,
            )
        ],
    )
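To double-check what has been persisted, you can read the session history back from Zep outside the chat flow; a minimal sketch using zep.memory.get, with a hypothetical helper name and illustrative printing:

# Illustrative helper: read the stored session history back from Zep and print it.
async def print_session_history(session_id: str):
    memory = await zep.memory.get(session_id=session_id)
    for m in memory.messages:
        print(f"{m.role_type}: {m.content}")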
  5. To access your LLM session data, navigate to the Zep Cloud Console, select a session, and review all the associated session data and logs.
Zep Cloud session console example

In conclusion, integrating Zep Cloud with Chainlit empowers developers to create conversational AI applications that are more intelligent, context-aware, and efficient.