Autogen Memory Integration

Add persistent memory to Microsoft Autogen agents using the zep-autogen package.

The zep-autogen package provides seamless integration between Zep and Microsoft Autogen agents. Simply add Zep memory to any Autogen agent to enable persistent conversation memory and intelligent context retrieval.

Install dependencies

$ pip install zep-autogen zep-cloud

Setup and usage

import os
import uuid

from autogen_agentchat.agents import AssistantAgent
from autogen_core.memory import MemoryContent, MemoryMimeType
from autogen_ext.models.openai import OpenAIChatCompletionClient
from zep_autogen import ZepMemory
from zep_cloud.client import AsyncZep

# Note: the await calls below assume you are inside an async function
# (for example, a main() coroutine run with asyncio.run(main())).

# Initialize Zep and create memory
zep_client = AsyncZep(api_key=os.environ.get("ZEP_API_KEY"))
await zep_client.user.add(user_id="user_123")

memory = ZepMemory(
    client=zep_client,
    user_id="user_123",
    thread_id=str(uuid.uuid4())
)

# Add memory to any Autogen agent
agent = AssistantAgent(
    name="Assistant",
    model_client=OpenAIChatCompletionClient(
        model="gpt-4o-mini",
        api_key=os.environ.get("OPENAI_API_KEY")
    ),
    system_message="You are a helpful assistant.",
    memory=[memory]  # ← This gives the agent access to user memory
)

# Start a conversation - add the user message to memory first
user_message = "Hi, my name is Alice and I love hiking. What outdoor activities would you recommend for someone like me?"
await memory.add(MemoryContent(
    content=user_message,
    mime_type=MemoryMimeType.TEXT,
    metadata={"type": "message", "role": "user", "name": "Alice"}
))

# Run the agent
response = await agent.run(task=user_message)
agent_response = response.messages[-1].content
print(f"Agent: {agent_response}")

# Add the agent's response to memory
await memory.add(MemoryContent(
    content=agent_response,
    mime_type=MemoryMimeType.TEXT,
    metadata={"type": "message", "role": "assistant", "name": "Assistant"}
))

Adding data to memory

Use memory.add() to manually store important information:

from autogen_core.memory import MemoryContent, MemoryMimeType

# Store additional user information
await memory.add(MemoryContent(
    content="Alice prefers weekend outdoor activities and lives near the mountains",
    mime_type=MemoryMimeType.TEXT,
    metadata={"type": "message", "role": "user", "name": "Alice"}
))

# Store structured data about Alice
await memory.add(MemoryContent(
    content='{"user_profile": {"name": "Alice", "interests": ["hiking", "rock climbing"], "experience_level": "beginner", "location": "mountain region"}}',
    mime_type=MemoryMimeType.JSON,
    metadata={"type": "data", "role": "assistant", "name": "Assistant"}
))

Query memory

Use memory.query() to perform intelligent queries across the user’s knowledge graph:

# Query edges (facts) in the user's knowledge graph
results = await memory.query(
    query="What does the user like to do?",
    scope="edges"
)
print(f"Edge results: {results}")

# Query entities in the user's knowledge graph
results = await memory.query(
    query="What do I know about this user's hobbies?",
    scope="entities"
)
print(f"Entity results: {results}")

# Query episodes - the raw messages and data added to the user's graph
results = await memory.query(
    query="Common outdoor activities mentioned",
    scope="episodes"
)
print(f"Episode results: {results}")
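
Rather than printing the raw return value, you can inspect individual matches. Assuming ZepMemory follows Autogen's memory interfaces and returns a MemoryQueryResult (the container defined in autogen_core.memory), its results attribute holds a list of MemoryContent items; verify the exact return type against your zep-autogen version.

# Inspect individual results instead of the raw container
for item in results.results:
    print(item.content)
    print(item.metadata)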