Unlocking mem0 Memory for Anthropic Claude Bot: A Step-by-Step Coding Guide
This tutorial explains how to build a memory-enabled chatbot combining Anthropic's Claude model with mem0 for context-rich conversations in Google Colab.
Setting Up a Memory-Enabled Claude Chatbot in Google Colab
This tutorial guides you through building a fully functional chatbot using Anthropic's Claude model integrated with mem0 memory for seamless recall. The combination of LangGraph's state-machine orchestration and mem0's vector-based memory store enables the assistant to remember previous conversations and retrieve relevant information on demand, ensuring natural conversational continuity.
Installing Required Libraries
Begin by installing and upgrading all necessary libraries: LangGraph, the mem0 AI client, LangChain with its Anthropic connector, and the Anthropic SDK. This ensures version compatibility and a smooth setup:
!pip install -qU langgraph mem0ai langchain langchain-anthropic anthropic
Importing Core Modules
We import the essential modules: os for API key management, typing utilities, LangGraph for chat-flow orchestration, LangChain message classes for prompt construction, the ChatAnthropic wrapper for Claude interaction, and the mem0 client for persistent memory.
import os
from typing import Annotated, TypedDict, List
from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from langchain_core.messages import SystemMessage, HumanMessage, AIMessage
from langchain_anthropic import ChatAnthropic
from mem0 import MemoryClient
Configuring API Keys
Set your Anthropic API key in the environment and your mem0 API key in a variable, so requests are authenticated without hardcoding sensitive information throughout the notebook.
os.environ["ANTHROPIC_API_KEY"] = "Use Your Own API Key"
MEM0_API_KEY = "Use Your Own API Key"
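In Colab, you can avoid pasting keys into the notebook source altogether by prompting for them at runtime; a minimal sketch using Python's getpass (the prompt strings are illustrative):
# Optional: read keys interactively instead of hardcoding them
from getpass import getpass
os.environ["ANTHROPIC_API_KEY"] = getpass("Anthropic API key: ")
MEM0_API_KEY = getpass("mem0 API key: ")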
Initializing Chat Model and Memory Client
Create an instance of ChatAnthropic configured with Claude 3.5 Haiku and initialize the mem0 MemoryClient for vector-based memory storage.
llm = ChatAnthropic(
    model="claude-3-5-haiku-latest",
    temperature=0.0,
    max_tokens=1024,
    anthropic_api_key=os.environ["ANTHROPIC_API_KEY"]
)
mem0 = MemoryClient(api_key=MEM0_API_KEY)
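Before wiring up the graph, it can help to confirm the Claude client responds; a quick optional check (the prompt text is just illustrative):
# Smoke test: confirm the Claude client is authenticated and reachable
resp = llm.invoke("Reply with OK if you can read this.")
print(resp.content)  # expect a short acknowledgement from Claude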
Defining Conversational State and Chatbot Logic
Define a TypedDict for the chat state, holding the message list and user ID. The chatbot function queries mem0 for memories relevant to the latest user message, builds a context-aware system prompt, invokes Claude to generate a reply, and stores the new exchange back into mem0.
class State(TypedDict):
    messages: Annotated[List[HumanMessage | AIMessage], add_messages]
    mem0_user_id: str

graph = StateGraph(State)

def chatbot(state: State):
    messages = state["messages"]
    user_id = state["mem0_user_id"]
    # Retrieve memories relevant to the latest user message
    memories = mem0.search(messages[-1].content, user_id=user_id)
    context = "\n".join(f"- {m['memory']}" for m in memories)
    system_message = SystemMessage(content=(
        "You are a helpful customer support assistant. "
        "Use the context below to personalize your answers:\n" + context
    ))
    full_msgs = [system_message] + messages
    ai_resp: AIMessage = llm.invoke(full_msgs)
    # Store the new exchange so future turns can recall it
    mem0.add(
        f"User: {messages[-1].content}\nAssistant: {ai_resp.content}",
        user_id=user_id
    )
    return {"messages": [ai_resp]}
Setting Up LangGraph Execution Flow
Add the chatbot as a node and wire the edges that drive the chat flow. The edge from START hands each new input to the chatbot, and the self-edge keeps the graph ready for the next turn; because the driver function below returns after the first response, each input is processed exactly once.
graph.add_node("chatbot", chatbot)
graph.add_edge(START, "chatbot")
# Self-edge: the driver function below returns after the first response,
# so this loop never runs unbounded
graph.add_edge("chatbot", "chatbot")
compiled_graph = graph.compile()
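If you want to sanity-check the wiring, LangGraph can render the compiled topology as Mermaid text with no extra dependencies:
# Print a Mermaid description of the compiled graph
print(compiled_graph.get_graph().draw_mermaid())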
Running the Conversational Loop
Define a function that streams a user message through the compiled graph and prints the assistant's reply. A simple REPL loop then allows continuous chatting until the user exits.
def run_conversation(user_input: str, mem0_user_id: str):
    config = {"configurable": {"thread_id": mem0_user_id}}
    state = {"messages": [HumanMessage(content=user_input)], "mem0_user_id": mem0_user_id}
    for event in compiled_graph.stream(state, config):
        for node_output in event.values():
            if node_output.get("messages"):
                print("Assistant:", node_output["messages"][-1].content)
                # Return after the first response so the self-loop stops here
                return
if __name__ == "__main__":
    print("Welcome! (type 'exit' to quit)")
    mem0_user_id = "customer_123"
    while True:
        user_in = input("You: ")
        if user_in.lower() in ["exit", "quit", "bye"]:
            print("Assistant: Goodbye!")
            break
        run_conversation(user_in, mem0_user_id)
This architecture lets the chatbot recall user-specific information and deliver personalized, context-rich conversations by integrating Anthropic's Claude with mem0 memory and orchestrating the flow with LangGraph in Google Colab.
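Because memories are keyed by user_id, switching the ID gives each customer an isolated history. A quick experiment with hypothetical IDs (recall depends on mem0 having processed the first exchange):
run_conversation("My name is Alice and I'm on the Pro plan.", "customer_alice")
run_conversation("Which plan am I on?", "customer_alice")  # should recall the Pro plan
run_conversation("Which plan am I on?", "customer_bob")    # Bob shares no memory with Alice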
Explore further by experimenting with advanced memory retrieval, prompt tuning, or expanding the chatbot's capabilities with additional integrations.
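For instance, you can tune how much retrieved context feeds the prompt by capping the number of results; a minimal sketch, assuming the hosted client's search accepts a limit argument:
# Fetch only the three most relevant memories (limit is assumed supported by MemoryClient.search)
memories = mem0.search("Which plan am I on?", user_id="customer_123", limit=3)
print(memories)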
Check out the Colab Notebook linked in the original article for hands-on experience.