Give your LangGraph agents persistent semantic memory that survives across sessions. Store data in cognee’s knowledge graph and retrieve it via natural language—no manual state management required.
Before using the integration, configure your environment variables:
```shell
export OPENAI_API_KEY="your-openai-api-key-here"  # for LangGraph
export LLM_API_KEY="your-openai-api-key-here"     # for cognee
export LLM_MODEL="gpt-4o-mini"
```
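A missing variable typically surfaces later as an opaque API error, so it can help to fail fast at startup. This is a hypothetical helper sketch, not part of the integration:

```python
import os

# Variables the integration expects (see the export commands above)
REQUIRED = ("OPENAI_API_KEY", "LLM_API_KEY", "LLM_MODEL")

def missing_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]
```

Call `missing_vars()` before constructing the agent and raise if the returned list is non-empty.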
Add memory tools to your LangGraph agent:
```python
from langgraph.prebuilt import create_react_agent
from cognee_integration_langgraph import get_sessionized_cognee_tools
from langchain_core.messages import HumanMessage

# Get memory tools
add_tool, search_tool = get_sessionized_cognee_tools()

# Create agent with memory
agent = create_react_agent(
    "openai:gpt-4o-mini",
    tools=[add_tool, search_tool],
)

# Store and retrieve information
response = agent.invoke({
    "messages": [
        HumanMessage(content="Remember: Acme Corp, healthcare, $1.2M contract"),
        HumanMessage(content="What healthcare contracts do we have?"),
    ],
})
```
Build domain knowledge incrementally over multiple sessions:
```python
import cognee

# Add knowledge through agent sessions
for doc in knowledge_base:
    agent.invoke({"messages": [HumanMessage(content=f"Learn: {doc}")]})

# Add multiple documents directly via cognee
# (await requires an async context, e.g. an async function run with asyncio.run)
for doc_path in document_paths:
    with open(doc_path, "r") as f:
        content = f.read()
    await cognee.add(content)
await cognee.cognify()

# Query across everything stored so far
response = agent.invoke({
    "messages": [HumanMessage(content="Find information about contract terms")]
})
```
Context-Aware Assistance
Maintain user context across work sessions:
```python
# Monday
agent.invoke({"messages": [HumanMessage(content="Debugging payment flow")]})

# Wednesday
agent.invoke({"messages": [HumanMessage(content="What was I debugging?")]})
```
Multi-Tenant Applications
Isolate data per user/organization while sharing global knowledge:
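The integration's own multi-tenant API is not shown here, so as a conceptual sketch of the isolation pattern only, in plain Python with no cognee calls and with all names invented for illustration:

```python
class TenantMemory:
    """Toy model: per-tenant fact stores layered over shared global facts."""

    def __init__(self):
        self.global_facts = []        # visible to every tenant
        self.tenant_facts = {}        # tenant_id -> private fact list

    def add(self, fact, tenant_id=None):
        # No tenant_id means the fact goes into the shared global layer
        if tenant_id is None:
            self.global_facts.append(fact)
        else:
            self.tenant_facts.setdefault(tenant_id, []).append(fact)

    def search(self, query, tenant_id):
        # A tenant sees its own facts plus the shared layer, never other tenants'
        scope = self.global_facts + self.tenant_facts.get(tenant_id, [])
        return [f for f in scope if query.lower() in f.lower()]
```

In the real integration, the sessionized tools play the role of this scoping: each session gets its own namespace while shared knowledge remains queryable.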