Set up your environment and install Cognee to start building AI memory.
Python 3.9 – 3.12 is required to run Cognee.

Prerequisites

  • We recommend creating a .env file in your project root
  • Cognee supports many configuration options, and a .env file keeps them organized
You have two main options for configuring LLM and embedding providers:

Option 1: OpenAI (Simplest)
  • Single API key handles both LLM and embeddings
  • Uses gpt-4o-mini for LLM and text-embedding-3-small for embeddings by default
  • Works out of the box with minimal configuration
Option 2: Other Providers
  • Configure both LLM and embedding providers separately
  • Supports Gemini, Anthropic, Ollama, and more
  • Requires setting both LLM_* and EMBEDDING_* variables
By default, Cognee uses OpenAI for both the LLM and embeddings. If you change the LLM provider but don’t configure embeddings, the embedding provider will still default to OpenAI.
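As a sketch, an `.env` file for the two options might look like the following. The variable names follow the `LLM_*` / `EMBEDDING_*` pattern described above; the exact set of supported variables may vary by Cognee version, so treat the Option 2 values as illustrative:

```shell
# Option 1: OpenAI only — a single key covers both LLM and embeddings
LLM_API_KEY="sk-..."

# Option 2: separate providers (illustrative values — check your
# Cognee version's configuration reference for exact names)
# LLM_PROVIDER="anthropic"
# LLM_MODEL="claude-3-5-sonnet"
# LLM_API_KEY="..."
# EMBEDDING_PROVIDER="openai"
# EMBEDDING_MODEL="text-embedding-3-small"
# EMBEDDING_API_KEY="sk-..."
```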
  • We recommend using uv for virtual environment management
  • Run the following commands to create and activate a virtual environment:
uv venv && source .venv/bin/activate
  • A running PostgreSQL server is required only if you plan to use PostgreSQL as your relational database (install Cognee with the postgres extra)

Setup
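Cognee is distributed on PyPI, so inside the activated virtual environment a standard install is enough. A minimal sketch, using the `uv pip` frontend recommended above (plain `pip` works the same way); the `postgres` extra corresponds to the PostgreSQL prerequisite mentioned earlier:

```shell
# Install Cognee into the active virtual environment
uv pip install cognee

# Or, with PostgreSQL support (needed only for the postgres relational backend)
uv pip install "cognee[postgres]"
```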

Next Steps

Run Your First Example

Quickstart Tutorial — Get started with Cognee by running your first knowledge graph example.

Explore Advanced Features

Core Concepts — Dive deeper into Cognee’s powerful features and capabilities.