Use AWS Bedrock LLMs with Cognee through LiteLLM proxy
Integrate AWS Bedrock LLMs with Cognee using the LiteLLM Proxy (rather than the LiteLLM SDK) to access Anthropic Claude, Amazon Titan, and other Bedrock models. The proxy runs as a standalone server that Cognee connects to, providing a single OpenAI-compatible interface for all Bedrock models.
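Before running Cognee, the proxy has to be started and Cognee pointed at it. A minimal sketch of that setup is below; the model alias `bedrock-claude`, the Bedrock model ID, the region, and the exact Cognee environment variable names are assumptions — check your Cognee and LiteLLM versions for the exact names they expect.

```shell
# 1. LiteLLM proxy config (save as config.yaml); the alias and model ID are examples
cat > config.yaml <<'EOF'
model_list:
  - model_name: bedrock-claude
    litellm_params:
      model: bedrock/anthropic.claude-3-sonnet-20240229-v1:0
      aws_region_name: us-east-1   # assumes AWS credentials are configured locally
EOF

# 2. Start the proxy (serves an OpenAI-compatible API, port 4000 by default)
litellm --config config.yaml

# 3. Point Cognee at the proxy (env var names assumed from Cognee's config)
export LLM_PROVIDER="custom"
export LLM_ENDPOINT="http://localhost:4000"
export LLM_MODEL="bedrock-claude"
export LLM_API_KEY="sk-anything"   # only checked if the proxy sets a master key
```

With the proxy running and these variables exported, the Python example below works unchanged, since Cognee routes all LLM calls through the proxy endpoint.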
```python
import cognee
import asyncio

async def main():
    # Add text to cognee
    await cognee.add("Natural language processing (NLP) is an interdisciplinary subfield of computer science and information retrieval.")

    # Generate the knowledge graph
    await cognee.cognify()

    # Query the knowledge graph
    results = await cognee.search("Tell me about NLP")

    # Display the results
    for result in results:
        print(result)

if __name__ == '__main__':
    asyncio.run(main())
```