1. Install
- Hosted (lyzr-adk)
- Open Source (lyzr-cognis)
2. Set API Keys
- Hosted (lyzr-adk)
- Open Source (lyzr-cognis)
3. Add Memories
- Hosted (lyzr-adk)
- Open Source (lyzr-cognis)
4. Search Memories
- Hosted (lyzr-adk)
- Open Source (lyzr-cognis)
5. Get Context for Your LLM
- Hosted (lyzr-adk)
- Open Source (lyzr-cognis)
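The five steps above can be sketched end-to-end in one self-contained illustration. Note this is a conceptual stand-in, not the lyzr-adk / lyzr-cognis API: the class and method names here (`ToyMemory`, `add`, `search`, `get_context`) are hypothetical, and the "search" is plain word overlap rather than real hybrid retrieval.

```python
# Conceptual sketch of the quickstart flow. All names are hypothetical
# stand-ins, NOT the actual lyzr-adk / lyzr-cognis API.

class ToyMemory:
    """Minimal in-memory analogue: store messages, keep 'facts',
    search them, and assemble context for an LLM prompt."""

    def __init__(self):
        self.messages = []   # short-term: raw conversation turns
        self.memories = []   # long-term: extracted facts

    def add(self, role, content):
        # In Cognis, an LLM extracts and categorizes discrete facts here;
        # this sketch just stores user statements verbatim as stand-in facts.
        self.messages.append({"role": role, "content": content})
        if role == "user":
            self.memories.append(content)

    def search(self, query, top_k=3):
        # Stand-in for hybrid (vector + keyword) search: rank by word overlap.
        q = set(query.lower().split())
        scored = [(len(q & set(m.lower().split())), m) for m in self.memories]
        scored.sort(key=lambda pair: -pair[0])
        return [m for score, m in scored[:top_k] if score > 0]

    def get_context(self, query):
        # Combine recent short-term messages with matching long-term
        # memories, ready to prepend to an LLM prompt.
        return {"messages": self.messages[-5:], "memories": self.search(query)}


mem = ToyMemory()
mem.add("user", "I love hiking in the mountains")
mem.add("assistant", "That sounds great!")
mem.add("user", "My name is Ada and I prefer short replies")
ctx = mem.get_context("what does the user love doing")
print(ctx["memories"])  # → ['I love hiking in the mountains']
```

The real library replaces the word-overlap search with embedded, indexed facts and hybrid retrieval, but the add → search → get-context shape of the flow is the same.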
What Just Happened
- Cognis stored your raw messages
- The LLM extracted discrete facts and auto-categorized them (identity, preferences, interests, etc.)
- Facts were embedded and indexed for hybrid search (vector + keyword)
- Search used Reciprocal Rank Fusion: 70% vector similarity + 30% BM25 keyword matching
- `context/get_context` assembled both short-term messages and long-term memories for LLM use
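The 70/30 fusion step can be sketched as weighted Reciprocal Rank Fusion. This is a minimal illustration, assuming the common RRF smoothing constant k = 60; the library's actual implementation details may differ.

```python
def weighted_rrf(vector_ranked, keyword_ranked, w_vec=0.7, w_kw=0.3, k=60):
    """Fuse two best-first ranked lists of doc ids with weighted
    Reciprocal Rank Fusion: score(d) = sum of w / (k + rank), 1-indexed."""
    scores = {}
    for weight, ranking in ((w_vec, vector_ranked), (w_kw, keyword_ranked)):
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + weight / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)


# Vector search and BM25 disagree completely; with a 0.7 weight,
# the fused order follows the vector ranking.
fused = weighted_rrf(["a", "b", "c"], ["c", "b", "a"])
print(fused)  # → ['a', 'b', 'c']
```

Rank-based fusion like this is robust because it ignores the incomparable raw scores of the two retrievers and combines only their orderings, with the weights controlling which ordering dominates.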
Next Steps
- Configuration: API keys, init params, search weight tuning
- Add Memories: full method reference for storing conversations
- Features: hybrid search, categories, session management, and more
- Cookbooks: CrewAI, LangChain, LangGraph, and Agno integrations