LangChain provides powerful chain composition (LCEL) and prompt management. Cognis adds persistent, searchable memory that survives beyond a single session or chain invocation, giving your chains true long-term memory.

**What you'll build:** A personal tutor chatbot that remembers each student's learning style, progress, and preferences across sessions, using an LCEL chain with Cognis memory.

**Integration pattern:** An LCEL chain with memory injection via `MessagesPlaceholder`. Retrieve memory before the chain runs, store the exchange after it responds.
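The retrieve-before, store-after loop can be sketched with a toy in-memory store. Note that `MemoryStore`, `chat_turn`, and the `respond` callback here are hypothetical stand-ins to illustrate the pattern, not the Cognis API:

```python
# Toy illustration of the retrieve-before, store-after pattern.
# MemoryStore is a hypothetical stand-in for Cognis, not its real API.
class MemoryStore:
    def __init__(self):
        self.facts = {}  # owner_id -> list of remembered facts

    def retrieve(self, owner_id):
        return self.facts.get(owner_id, [])

    def store(self, owner_id, fact):
        self.facts.setdefault(owner_id, []).append(fact)


def chat_turn(store, owner_id, user_input, respond):
    memory_context = store.retrieve(owner_id)   # 1. retrieve before
    reply = respond(user_input, memory_context)  # 2. run the chain
    store.store(owner_id, user_input)            # 3. store after
    return reply


store = MemoryStore()
reply = chat_turn(
    store, "student_001", "I'm a visual learner.",
    lambda msg, mem: f"Noted ({len(mem)} prior facts).",
)
```

In the real integration, step 2 is your LCEL chain invocation and steps 1 and 3 are Cognis calls; the ordering is the important part.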
```python
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

prompt = ChatPromptTemplate.from_messages([
    ("system",
     "You are a helpful personal tutor. Adapt your teaching style based on "
     "what you know about the student from past interactions."),
    MessagesPlaceholder(variable_name="memory_context"),
    MessagesPlaceholder(variable_name="chat_history"),
    ("human", "{input}"),
])

chain = prompt | llm
```
Hosted (lyzr-adk)

```python
from langchain_core.messages import AIMessage, HumanMessage

chat_history = []
response = chat(
    "I'm a visual learner and prefer examples over theory.",
    chat_history,
    owner_id="student_001",
    session_id="session_1",
)
chat_history.append(HumanMessage(content="I'm a visual learner..."))
chat_history.append(AIMessage(content=response))

response = chat(
    "Teach me about list comprehensions in Python.",
    chat_history,
    owner_id="student_001",
    session_id="session_1",
)

# New session — memory recalls preferences automatically
response = chat(
    "I'd like to learn about decorators today.",
    chat_history=[],
    owner_id="student_001",
    session_id="session_2",
)
```
Open Source (lyzr-cognis)

```python
from langchain_core.messages import AIMessage, HumanMessage

chat_history = []
response = chat(
    "I'm a visual learner and prefer examples over theory.",
    chat_history,
    session_id="session_1",
)
chat_history.append(HumanMessage(content="I'm a visual learner..."))
chat_history.append(AIMessage(content=response))

response = chat(
    "Teach me about list comprehensions in Python.",
    chat_history,
    session_id="session_1",
)

# New session — memory recalls preferences automatically
m.set_session("session_2")
response = chat(
    "I'd like to learn about decorators today.",
    chat_history=[],
    session_id="session_2",
)

m.close()
```
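Why does `session_2` recall preferences stated in `session_1`? Because long-term memory is keyed by the owner, not the session. A minimal sketch of that idea, using plain dicts rather than Cognis's real storage model (all names here are hypothetical):

```python
# Hypothetical sketch: long-term memory outlives sessions because it is
# keyed by owner_id, while short-term memory is scoped per session.
long_term = {}   # owner_id -> facts (survives across sessions)
short_term = {}  # (owner_id, session_id) -> messages (per session)


def remember(owner_id, session_id, message):
    short_term.setdefault((owner_id, session_id), []).append(message)
    long_term.setdefault(owner_id, []).append(message)


def recall(owner_id, session_id):
    # Short-term lookup is scoped to this session; long-term crosses sessions.
    return {
        "short_term": short_term.get((owner_id, session_id), []),
        "long_term": long_term.get(owner_id, []),
    }


remember("student_001", "session_1", "visual learner; prefers examples")
ctx = recall("student_001", "session_2")
# session_2 starts with empty short-term history,
# but the long-term facts carry over
```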
`cog.context()` is the hosted API method; the open-source equivalent is `m.get_context()`. Both return assembled short-term and long-term context.
Hosted (lyzr-adk)
Open Source (lyzr-cognis)
```python
context = cog.context(
    current_messages=[
        CognisMessage(role="user", content="Teach me about decorators"),
    ],
    owner_id="student_001",
    session_id="session_2",
    max_short_term_messages=20,
    enable_long_term_memory=True,
    cross_session=True,
)
```
```python
ctx = m.get_context(
    messages=[{"role": "user", "content": "Teach me about decorators"}],
    max_short_term=20,
    include_long_term=True,
)

# Use ctx["context_string"] in your prompt
```
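Either way, the assembled context ultimately becomes text you splice into the prompt. A sketch of what that assembly might look like, assuming `assemble_context` is a hypothetical helper and not the library's actual implementation:

```python
# Hypothetical sketch: flatten long-term facts and short-term messages
# into a single context string for the system prompt.
def assemble_context(short_term, long_term):
    lines = ["Known about this student:"]
    lines += [f"- {fact}" for fact in long_term]
    lines += ["Recent conversation:"]
    lines += [f"{m['role']}: {m['content']}" for m in short_term]
    return "\n".join(lines)


context_string = assemble_context(
    short_term=[{"role": "user", "content": "Teach me about decorators"}],
    long_term=["visual learner", "prefers examples over theory"],
)
system_prompt = "You are a helpful personal tutor.\n" + context_string
```

In the LCEL chain above, this string would feed the `memory_context` placeholder (as messages) so the model sees the student's profile before each turn.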