Quick Start
Pre-built chat agent by Lyzr
Getting Started with Lyzr Chat Agent SDK
Lyzr’s Chat Agent is powered by a state-of-the-art chatbot architecture that abstracts away the complexity of building an advanced LLM-powered chatbot. This lets developers focus on data quality, prompt quality, and the application use case instead of spending countless hours stitching together the various building blocks and indexes of a backend RAG pipeline.
Lyzr’s Chat Agent integrates all the building blocks of a chatbot
What methods does Lyzr’s ChatBot class expose, and what arguments can you pass to them?
Methods
Method | What it does |
---|---|
`pdf_chat` | Chat with PDF documents |
`website_chat` | Automatically scrape website content and chat with the website data |
`docx_chat` | Chat with Microsoft Word documents |
`txt_chat` | Chat with flat text files |
`youtube_chat` | Chat with YouTube content (videos must have transcriptions) |
`webpage_chat` | Automatically scrape a single webpage and chat with its content |
Chat with PDF
Sample Code 👇
```python
import os
from lyzr import ChatBot

# Set your OpenAI API key
os.environ["OPENAI_API_KEY"] = "sk-"

# Initialize the PDF Chatbot with the path to the PDF file
chatbot = ChatBot.pdf_chat(
    input_files=["PATH/TO/YOUR/PDF/FILE"],
)

# Ask a question related to the PDF content
response = chatbot.chat("Your question here")

# Print the chatbot's response
print(response.response)

# Access source nodes for additional information
for n, source in enumerate(response.source_nodes):
    print(f"Source {n+1}")
    print(source.text)
```
Types of Arguments
```python
pdf_chat(
    input_dir: Optional[str] = None,
    input_files: Optional[List] = None,
    exclude_hidden: bool = True,
    filename_as_id: bool = True,
    recursive: bool = True,
    required_exts: Optional[List[str]] = None,
    system_prompt: str = None,
    query_wrapper_prompt: str = None,
    embed_model: Union[str, EmbedType] = "default",
    llm_params: dict = None,
    vector_store_params: dict = None,
    service_context_params: dict = None,
    chat_engine_params: dict = None,
    retriever_params: dict = None,
)
```
- `input_dir`: Parse all the .pdf files from a directory.
- `input_files`: Pass a list of .pdf file paths.
- `exclude_hidden`: Set to `True` to ignore hidden files when using `input_dir`.
- `filename_as_id`: Set to `True` to use the filename as the ID for indexing the parsed data.
- `recursive`: Set to `True` to parse files from all subdirectories.
- `system_prompt`: System-wide prompt prepended to all input prompts, used to guide system “decision making”.
- `query_wrapper_prompt`: A specific wrapper instruction for passed-in input queries.
- `embed_model`: The default embed model is OpenAI `text-embedding-ada-002`. The default fallback model is `bge` from Hugging Face.
- `llm_params`: The default language model is OpenAI `gpt-4-0125-preview`. The default temperature is 0.
- `vector_store_params`: The default vector store is Embedded Weaviate DB.
- `service_context_params`: The default `chunk_size` is 1024 tokens with an overlap of 20 tokens.
- `chat_engine_params`: Default is `None`.
- `retriever_params`: Default is `None`.
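To make the `chunk_size` and overlap defaults concrete, here is a minimal, library-free sketch of overlapping chunking. The function name and the small sizes are illustrative only; Lyzr’s actual splitter operates on tokens inside the underlying service context.

```python
def chunk_tokens(tokens, chunk_size=1024, overlap=20):
    """Split a token list into overlapping chunks (illustrative sketch)."""
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(tokens), step):
        chunks.append(tokens[start:start + chunk_size])
        if start + chunk_size >= len(tokens):
            break
    return chunks

# With chunk_size=20 and overlap=5, consecutive chunks share 5 tokens.
chunks = chunk_tokens(list(range(50)), chunk_size=20, overlap=5)
```

The overlap means the tail of each chunk is repeated at the head of the next, so sentences that straddle a chunk boundary still appear intact in at least one chunk.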
Integrations
Vector Store Integrations
Lyzr + Weaviate
Embedded Weaviate DB is the default vector store, so no extra configuration is required to use it.
Lyzr + Supabase Pgvector
Install vecs and supabase
```shell
pip install vecs supabase
```
```python
vector_store_params = {
    "vector_store_type": "SupabaseVectorStore",
    "postgres_connection_string": "postgresql://<user>:<password>@<host>:<port>/<db_name>",
    "collection_name": "base_demo",
}
```
Lyzr + Qdrant Vector Store
```shell
pip install -U qdrant_client
```
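The page does not show the Qdrant parameters. By analogy with the Supabase example above, a hedged sketch might look like the following; the `url` and `collection_name` keys are assumptions, not a confirmed schema, so check the Lyzr and Qdrant documentation before use.

```python
# Illustrative only: key names are assumed by analogy with the
# SupabaseVectorStore example; verify against the Lyzr docs.
vector_store_params = {
    "vector_store_type": "QdrantVectorStore",
    "url": "http://localhost:6333",
    "collection_name": "base_demo",
}
```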
Lyzr + LanceDB Vector Store
```shell
pip install -U lancedb
```
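As with Qdrant, the LanceDB parameters are not shown here. A hedged sketch, following the same `vector_store_params` pattern, might be the following; the `uri` and `table_name` keys are assumptions based on LanceDB’s local, file-backed design, not a confirmed schema.

```python
# Illustrative only: key names are assumed by analogy with the
# SupabaseVectorStore example; verify against the Lyzr docs.
vector_store_params = {
    "vector_store_type": "LanceDBVectorStore",
    "uri": "/tmp/lancedb",
    "table_name": "vectors",
}
```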
Lyzr + Azure Cognitive Search
```shell
pip install azure-search-documents==11.4.0 azure-identity
```