Class: LangChat
The main entry point for the LangChat SDK. Wraps LangChatEngine and returns typed ChatResponse objects.
Constructor
- LLM provider instance. From langchat.providers: OpenAI, Anthropic, Gemini, Mistral, Cohere, Ollama.
- Vector database provider. From langchat.providers: Pinecone.
- History database provider. From langchat.providers: Supabase.
- Reranker instance. Defaults to FlashrankRerankAdapter(model_name="ms-marco-MiniLM-L-12-v2", top_n=3).
- Custom system prompt. Must include {context}, {chat_history}, and {question}. See the Prompts guide.
- Custom standalone question prompt. Must include {chat_history} and {question}.
- Enable verbose LangChain logging.
- Number of recent exchanges to include in each prompt. One exchange = one (user, AI) pair.
Methods
chat()
Send a message and receive a typed response. Async.
- The user's message.
- Unique identifier for the user. Scopes conversation history.
- Namespace for the conversation. Use different values to separate conversations for the same user across multiple apps.

Returns: ChatResponse.
Example:
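A sketch of an async call. The keyword names message, user_id, and namespace are assumptions inferred from the parameter descriptions above; this reference does not confirm them.

```python
import asyncio

from langchat import LangChat  # assumed import path

async def main() -> None:
    # Configure providers as described in the Constructor section.
    bot = LangChat(...)
    response = await bot.chat(
        message="What does the refund policy say?",  # the user's message
        user_id="user-123",        # scopes conversation history
        namespace="support-app",   # separates conversations per app
    )
    print(response)  # a typed ChatResponse object

asyncio.run(main())
```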
chat_sync()
Synchronous wrapper around chat(). Blocks until the response is ready.
Takes the same arguments and returns the same ChatResponse as chat().
Example:
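Because chat_sync() blocks, it suits scripts and synchronous frameworks. A sketch under the same assumed parameter names as chat():

```python
from langchat import LangChat  # assumed import path

# Configure providers as described in the Constructor section.
bot = LangChat(...)

# Blocks until the ChatResponse is ready; no event loop required.
response = bot.chat_sync(
    message="Summarize our last conversation.",
    user_id="user-123",
    namespace="support-app",
)
```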
index()
Index documents into Pinecone. Can accept a file path, a list of paths, or a directory.
- Path to a file, a list of file paths, or a directory path.
- Maximum characters per chunk.
- Character overlap between adjacent chunks.
- Pinecone namespace to index into.
- Skip chunks that are already in Pinecone (detected by content hash).

Returns a dict with:
- chunks_indexed (int): chunks added
- chunks_skipped (int): duplicates skipped
- files_processed (int): files successfully loaded
- errors (list): files that failed to load
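A sketch of an indexing run using the default chunking settings; only the returned keys are documented above, so the call avoids assuming keyword names for the other parameters.

```python
from langchat import LangChat  # assumed import path

bot = LangChat(...)  # configure providers as in the Constructor section

# A directory path; a single file path or a list of paths also works.
result = bot.index("docs/")

print(result["chunks_indexed"], "chunks added")
print(result["chunks_skipped"], "duplicates skipped")
print(result["files_processed"], "files loaded")
for err in result["errors"]:
    print("failed to load:", err)
```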
get_session()
Get or create a UserSession for a user.
load_env() (class method)
Load a .env file into the environment. Call this before creating LangChat.
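Since load_env() is a class method, call it on the class before constructing an instance. Whether it accepts a path argument is not stated here, so this sketch uses the no-argument form:

```python
from langchat import LangChat  # assumed import path

LangChat.load_env()  # load .env so provider API keys are in the environment
bot = LangChat(...)  # configure providers as in the Constructor section
```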
Properties
Access to the underlying LangChatEngine instance.