Class: LangChat
Main interface for using LangChat.
Constructor
Parameters:
- LLM provider instance (e.g., OpenAI, Anthropic, Gemini)
- Vector database provider (e.g., Pinecone)
- Database provider (e.g., Supabase)
- Optional reranker provider (defaults to Flashrank)
- Custom system prompt template
- Custom standalone question prompt
- Enable verbose logging
- Maximum chat history messages to keep
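A minimal construction sketch is shown below. The reference above lists only parameter descriptions, so every keyword name here (llm, vector_store, database, reranker, system_prompt, condense_question_prompt, verbose, max_history) is an assumption for illustration, and the provider objects are placeholders; see Quick Start for the exact signature.

```python
# Illustrative sketch only — keyword names are assumptions, not the
# documented signature. Provider objects are placeholders; construct real
# providers as shown in the Quick Start guide.
from langchat import LangChat  # import path assumed

openai_provider = ...    # LLM provider instance (e.g., OpenAI, Anthropic, Gemini)
pinecone_provider = ...  # vector database provider (e.g., Pinecone)
supabase_provider = ...  # database provider (e.g., Supabase)

chat_app = LangChat(
    llm=openai_provider,
    vector_store=pinecone_provider,
    database=supabase_provider,
    reranker=None,                  # optional; defaults to Flashrank
    system_prompt=None,             # custom system prompt template
    condense_question_prompt=None,  # custom standalone question prompt
    verbose=True,                   # enable verbose logging
    max_history=20,                 # maximum chat history messages to keep
)
```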
Methods
async chat()
Process a chat query.
Parameters:
- User query text
- User identifier for session management
- Domain for conversation context
Returns a dict with:
- response (str): AI response
- status (str): "success" or "error"
- response_time (float): Time in seconds
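A hedged usage sketch follows. The argument names (query, user_id, domain) are assumptions, since the reference gives descriptions only; the result keys (response, status, response_time) are the ones documented above.

```python
import asyncio

async def ask(chat_app) -> None:
    # Argument names below are assumptions; only the return keys are documented.
    result = await chat_app.chat(
        query="What does the refund policy cover?",  # user query text
        user_id="user-123",                          # user identifier for session management
        domain="support",                            # domain for conversation context
    )
    if result["status"] == "success":
        print(result["response"])
        print(f"answered in {result['response_time']:.2f}s")
    else:
        print("chat returned an error")

# Run with a LangChat instance constructed as in the Constructor sketch:
# asyncio.run(ask(chat_app))
```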
chat_sync()
Synchronous version of chat().
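For scripts without an event loop, a sketch of the synchronous form, using the same assumed argument names as the chat() example above:

```python
# Same assumed arguments and documented return shape as chat(), but callable
# without asyncio. chat_app is the instance from the Constructor sketch.
result = chat_app.chat_sync(
    query="What does the refund policy cover?",
    user_id="user-123",
    domain="support",
)
print(result["status"], result["response"])
```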
load_and_index_documents()
Index documents into the Pinecone vector store.
Returns a dict with:
- chunks_indexed (int): Number of chunks indexed
- chunks_skipped (int): Number of duplicates skipped
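A sketch of indexing a single document: the file-path argument and the synchronous call are assumptions, while the returned keys are those documented above.

```python
# Argument shape (a file path) is an assumption; only the return keys
# (chunks_indexed, chunks_skipped) are documented in the reference above.
stats = chat_app.load_and_index_documents("docs/handbook.pdf")
print(f"indexed {stats['chunks_indexed']} chunks, "
      f"skipped {stats['chunks_skipped']} duplicates")
```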
load_and_index_multiple_documents()
Index multiple documents.
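A sketch of the batch form; the list-of-paths argument is an assumption made for illustration.

```python
# Assumed to accept a collection of documents; the argument shape is illustrative.
paths = ["docs/handbook.pdf", "docs/faq.md", "docs/policies.txt"]
chat_app.load_and_index_multiple_documents(paths)
```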
Properties
- engine: Access to the LangChatEngine instance
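For lower-level control, a minimal access sketch:

```python
# Drop down to the underlying engine when the high-level API is not enough.
engine = chat_app.engine  # LangChatEngine instance
```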
Next Steps
- Quick Start - Get started
- Examples - See examples
- Configuration - Learn configuration
Built with ❤️ by NeuroBrain
