Class: LangChat
The primary interface for using LangChat. Simplifies initialization and chat operations.
Constructor
LangChat(config: Optional[LangChatConfig] = None)
Creates a new LangChat instance.
Parameters:
config (LangChatConfig | None, default: None)
LangChat configuration. If None, the config is created from environment variables via LangChatConfig.from_env().
Example:
from langchat import LangChat, LangChatConfig

# With custom config
config = LangChatConfig(
    openai_api_keys=["sk-..."],
    pinecone_api_key="pcsk-...",
    pinecone_index_path="my-index",
    supabase_url="https://xxxxx.supabase.co",
    supabase_key="eyJ..."
)
langchat = LangChat(config=config)

# Or from environment variables
langchat = LangChat()  # Uses LangChatConfig.from_env()
All adapters are automatically initialized when LangChat is created.
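To show the shape of the environment-based path, here is a hypothetical sketch of what a from_env()-style constructor typically does. The class and environment variable names below are illustrative assumptions, not LangChat's actual internals; the real LangChatConfig.from_env() may read different variables.

```python
import os
from dataclasses import dataclass

# Hypothetical sketch of the from_env() pattern; the real
# LangChatConfig.from_env() may read different variable names.
@dataclass
class ConfigSketch:
    openai_api_key: str
    pinecone_api_key: str

    @classmethod
    def from_env(cls) -> "ConfigSketch":
        # os.environ[...] raises KeyError early if a required variable is missing
        return cls(
            openai_api_key=os.environ["OPENAI_API_KEY"],
            pinecone_api_key=os.environ["PINECONE_API_KEY"],
        )

# Demo values so the sketch is self-contained
os.environ["OPENAI_API_KEY"] = "sk-demo"
os.environ["PINECONE_API_KEY"] = "pcsk-demo"
cfg = ConfigSketch.from_env()
```

Failing fast on missing variables at construction time keeps configuration errors out of the request path.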
Methods
async chat()
Process a chat query asynchronously.
async def chat(
    self,
    query: str,
    user_id: str,
    domain: str = "default"
) -> dict
Parameters:
query (str): The user's chat query.
user_id (str): User identifier. Used for session management and chat history.
domain (str, default "default"): User domain. Used to separate different conversation contexts (e.g., "education", "travel", "support").
Returns:
dict - Dictionary with the following keys:
response (str): AI response text
user_id (str): User identifier
timestamp (str): ISO format timestamp
status (str): "success" or "error"
response_time (float): Response time in seconds
error (str, optional): Error message if status is "error"
Example:
import asyncio
from langchat import LangChat, LangChatConfig

async def main():
    config = LangChatConfig.from_env()
    langchat = LangChat(config=config)

    # Response is automatically displayed in a Rich panel box
    result = await langchat.chat(
        query="What are the best universities in Europe?",
        user_id="user123",
        domain="education"
    )

    # Response already shown in the panel above;
    # access response data programmatically if needed:
    print(f"Status: {result['status']}")
    print(f"Response Time: {result['response_time']:.2f}s")

asyncio.run(main())
What happens under the hood:
- Gets or creates user session
- Generates standalone question from query and chat history
- Retrieves relevant documents from Pinecone
- Reranks documents using Flashrank
- Generates response using OpenAI with context
- Saves chat history to Supabase
- Updates in-memory session
- Returns response with metadata
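The steps above can be sketched roughly as follows. Every function name here is an illustrative stand-in, not a LangChat internal; the sketch only mirrors the documented order of operations.

```python
import time

# Illustrative stand-ins for the documented pipeline stages
# (none of these names are actual LangChat internals).
def condense(query, history):
    # Fold prior turns into a standalone question
    return query if not history else f"{query} (given {len(history)} prior turns)"

def retrieve(question):
    # Stand-in for Pinecone retrieval
    return ["doc-a", "doc-b", "doc-c"]

def rerank(docs, top_n=2):
    # Stand-in for Flashrank reranking
    return docs[:top_n]

def generate(question, docs):
    # Stand-in for the OpenAI call with retrieved context
    return f"Answer to '{question}' using {len(docs)} docs"

def chat_pipeline(query, session):
    start = time.time()
    question = condense(query, session["history"])
    docs = rerank(retrieve(question))
    response = generate(question, docs)
    session["history"].append((query, response))  # in-memory session update
    return {
        "response": response,
        "status": "success",
        "response_time": time.time() - start,
    }

session = {"history": []}
result = chat_pipeline("What are the best universities?", session)
```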
chat_sync()
Synchronous version of chat() method.
def chat_sync(
    self,
    query: str,
    user_id: str,
    domain: str = "default"
) -> dict
Parameters: Same as the chat() method.
Returns:
dict - Same as chat() method
Example:
from langchat import LangChat, LangChatConfig

config = LangChatConfig.from_env()
langchat = LangChat(config=config)

result = langchat.chat_sync(
    query="Hello!",
    user_id="user123",
    domain="general"
)
print(result["response"])
For production, prefer chat() over chat_sync(): it is non-blocking, so multiple requests can be served concurrently instead of queuing behind each other.
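To illustrate why the async method scales better, the sketch below runs three requests concurrently with asyncio.gather. The fake_chat coroutine is a stand-in for langchat.chat() that models network latency with a sleep; it is not part of the library.

```python
import asyncio
import time

async def fake_chat(query, user_id, domain="default"):
    # Stand-in for langchat.chat(); the 0.1 s sleep models network latency.
    await asyncio.sleep(0.1)
    return {"response": f"reply to {query}", "user_id": user_id, "status": "success"}

async def main():
    start = time.perf_counter()
    # The three calls overlap instead of running back-to-back,
    # so total wall time stays close to one call's latency.
    results = await asyncio.gather(
        fake_chat("Hello!", "user1"),
        fake_chat("Hi!", "user2"),
        fake_chat("Hey!", "user3"),
    )
    elapsed = time.perf_counter() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
```

With chat_sync(), the same three calls would take roughly three times as long, since each blocks until the previous one finishes.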
get_session()
Get or create a user session.
def get_session(
    self,
    user_id: str,
    domain: str = "default"
) -> UserSession
Parameters:
user_id (str): User identifier.
domain (str, default "default"): User domain.
Returns:
UserSession - User session instance with chat history and metadata
Example:
from langchat import LangChat, LangChatConfig

config = LangChatConfig.from_env()
langchat = LangChat(config=config)

# Get or create session
session = langchat.get_session(
    user_id="user123",
    domain="education"
)

# Access chat history
print(session.chat_history)  # List of (query, response) tuples
print(session.domain)        # "education"
print(session.user_id)       # "user123"
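Since chat_history is documented as a list of (query, response) tuples, it is straightforward to render recent turns as a transcript. The helper below is illustrative, not part of LangChat.

```python
def format_history(chat_history, last_n=3):
    """Render the most recent (query, response) tuples as a transcript."""
    lines = []
    for query, response in chat_history[-last_n:]:
        lines.append(f"User: {query}")
        lines.append(f"AI:   {response}")
    return "\n".join(lines)

# Sample history in the documented tuple shape
history = [("Hi", "Hello!"), ("CS programs?", "Here are some options...")]
transcript = format_history(history)
print(transcript)
```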
Properties
config
Access the configuration object.
Example:
langchat = LangChat(config=my_config)
print(langchat.config.openai_model) # "gpt-4o-mini"
print(langchat.config.server_port) # 8000
engine
Access the underlying engine.
Example:
# Access engine directly (advanced)
engine = langchat.engine
print(engine.llm) # OpenAILLMService instance
print(engine.vector_adapter) # PineconeVectorAdapter instance
Usage Examples
Basic Usage
import asyncio
from langchat import LangChat, LangChatConfig

async def main():
    config = LangChatConfig.from_env()
    langchat = LangChat(config=config)

    result = await langchat.chat(
        query="What can you help me with?",
        user_id="user123",
        domain="general"
    )
    print(result["response"])

asyncio.run(main())
With Custom Configuration
from langchat import LangChat, LangChatConfig
config = LangChatConfig(
    openai_api_keys=["sk-key1", "sk-key2"],
    openai_model="gpt-4o",
    pinecone_api_key="pcsk-...",
    pinecone_index_path="my-index",
    supabase_url="https://xxxxx.supabase.co",
    supabase_key="eyJ...",
    retrieval_k=10,
    reranker_top_n=5
)
langchat = LangChat(config=config)
Multi-Domain Sessions
async def handle_multiple_domains():
    langchat = LangChat(config=config)

    # Education domain
    result1 = await langchat.chat(
        query="What universities offer CS?",
        user_id="user123",
        domain="education"
    )

    # Travel domain (separate session)
    result2 = await langchat.chat(
        query="Best travel destinations?",
        user_id="user123",
        domain="travel"
    )
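The isolation behaves as if sessions were keyed by (user_id, domain). The store below is an illustrative model of that behavior, not LangChat's actual internal storage.

```python
# Illustrative session store keyed by (user_id, domain);
# LangChat's internal storage may differ, but isolation behaves like this.
sessions = {}

def get_session_sketch(user_id, domain="default"):
    key = (user_id, domain)
    if key not in sessions:
        sessions[key] = {"user_id": user_id, "domain": domain, "history": []}
    return sessions[key]

edu = get_session_sketch("user123", "education")
travel = get_session_sketch("user123", "travel")

# History written in one domain never leaks into the other
edu["history"].append(("What universities offer CS?", "..."))
```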
Error Handling
async def chat_with_error_handling():
    langchat = LangChat(config=config)

    try:
        result = await langchat.chat(
            query="Hello!",
            user_id="user123",
            domain="general"
        )
        if result["status"] == "success":
            print(f"Response: {result['response']}")
        else:
            print(f"Error: {result.get('error', 'Unknown error')}")
    except Exception as e:
        print(f"Exception: {str(e)}")
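Because the result dict reports transient failures via status == "error", a retry loop with exponential backoff is a natural extension. The sketch below uses a flaky stand-in for langchat.chat() that fails once before succeeding; the retry helper and stub are illustrative, not part of the library.

```python
import asyncio

async def flaky_chat(query):
    # Stand-in for langchat.chat() that fails on the first attempt.
    flaky_chat.calls += 1
    if flaky_chat.calls < 2:
        return {"status": "error", "error": "transient failure"}
    return {"status": "success", "response": f"reply to {query}"}
flaky_chat.calls = 0

async def chat_with_retry(query, attempts=3, backoff=0.05):
    result = {"status": "error", "error": "no attempts made"}
    for attempt in range(attempts):
        result = await flaky_chat(query)
        if result["status"] == "success":
            return result
        # Exponential backoff between attempts: 0.05 s, 0.1 s, 0.2 s, ...
        await asyncio.sleep(backoff * (2 ** attempt))
    return result

retry_result = asyncio.run(chat_with_retry("Hello!"))
```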