What is LangChat?

LangChat is a powerful, production-ready conversational AI library designed to simplify building intelligent chatbots with vector search capabilities. Instead of juggling multiple libraries, API integrations, vector databases, and chat history management, LangChat provides a unified, modular architecture that handles all these concerns out of the box.

Why LangChat?

Building production-ready conversational AI systems is complex. You need to:
  • Integrate LLM APIs (OpenAI, Anthropic, etc.)
  • Manage Vector Databases (Pinecone, Weaviate, etc.)
  • Handle Chat History (conversation context and memory)
  • Implement Reranking (improve search result relevance)
  • Track Metrics (response times, errors, feedback)
  • Rotate API Keys (handle rate limits and failures)
LangChat simplifies all of this by providing:
  • πŸš€ Fast Development: Get a production-ready chatbot running in minutes
  • πŸ”§ Modular Architecture: Use components independently or together
  • πŸ”„ Automatic API Key Rotation: Built-in fault tolerance for OpenAI API
  • πŸ“Š Production Ready: Includes chat history, metrics, and feedback tracking
  • 🎨 Highly Customizable: Easy to extend with custom prompts and adapters
  • 🐳 Docker Ready: Auto-generated Dockerfile for easy deployment

Key Features

πŸ€– LLM Integration

  • OpenAI: Native OpenAI API support with automatic API key rotation
  • Fault Tolerant: Automatic retry logic across multiple API keys
  • Production Ready: Handles rate limits and errors gracefully

🔍 Vector Search

  • Pinecone Integration: Seamless vector database integration
  • Reranking: Flashrank reranker for improved search results
  • Configurable Retrieval: Adjustable document retrieval and reranking settings
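
To make the key-rotation idea concrete, here is a minimal, generic sketch of round-robin rotation with retry. The `KeyRotator` class and `fake_request` function are illustrative stand-ins written for this page, not LangChat's actual implementation or API:

```python
import itertools

class KeyRotator:
    """Round-robin over a pool of API keys, skipping keys that fail."""
    def __init__(self, keys):
        self._cycle = itertools.cycle(keys)
        self.n = len(keys)

    def call(self, fn):
        # Try each key at most once per request; re-raise if all fail.
        last_err = None
        for _ in range(self.n):
            key = next(self._cycle)
            try:
                return fn(key)
            except RuntimeError as err:  # stand-in for a rate-limit error
                last_err = err
        raise last_err

# Simulate: the first key is rate-limited, the second succeeds.
def fake_request(key):
    if key == "key-A":
        raise RuntimeError("429 rate limit")
    return f"ok via {key}"

rotator = KeyRotator(["key-A", "key-B"])
print(rotator.call(fake_request))  # -> ok via key-B
```

The same pattern generalizes to any per-key failure: a rate-limited or invalid key is skipped and the next key in the pool is tried transparently.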

πŸ’Ύ Database Management

  • Supabase: Built-in Supabase integration
  • ID Management: Automatic ID generation with conflict resolution
  • Session Management: User-specific chat history and memory
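
The session-management idea can be sketched as a per-user message buffer trimmed to a fixed context window. This `SessionStore` is a toy in-memory illustration only; LangChat itself persists history in Supabase, and none of these names come from its API:

```python
from collections import defaultdict, deque

class SessionStore:
    """Toy per-user chat memory with a bounded context window."""
    def __init__(self, max_turns=5):
        self.max_turns = max_turns
        self._history = defaultdict(deque)

    def add(self, user_id, role, text):
        h = self._history[user_id]
        h.append((role, text))
        # Keep only the most recent turns (user + assistant pairs).
        while len(h) > 2 * self.max_turns:
            h.popleft()

    def context(self, user_id):
        # Conversation context to prepend to the next LLM request.
        return list(self._history[user_id])

store = SessionStore(max_turns=2)
store.add("user123", "user", "Hi")
store.add("user123", "assistant", "Hello!")
store.add("user123", "user", "Best universities in Europe?")
print(store.context("user123"))
```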

🎨 Customization

  • Custom Prompts: Configure both system prompts and standalone question prompts
  • Flexible Configuration: Environment variables or direct configuration
  • Modular Architecture: Use components independently or together

πŸš€ Developer Experience

  • Auto-Generated Interface: Chat interface HTML auto-created on startup
  • Auto-Generated Dockerfile: Dockerfile auto-created with correct port
  • Easy Setup: Simple configuration and initialization

Quick Example

Here’s how easy it is to get started with LangChat:
import asyncio
from langchat import LangChat, LangChatConfig

async def main():
    # Create configuration
    config = LangChatConfig(
        openai_api_keys=["your-openai-api-key"],
        pinecone_api_key="your-pinecone-api-key",
        pinecone_index_name="your-index-name",
        supabase_url="your-supabase-url",
        supabase_key="your-supabase-key"
    )
    
    # Initialize LangChat
    langchat = LangChat(config=config)
    
    # Chat with the AI and use the result
    result = await langchat.chat(
        query="What are the best universities in Europe?",
        user_id="user123",
        domain="education"
    )
    print(result)

asyncio.run(main())

Architecture Overview

LangChat follows a modular architecture where each component is designed to work independently or together:
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                LangChat Engine                                           β”‚
β”‚     (Main orchestrator for all components)                               β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                      β”‚
                      β”œβ”€β”€β”€ LLM Service (OpenAI)
                      β”‚       └─── Automatic key rotation
                      β”‚
                      β”œβ”€β”€β”€ Vector Database (Pinecone)
                      β”‚       └─── Document retrieval & embeddings
                      β”‚
                      β”œβ”€β”€β”€ Reranker (Flashrank)
                      β”‚       └─── Result relevance improvement
                      β”‚
                      β”œβ”€β”€β”€ Database (Supabase)
                      β”‚       β”œβ”€β”€β”€ Chat history storage
                      β”‚       β”œβ”€β”€β”€ Metrics tracking
                      β”‚       └─── Feedback collection
                      β”‚
                      └─── Session Management
                              └─── User context & memory
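
The retrieve → rerank → generate flow in the diagram can be sketched end to end with toy components. Everything below (`embed`, `vector_search`, `rerank`, `answer`, the word-overlap scoring) is a hypothetical stand-in for illustration, not LangChat's, Pinecone's, or Flashrank's real API:

```python
def embed(text):
    # Toy "embedding": bag of lowercase words (stand-in for real embeddings).
    return set(text.lower().replace(".", "").split())

def vector_search(query, docs, k=3):
    # Rank documents by word overlap with the query (stand-in for Pinecone).
    q = embed(query)
    scored = sorted(docs, key=lambda d: len(q & embed(d)), reverse=True)
    return scored[:k]

def rerank(query, docs, top_n=2):
    # Stand-in for Flashrank: refine the order of the retrieved candidates.
    q = embed(query)
    return sorted(docs,
                  key=lambda d: len(q & embed(d)) / (len(embed(d)) or 1),
                  reverse=True)[:top_n]

def answer(query, docs):
    retrieved = vector_search(query, docs)  # 1. retrieve candidates
    context = rerank(query, retrieved)      # 2. rerank for relevance
    return f"Answer based on: {context}"    # 3. generate (LLM call elided)

docs = [
    "ETH Zurich is a top university in Europe.",
    "The weather in Paris is mild.",
    "Oxford and Cambridge are leading universities in Europe.",
]
print(answer("best universities in Europe", docs))
```

The point of the two-stage design is that the first stage is cheap and broad while the reranker is more precise on a small candidate set; here the off-topic weather document survives retrieval but is dropped by the rerank step.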

Use Cases

LangChat is perfect for:
  • πŸ“š Education Chatbots: Help students find universities and programs
  • ✈️ Travel Assistants: Provide travel recommendations and booking guidance
  • πŸ›’ Customer Support: Answer product questions with RAG (Retrieval-Augmented Generation)
  • πŸ’Ό Business Assistants: Internal knowledge base queries
  • πŸŽ“ Learning Platforms: Answer questions about course materials
  • πŸ₯ Healthcare: Provide medical information with context

What Makes LangChat Different?

Feature           | LangChat      | Other Libraries
------------------|---------------|----------------
Setup Time        | Minutes       | Days/Weeks
API Key Rotation  | ✅ Built-in   | ❌ Manual
Chat History      | ✅ Automatic  | ⚠️ Manual
Vector Search     | ✅ Integrated | ⚠️ Separate
Reranking         | ✅ Built-in   | ❌ Manual
Metrics Tracking  | ✅ Automatic  | ❌ Manual
Production Ready  | ✅ Yes        | ⚠️ Depends

Next Steps

Ready to get started? Check out our guides:
  1. Getting Started - Set up LangChat in minutes
  2. Installation Guide - Install and configure LangChat
  3. Configuration - Learn about all configuration options
  4. Examples - See LangChat in action

Community & Support


Built with ❀️ by NeuroBrain