Overview

LangChatEngine is the internal engine that powers LangChat. In most cases you don’t interact with it directly — use the LangChat SDK class instead. Access it via lc.engine when you need low-level control.
```python
from langchat.core.engine import LangChatEngine
```

Constructor

```python
LangChatEngine(
    *,
    llm,
    vector_db,
    db,
    reranker=None,
    prompt_template=None,
    standalone_question_prompt=None,
    verbose=False,
    max_chat_history=20,
)
```
Takes the same parameters as LangChat. All are keyword-only.

Methods

async chat()

Processes a chat query. Returns a raw dict (not a ChatResponse).
```python
async def chat(
    self,
    query: str,
    user_id: str,
    platform: str = "default",
    standalone_question: str | None = None,
) -> dict
```
**Parameters**

`standalone_question` (`str | None`, default: `None`)
If provided, skips the standalone question generation step and uses this value directly. Useful for pre-processing.
Returns:

```python
{
    "response": str,           # AI response text
    "user_id": str,            # echoed user_id
    "timestamp": str,          # ISO 8601 UTC
    "status": "success"|"error",
    "response_time": float,    # seconds
    "error": str | None,       # error message on failure
}
```
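A minimal sketch of consuming this return shape, branching on `status` before reading `response`. The `result` dict below is hand-built to match the documented schema, not real engine output:

```python
# Example return value, shaped per the docs above (values are illustrative).
result = {
    "response": "Hello!",
    "user_id": "alice",
    "timestamp": "2024-01-01T00:00:00Z",
    "status": "success",
    "response_time": 0.42,
    "error": None,
}

# Check status before trusting "response"; on error, "error" carries the message.
if result["status"] == "success":
    text = result["response"]
else:
    raise RuntimeError(result["error"])
```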

get_session()

Gets or creates a UserSession.
```python
def get_session(
    self,
    user_id: str,
    platform: str = "default",
) -> UserSession
```
Sessions are cached in the `self.sessions` dict, keyed by `"{user_id}_{platform}"`.
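The get-or-create caching pattern described above can be sketched in isolation. `UserSession` here is a stand-in dataclass (an assumption for illustration), not the real langchat class:

```python
from dataclasses import dataclass, field

@dataclass
class UserSession:
    # Stand-in for langchat's UserSession, kept minimal for the sketch.
    user_id: str
    platform: str
    history: list = field(default_factory=list)

class SessionCache:
    def __init__(self):
        self.sessions: dict[str, UserSession] = {}

    def get_session(self, user_id: str, platform: str = "default") -> UserSession:
        # Cache key format from the docs: "{user_id}_{platform}".
        key = f"{user_id}_{platform}"
        if key not in self.sessions:
            self.sessions[key] = UserSession(user_id, platform)
        return self.sessions[key]

cache = SessionCache()
a = cache.get_session("alice")   # creates and caches "alice_default"
b = cache.get_session("alice")   # cache hit: returns the same object
```

Because the cache returns the same object on repeated calls, mutations to a session (e.g. appending history) are visible to later callers with the same key.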

Attributes

| Attribute | Type | Description |
| --- | --- | --- |
| `llm` | `Any` | LLM provider instance |
| `vector_adapter` | `Any` | Vector DB provider instance |
| `history_store` | `Any` | Database provider instance |
| `reranker_adapter` | `Any` | Reranker instance |
| `sessions` | `dict[str, UserSession]` | In-memory session cache |
| `id_manager` | `IDManager` | Database ID sequence manager |
| `prompt_template` | `str` | System prompt template |
| `verbose` | `bool` | Verbose logging flag |
| `max_chat_history` | `int` | History window size |

Advanced: skip standalone question generation

If you’re building a pipeline that pre-processes questions, you can skip the standalone question step:
```python
# Pre-process the question yourself
standalone = f"Question about Acme Corp: {user_query}"

result = await lc.engine.chat(
    query=user_query,
    user_id="alice",
    standalone_question=standalone,
)
```

API server mode

When running as an API server, the engine suppresses Rich console output (the formatted response panel):
```python
from langchat.core.engine import set_api_server_mode

set_api_server_mode(True)   # called automatically by create_app()
```