
# Class: ChatResponse

A dataclass returned by every `chat()` and `chat_sync()` call. It provides typed access to the response text and its metadata.

```python
from langchat import ChatResponse
```

## Fields

| Field | Type | Description |
|---|---|---|
| `text` | `str` | The AI's response text: what you display to the user. |
| `status` | `"success" \| "error"` | Whether the request succeeded. `"success"` means the LLM returned a valid response; `"error"` means something went wrong (LLM failure, timeout, etc.). |
| `user_id` | `str` | The `user_id` you passed to `chat()`, echoed back for convenience. |
| `platform` | `str` | The `platform` you passed to `chat()`. Defaults to `"default"`. |
| `response_time` | `float` | End-to-end latency in seconds, from when `chat()` was called to when the response was ready. |
| `timestamp` | `str` | UTC timestamp of the response in ISO 8601 format (e.g. `"2025-01-15T10:30:00.000Z"`). |
| `error` | `str \| None` | Error message when `status == "error"`; `None` on success. |
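To make the shape concrete, here is a minimal stand-in that mirrors the documented fields and special methods. This is an illustration only, not the real `langchat` class; the field defaults shown are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChatResponseSketch:
    """Illustrative mirror of langchat's documented ChatResponse fields."""
    text: str
    status: str                      # "success" | "error"
    user_id: str
    platform: str = "default"
    response_time: float = 0.0
    timestamp: str = ""
    error: Optional[str] = None

    def __bool__(self) -> bool:
        # Mirrors the documented truthiness: True only on success
        return self.status == "success"

    def __str__(self) -> str:
        # Mirrors the documented str(): returns the response text
        return self.text

ok = ChatResponseSketch(text="Hi!", status="success", user_id="alice")
failed = ChatResponseSketch(text="", status="error", user_id="alice",
                            error="LLM timeout")
print(bool(ok), bool(failed))  # True False
print(ok)                      # Hi!
```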

## Special methods

### `__bool__`

`if response:` returns `True` when `status == "success"`, so a `ChatResponse` can be used directly in truth tests:

```python
response = await lc.chat(query="Hello", user_id="alice")

if response:
    print("Success!")
else:
    print("Failed.")
```

### `__str__`

`print(response)` and `str(response)` return `response.text`:

```python
response = await lc.chat(query="Hello", user_id="alice")
print(response)  # prints the AI response text
```

## Usage patterns

### Basic

```python
response = await lc.chat(query="What is RAG?", user_id="alice")
print(response)
```

### With error handling

```python
response = await lc.chat(query="Explain quantum computing", user_id="alice")

if response:
    print(response.text)
    print(f"Answered in {response.response_time:.2f}s")
else:
    # Log the error and show a user-friendly message
    logger.error(f"Chat failed: {response.error}")
    print("Sorry, something went wrong. Please try again.")
```

### In an API response

```python
from fastapi import FastAPI
from langchat import LangChat, ChatResponse

app = FastAPI()
lc = LangChat()  # construct and configure per the LangChat class docs

@app.post("/ask")
async def ask(query: str, user_id: str):
    response: ChatResponse = await lc.chat(query=query, user_id=user_id)
    return {
        "answer": response.text,
        "ok": bool(response),
        "latency": response.response_time,
        "timestamp": response.timestamp,
    }
```

### In a chat loop

```python
async def chat_loop(user_id: str):
    while True:
        query = input("You: ")
        if not query.strip():
            continue

        response = await lc.chat(query=query, user_id=user_id)

        if response:
            print(f"Bot: {response.text}")
        else:
            print(f"Error: {response.error}")
```
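### Parsing the timestamp

The `timestamp` field is a plain string; if you need it as a `datetime` (for logging, sorting, or storage), it can be parsed with the standard library. A sketch using a sample value in the documented format:

```python
from datetime import datetime, timezone

ts = "2025-01-15T10:30:00.000Z"  # sample value in the documented ISO 8601 format

# datetime.fromisoformat() accepts the trailing "Z" only on Python 3.11+,
# so replace it with an explicit UTC offset for portability.
dt = datetime.fromisoformat(ts.replace("Z", "+00:00"))

print(dt.year, dt.tzinfo == timezone.utc)  # 2025 True
```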