LangChat uses two prompt templates internally:
  1. System prompt — defines the chatbot’s persona, task, and how to use context
  2. Standalone question prompt — reformulates follow-up questions into self-contained queries before searching
Both ship with sensible defaults; you can override either or both.

System prompt

The system prompt is built once per message. It contains three required placeholders:
Placeholder       Filled with
{context}         Top retrieved document chunks (after reranking)
{chat_history}    Recent conversation exchanges
{question}        The user's current message
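
To make the substitution concrete, here is a minimal sketch of how the three placeholders get filled. This is plain str.format on a template string, not LangChat internals; the sample values are invented for illustration:

```python
# Sketch of how the three placeholders are filled -- plain str.format on a
# template string, not LangChat internals.
template = (
    "Use the following context to answer:\n{context}\n\n"
    "Previous conversation:\n{chat_history}\n\n"
    "User question: {question}"
)

prompt = template.format(
    context="Refunds are accepted within 30 days of purchase.",
    chat_history="User: Hi\nAssistant: Hello! How can I help?",
    question="What is the refund policy?",
)
print(prompt)
```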

Default template

You are a helpful AI assistant. Answer questions clearly and accurately.

Use the following context to answer:
{context}

Previous conversation:
{chat_history}

User question: {question}

Your response:

Custom template

Pass prompt_template when creating LangChat:
from langchat import LangChat
from langchat.providers import OpenAI, Pinecone, Supabase

MY_PROMPT = """You are Aria, a friendly support agent for Acme Corp.
Always be concise and professional. If you don't know the answer, say so clearly
and suggest contacting support@acme.com.

Relevant knowledge base articles:
{context}

Previous messages:
{chat_history}

Customer: {question}

Aria:"""

lc = LangChat(
    llm=OpenAI("gpt-4o-mini"),
    vector_db=Pinecone("my-index"),
    db=Supabase(),
    prompt_template=MY_PROMPT,
)
Your template must contain all three placeholders — {context}, {chat_history}, and {question} — or the chain will raise an error.
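
If you want to catch a missing placeholder before the chain does, a small check with the standard library's string.Formatter works. This helper is a hypothetical sketch, not part of the LangChat API:

```python
from string import Formatter

# The three placeholders LangChat's system prompt requires.
REQUIRED = {"context", "chat_history", "question"}

def missing_placeholders(template: str) -> set:
    """Return the required placeholders a template forgot to include."""
    found = {name for _, name, _, _ in Formatter().parse(template) if name}
    return REQUIRED - found

# A template missing {chat_history} would fail inside the chain:
bad = "Context: {context}\nQ: {question}"
print(missing_placeholders(bad))  # {'chat_history'}
```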

Domain-specific examples

LEGAL_PROMPT = """You are a legal research assistant. Provide accurate information
based on the provided documents. Always note that this is not legal advice and
users should consult a qualified attorney for legal decisions.

Relevant legal documents:
{context}

Prior conversation:
{chat_history}

Question: {question}

Legal Assistant:"""

Standalone question prompt

Before searching Pinecone, LangChat reformulates the user’s message to be self-contained. This handles follow-up questions like:
  • “What about the pricing?” → “What is Acme Corp’s pricing?”
  • “Can you explain that further?” → “Can you explain Acme Corp’s refund policy further?”

Default behavior

The default prompt:
  1. Uses the LLM to rephrase the question
  2. Keeps greetings unchanged (hi, hello, hey)
  3. Always generates questions in English
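
The greeting pass-through in step 2 can be sketched as a simple membership check. This is an illustrative approximation of the behavior described above, not LangChat's actual implementation:

```python
# Greetings skip reformulation and are passed through unchanged.
GREETINGS = {"hi", "hello", "hey"}

def needs_reformulation(message: str) -> bool:
    """Return True if the message should be sent to the LLM for rephrasing."""
    return message.strip().lower().strip("!.?") not in GREETINGS

print(needs_reformulation("Hello!"))                  # False -- kept as-is
print(needs_reformulation("What about the pricing?")) # True  -- rephrased
```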

Custom standalone question prompt

STANDALONE_PROMPT = """Given the following conversation and a follow-up question,
rephrase the follow-up as a standalone question that includes all necessary context.
If the follow-up is a greeting, return it unchanged.

Conversation:
{chat_history}

Follow-up: {question}

Standalone question:"""

lc = LangChat(
    llm=OpenAI("gpt-4o-mini"),
    vector_db=Pinecone("my-index"),
    db=Supabase(),
    standalone_question_prompt=STANDALONE_PROMPT,
)
The standalone question prompt uses two placeholders: {chat_history} and {question}.
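
Conceptually, the reformulation step fills this prompt and sends it to the LLM, and the result becomes the vector-search query. The sketch below uses a stand-in llm callable and hypothetical helper names; LangChat runs the equivalent internally:

```python
# Conceptual sketch of the reformulation step (hypothetical helper names).
STANDALONE_PROMPT = (
    "Given the following conversation and a follow-up question,\n"
    "rephrase the follow-up as a standalone question.\n\n"
    "Conversation:\n{chat_history}\n\nFollow-up: {question}\n\nStandalone question:"
)

def reformulate(question: str, chat_history: str, llm) -> str:
    """Turn a follow-up into a self-contained query for vector search."""
    return llm(STANDALONE_PROMPT.format(chat_history=chat_history, question=question))

# Stub LLM for illustration -- a real provider would call the model here.
stub_llm = lambda prompt: "What is Acme Corp's pricing?"
query = reformulate("What about the pricing?", "User: Tell me about Acme.", stub_llm)
```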

Prompt tips

  • Be specific about tone. “Be concise and professional” produces very different results than the default.
  • Tell the model what to do when it doesn’t know. If you don’t specify, it may hallucinate. Add: “If the answer is not in the context, say you don’t know.”
  • Set the output format. If you need structured output: “Always respond with bullet points.” or “Answer in 2-3 sentences maximum.”
  • Keep {context} early. Models attend more strongly to content near the start of the prompt.
  • Test with verbose=True to see exactly what prompt is being sent:
lc = LangChat(
    ...,
    verbose=True,  # logs the full prompt on every call
)