🧠 What is PromptTemplate? (22 May, 2025)

PromptTemplate is LangChain's template system for building dynamic prompts for LLM applications.

It lets you define a prompt structure with placeholders, which can be filled with actual values at runtime.


📦 Example

from langchain.prompts import PromptTemplate

template = PromptTemplate(
    input_variables=["name", "age"],
    template="My name is {name} and I am {age} years old."
)

output = template.format(name="Abhi", age=30)
print(output)

🖨️ Output:

My name is Abhi and I am 30 years old.

🔧 Core Parameters

Parameter            Description
template             The prompt string with {variables} as placeholders
input_variables      List of variable names used in the prompt
partial_variables    (Optional) Predefine values that always stay the same (see the sketch below)
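
For instance, partial_variables can pre-fill a value that never changes, so callers only supply the rest. A minimal sketch (the "language" placeholder is illustrative, not from the example above):

from langchain.prompts import PromptTemplate

greeting = PromptTemplate(
    input_variables=["name"],
    template="Say hello to {name} in {language}.",
    partial_variables={"language": "French"},  # fixed once, reused on every call
)

print(greeting.format(name="Abhi"))
# Say hello to Abhi in French.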

🛠️ Why Use PromptTemplate?

  1. ✅ Reusable prompts with dynamic content
  2. ✅ Separates prompt logic from data
  3. ✅ Helps with prompt engineering in LLM chains
  4. ✅ Works with output parsers, agents, and tools


🔍 Real Use Case (With Output Parser)

from langchain.prompts import PromptTemplate
from langchain.output_parsers import PydanticOutputParser
from pydantic import BaseModel

# Define model
class Person(BaseModel):
    name: str
    age: int

# Parser
parser = PydanticOutputParser(pydantic_object=Person)

# Prompt with format instructions
prompt = PromptTemplate(
    input_variables=["text"],
    template="""
    Extract the person's details from the text below.

    Text: {text}

    {format_instructions}
    """,
    partial_variables={"format_instructions": parser.get_format_instructions()}
)

formatted_prompt = prompt.format(text="My name is Abhi and I am 30.")
print(formatted_prompt)
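
To complete the loop, send formatted_prompt to an LLM and hand the raw reply to the parser. A small sketch, using a hand-written string as a stand-in for the model's response (assuming the model followed the format instructions and returned matching JSON):

llm_reply = '{"name": "Abhi", "age": 30}'  # stand-in for a real LLM response
person = parser.parse(llm_reply)           # -> Person(name='Abhi', age=30)
print(person.name, person.age)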

🧩 Advanced Features

🪄 Partial Prompting

# Reuse the name/age template from the first example
partial_prompt = template.partial(name="Abhi")
output = partial_prompt.format(age=30)
print(output)  # My name is Abhi and I am 30 years old.

🛠 Custom Templates with Jinja2 (Advanced)

If you want more control (e.g., loops or conditions), LangChain also supports Jinja2 templates via PromptTemplate's template_format parameter; a minimal sketch follows the imports below. FewShotPromptTemplate, imported alongside it here, is another advanced template class (for few-shot examples).

from langchain.prompts.prompt import PromptTemplate
from langchain.prompts.few_shot import FewShotPromptTemplate
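
A minimal Jinja2 sketch (assumes the jinja2 package is installed; the shopping-list template is illustrative):

from langchain.prompts import PromptTemplate

# template_format="jinja2" switches placeholders from {var} to {{ var }}
# and enables Jinja2 control structures such as loops and conditionals.
jinja_prompt = PromptTemplate(
    input_variables=["items"],
    template="Shopping list:\n{% for item in items %}- {{ item }}\n{% endfor %}",
    template_format="jinja2",
)

print(jinja_prompt.format(items=["milk", "bread", "eggs"]))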

🧪 PromptTemplate + Chain Example

from langchain.chat_models import ChatOpenAI
from langchain.chains import LLMChain

llm = ChatOpenAI()  # requires an OpenAI API key (OPENAI_API_KEY environment variable)

prompt = PromptTemplate(
    input_variables=["product"],
    template="Write an advertisement for the following product: {product}"
)

chain = LLMChain(llm=llm, prompt=prompt)

response = chain.run("eco-friendly toothbrush")
print(response)
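
In recent LangChain releases, LLMChain is deprecated in favour of composing the prompt and model directly with the | operator. A rough, version-dependent equivalent, reusing the same prompt and llm objects, looks like this:

chain = prompt | llm                       # prompt -> model pipeline
response = chain.invoke({"product": "eco-friendly toothbrush"})
print(response.content)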

📌 Summary

Feature              Benefit
Dynamic Prompting    Fill in variables at runtime
Input Validation     Ensures required variables are supplied (see the example below)
Reusability          One template, many use cases
Integration          Works with LangChain chains, agents, tools, and output parsers
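
As a quick illustration of the "Input Validation" row: formatting a template without one of its required variables raises an error instead of silently producing a broken prompt. A small sketch reusing the name/age template from the first example (the exact exception type may vary by version; KeyError is what the default f-string formatter raises):

try:
    template.format(name="Abhi")   # "age" is required but missing
except KeyError as exc:
    print(f"Missing variable: {exc}")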