LangChain v0.3.26 (June 20, 2025) Integration Commands (23 Jul, 2025)

🚀 1. Installation & Integration Packages

Install core and provider-specific packages:

pip install langchain                   # Core package
pip install langchain-openai            # OpenAI chat models
pip install langchain-google-vertexai   # Google Vertex AI chat
pip install langchain-anthropic         # Anthropic chat
# ...other providers (aws, cohere, groq, huggingface, etc.)

🧠 2. Chat Model Initialization

from langchain_openai import ChatOpenAI
llm = ChatOpenAI(streaming=True)  # OpenAI chat model with token streaming

from langchain.chat_models import init_chat_model
model = init_chat_model("gemini-2.0-flash", model_provider="google_genai")  # Google Gemini

🔧 3. Core Chain Construction (LCEL syntax)

Use the LangChain Expression Language (LCEL) for clean chain building:

from langchain_core.prompts import PromptTemplate
prompt = PromptTemplate.from_template("Translate {input} to {output_language}:")
chain = prompt | llm
response = chain.invoke({"input": "Hello", "output_language": "Spanish"})

(Note: legacy chain constructors such as LLMChain are deprecated in favor of | composition.)
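The `|` operator works because LangChain runnables implement `__or__` to build a composed pipeline. A minimal pure-Python sketch of the idea (illustration only, not the actual LangChain classes):

```python
class Runnable:
    """Toy stand-in for a LangChain runnable (illustration only)."""
    def __init__(self, func):
        self.func = func

    def invoke(self, value):
        return self.func(value)

    def __or__(self, other):
        # prompt | llm builds a new runnable that chains the two calls
        return Runnable(lambda value: other.invoke(self.invoke(value)))

# "prompt": fills a template from the input dict
prompt = Runnable(lambda d: f"Translate {d['input']} to {d['output_language']}:")
# "llm": a fake model that just echoes the prompt it received
llm = Runnable(lambda text: f"[model saw] {text}")

chain = prompt | llm
print(chain.invoke({"input": "Hello", "output_language": "Spanish"}))
# → [model saw] Translate Hello to Spanish:
```

The real classes add streaming, batching, and async variants on top of this same composition contract.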


🧰 4. Tools / Toolkits Integration

Built-in tools for models to call:

  • Search tools: BingSearch, DuckDuckGoSearch, GoogleSearch, SerpAPI, etc.

  • Code interpreters: Azure Container Apps, Bearly Code Interpreter, etc.

  • Productivity & automation: e.g. Twilio, IFTTT WebHooks, Zapier, etc.

Instantiate a tool:

from langchain_community.tools import DuckDuckGoSearchRun  # requires langchain-community
tool = DuckDuckGoSearchRun()
result = tool.run("Latest news on LangChain")
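Under the hood, a tool is essentially a named, described callable that a model (or agent loop) selects by name and invokes with a string or structured arguments. A toy sketch of that contract (assumed structure, not LangChain's actual classes):

```python
class SimpleTool:
    """Toy tool: a name, a description, and a run() method (illustration only)."""
    def __init__(self, name, description, func):
        self.name = name
        self.description = description  # shown to the model so it can pick a tool
        self.func = func

    def run(self, query: str) -> str:
        return self.func(query)

# A fake "search" tool; a real one would call an external API
search = SimpleTool(
    name="web_search",
    description="Search the web for a query",
    func=lambda q: f"results for: {q}",
)

# An agent picks a tool by name from a registry and calls run()
registry = {t.name: t for t in [search]}
print(registry["web_search"].run("Latest news on LangChain"))
# → results for: Latest news on LangChain
```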

📦 5. Custom Integration Package Setup

Bootstrap a new provider package:

pip install langchain-cli poetry
langchain-cli integration new --name parrot-link
cd parrot-link
poetry add langchain-core
# implement integration modules (chat_models.py, vectorstores.py)

Use the langchain-cli template for a consistent package structure and test scaffolding.


🕸️ 6. LangGraph for Agents & Tool-Based Workflows

Define tools that return commands to update agent state:

from typing import Annotated

from langchain_core.messages import ToolMessage
from langchain_core.tools import tool, InjectedToolCallId
from langgraph.types import Command

@tool
def update_user_name(
    new_name: str,
    tool_call_id: Annotated[str, InjectedToolCallId],
) -> Command:
    """Update the user's name in the agent state."""
    return Command(update={
        "user_name": new_name,
        "messages": [ToolMessage(f"Updated name to {new_name}", tool_call_id=tool_call_id)],
    })

Use with create_react_agent or custom graph execution
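Conceptually, `Command(update=...)` returns a state patch that the graph merges into its shared state, with `messages` appended rather than replaced. A pure-Python sketch of that merge (assumed semantics, not the LangGraph implementation):

```python
def apply_update(state: dict, update: dict) -> dict:
    """Merge a tool's state patch: 'messages' is append-only, other keys overwrite."""
    new_state = dict(state)
    for key, value in update.items():
        if key == "messages":
            new_state["messages"] = state.get("messages", []) + value
        else:
            new_state[key] = value
    return new_state

state = {"user_name": "?", "messages": []}
# The patch a tool like update_user_name would return via Command(update=...)
patch = {"user_name": "Ada", "messages": ["Updated name to Ada"]}
state = apply_update(state, patch)
print(state)
# → {'user_name': 'Ada', 'messages': ['Updated name to Ada']}
```

In LangGraph the append behavior comes from the reducer attached to the `messages` channel in the state schema; other keys default to overwrite, as sketched here.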


🧠 7. Memory & State Management with LangGraph

from langgraph.store.memory import InMemoryStore
store = InMemoryStore()
store.put(("users",), "user123", {"name": "John", "language": "English"})

Retrieve state in tools:

from langchain_core.runnables import RunnableConfig
from langgraph.config import get_store

@tool
def get_user_info(config: RunnableConfig) -> str:
    """Look up the current user's profile from the store."""
    store = get_store()
    data = store.get(("users",), config["configurable"]["user_id"])
    return str(data.value) if data else "Unknown"

This allows conversation state and context to persist across runs.
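`InMemoryStore` is essentially a dict keyed by a `(namespace, key)` pair, where lookups return an item object carrying a `.value` payload. A stdlib sketch of the same put/get contract (assumed behavior, not the real class):

```python
from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class Item:
    value: Any  # stored payload, mirroring the .value attribute used above

class TinyStore:
    """Minimal namespaced key-value store (illustration only)."""
    def __init__(self):
        self._data: dict[tuple, dict[str, Item]] = {}

    def put(self, namespace: tuple, key: str, value: Any) -> None:
        self._data.setdefault(namespace, {})[key] = Item(value)

    def get(self, namespace: tuple, key: str) -> Optional[Item]:
        # Returns None when the namespace or key is absent
        return self._data.get(namespace, {}).get(key)

store = TinyStore()
store.put(("users",), "user123", {"name": "John", "language": "English"})
item = store.get(("users",), "user123")
print(item.value["name"] if item else "Unknown")
# → John
```

The tuple namespace lets one store hold separate scopes (per user, per thread) without key collisions; the real store adds persistence backends and search on top.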


📚 8. Documentation & How‑To Guides

Check these essential resources for hands-on examples:

  • How-to guides: chaining runnables, streaming, running in parallel, inspecting chains, adding memory, fallbacks

  • Integration authoring: detailed guide for creating custom components


✅ Summary Table

Category              Key Commands/Patterns
Install               pip install langchain + provider packages
Chat Setup            ChatOpenAI(...), init_chat_model(...)
Chain Composition     prompt | llm (LCEL)
Tools                 Search, code execution, productivity tools
Custom Integrations   langchain-cli integration new
Agent Workflow        @tool functions, create_react_agent
Memory                InMemoryStore, get_store()
Docs                  LangChain official how-to / contribution guides