Open Source · Python · TypeScript

LangChain

by LangChain Inc. · Updated Jun 15, 2025

The foundational framework for building LLM-powered applications with a massive integration ecosystem. LangChain provides composable abstractions for prompt templates, output parsers, chains, retrievers, and tool integration. While LangGraph is now the recommended path for agents, LangChain remains the backbone for model integrations, tool definitions, and the broader ecosystem.

Architecture Overview

LangChain uses a modular architecture built on a few core abstractions: Models (chat models, LLMs, embeddings), Prompts (templates, few-shot selectors), Output Parsers (JSON, Pydantic, etc.), Retrievers (vector stores, keyword search), and the LangChain Expression Language (LCEL) for composing components into chains. The Runnable protocol provides a unified interface across all components, with built-in streaming, batching, and async support.

When to Use LangChain

  • LLM-powered applications with rich integrations
  • Chain-based workflows combining multiple LLM calls
  • Tool and API integration via a standardized interface
  • RAG applications with diverse retriever backends
  • Prototyping with rapid model and tool swapping

Strengths & Weaknesses

Strengths

  • Largest ecosystem with 700+ integrations
  • Comprehensive model provider support (OpenAI, Anthropic, Google, etc.)
  • Well-documented with extensive tutorials and cookbooks
  • LCEL provides composable, streaming-first chain building
  • Both Python and TypeScript implementations

Weaknesses

  • Abstraction overhead can make debugging difficult
  • Rapid API changes across versions cause migration pain
  • For agent use cases, LangGraph is now the recommended approach
  • Large dependency footprint for the full package

Quick Start

python
# Requires: pip install langchain-openai, and OPENAI_API_KEY set in the environment
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Create a simple chain with LCEL
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant that translates {input_language} to {output_language}."),
    ("human", "{input}"),
])

model = ChatOpenAI(model="gpt-4o")
output_parser = StrOutputParser()

# Compose with LCEL pipe operator
chain = prompt | model | output_parser

result = chain.invoke({
    "input_language": "English",
    "output_language": "French",
    "input": "Hello, how are you?",
})
print(result)

Features at a Glance

Developer: LangChain Inc.
Language: Python, TypeScript
License: MIT
GitHub Stars: 100k+
MCP Support: Yes
Multi-Agent: No

Notable Users

Elastic, Databricks, Morningstar, Rakuten
