With day-zero support for Azure AI Foundry and associated Azure services
Today LangChain v1 officially launches, marking a new era for the popular AI agent library. The new version ushers in a more opinionated, streamlined, and extensible foundation for building agentic LLM applications. In this post we'll break down what's new, what changed, and what "general availability" means in practice.
Join Microsoft Developer Advocates Marlene Mhangami and Yohan Lasorsa to see live demos of the new API and find out more about what JavaScript and Python developers need to know about v1. Register for this event here.
Why v1? The Motivation Behind the Redesign
The number of abstractions in LangChain had grown over the years to include chains, agents, tools, wrappers, prompt helpers and more, which, while powerful, introduced complexity and fragmentation. As model APIs evolve (multimodal inputs, richer structured output, tool-calling semantics), LangChain needed a cleaner, more consistent core to ensure production-ready stability.
In v1:
- All existing chains and agent abstractions in the old LangChain are deprecated; they are replaced by a single high-level agent abstraction built on LangGraph internals.
- LangGraph becomes the foundational runtime for durable, stateful, orchestrated execution. LangChain now emphasizes being the “fast path to agents” that doesn’t hide but builds upon LangGraph.
- The internal message format has been upgraded to support standard content blocks (e.g. text, reasoning, citations, tool calls) across model providers, decoupling “content” from raw strings.
- Namespace cleanup: the langchain package now focuses tightly on core abstractions (agents, models, messages, tools), while legacy patterns are moved into langchain-classic (or equivalents).
What’s New & Noteworthy for Developers
Here are key changes developers should pay attention to:
1. create_agent becomes the default API
The create_agent function is now the idiomatic way to spin up agents in v1. It replaces older constructs (e.g. create_react_agent) with a clearer, more modular API that is middleware-centric. You can now compose middleware around model calls, tool calls, before/after hooks, error handling, etc.
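To make this concrete, here is a minimal sketch of the new API under the patterns described above: a provider-prefixed model string, a tool defined with the @tool decorator, and a system prompt. Treat the parameter names as illustrative and check the v1 reference for the exact signature.

```python
# Minimal sketch of the v1 agent API described above.
# Assumes `create_agent` accepts a model identifier, a list of tools,
# and a system prompt -- verify against the v1 docs.
from langchain.agents import create_agent
from langchain_core.tools import tool


@tool
def get_weather(city: str) -> str:
    """Return a (fake) weather report for a city."""
    return f"It is sunny in {city}."


agent = create_agent(
    model="openai:gpt-4o-mini",   # provider:model string or a chat model instance
    tools=[get_weather],
    system_prompt="You are a concise weather assistant.",
)

# Agents built on LangGraph take and return a message-based state.
result = agent.invoke({"messages": [{"role": "user", "content": "Weather in Oslo?"}]})
print(result["messages"][-1].content)
```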
2. Standard content blocks & normalized message model
Responses from models are no longer opaque strings. Instead, they carry structured content_blocks which classify parts of the output (e.g. “text”, “reasoning”, “citation”, “tool_call”). If needed for backward compatibility or client serialization, you can opt in to serializing those blocks back into the .content field by setting output_version="v1".
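As a rough illustration, inspecting those blocks might look like the sketch below, assuming init_chat_model and the block shapes described above; the exact block fields can vary by provider.

```python
# Sketch of reading structured output via the v1 `content_blocks`
# property on a response message (field names may vary by provider).
from langchain.chat_models import init_chat_model

model = init_chat_model("openai:gpt-4o-mini")
response = model.invoke("Explain what a content block is in one sentence.")

# Each block is a typed dict, e.g. {"type": "text", "text": "..."} or
# {"type": "reasoning", ...}; filter by type instead of parsing raw strings.
for block in response.content_blocks:
    if block["type"] == "text":
        print(block["text"])

# Per the docs above, passing output_version="v1" to the model opts in to
# serializing these blocks back into the .content field.
```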
3. Multimodal and richer model inputs / outputs
LangChain now supports more than just text-based interactions. Models can accept and return files, images, video, etc., and the message format reflects this flexibility. This upgrade prepares us well for the next generation of models with mixed modalities (vision, audio, etc.).
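As a hedged sketch of what a multimodal request can look like with the standardized message format, the snippet below sends text plus an image reference; the exact image-block fields (URL vs. base64 plus MIME type) may differ by provider and version, so treat it as illustrative rather than definitive.

```python
# Illustrative multimodal request using the standardized content blocks.
from langchain.chat_models import init_chat_model

model = init_chat_model("openai:gpt-4o-mini")

message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "Describe this image in one sentence."},
        # Assumed image-block shape; some providers may expect base64 data
        # and a mime_type instead of a URL.
        {"type": "image", "url": "https://example.com/diagram.png"},
    ],
}

response = model.invoke([message])
print(response.content)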
4. Middleware hooks, runtime context, and finer control
Because create_agent is designed as a pluggable pipeline, developers can now inject logic before and after model calls and tool calls, add error recovery and fallback strategies, transform requests, and more. New middleware such as retry, fallback, call limits, and context editing has been added.
The notion of a runtime and context object accompanies each agent execution, making it easier to carry state or metadata through the pipeline.
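As an illustration of the middleware model, the sketch below assumes a base class named AgentMiddleware with before_model/after_model hooks that receive the agent state and runtime; verify the exact class and hook names against the v1 middleware documentation.

```python
# Sketch of a custom middleware, assuming `AgentMiddleware` with
# `before_model` / `after_model` hooks -- check the v1 docs for exact names.
from langchain.agents import create_agent
from langchain.agents.middleware import AgentMiddleware


class LoggingMiddleware(AgentMiddleware):
    """Log what goes into and comes out of each model call."""

    def before_model(self, state, runtime):
        # Runs just before the model is called; state carries the messages.
        print(f"Calling model with {len(state['messages'])} messages")

    def after_model(self, state, runtime):
        # Runs after the model responds.
        print("Model responded:", state["messages"][-1].content[:80])


agent = create_agent(
    model="openai:gpt-4o-mini",
    tools=[],
    middleware=[LoggingMiddleware()],
)
```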
5. Simplified, leaner namespace & migration path
Many formerly top-level modules or helper classes have been removed or relocated to langchain-classic (or similarly stamped “legacy”) to declutter the main API surface. A migration guide is available to help projects transition from v0 to v1. While v1 is now the main line, older v0 is still documented and maintained for compatibility.
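For example, a legacy import might move as shown below, assuming the chain you use (LLMChain here, purely for illustration) was relocated to langchain-classic; the migration guide has the definitive mapping.

```python
# Illustrative import migration for a legacy chain.

# v0 (deprecated location):
# from langchain.chains import LLMChain

# v1 (legacy code moved out of the core namespace):
from langchain_classic.chains import LLMChain  # kept for backward compatibility
```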
What “General Availability” Means (and Doesn’t)
- v1 is production-ready after months of testing during its alpha releases.
- The stable v0 release line remains supported for those unwilling or unable to migrate immediately.
- Breaking changes in public APIs will be accompanied by version bumps (i.e. minor version increments) and deprecation notices.
- The roadmap anticipates minor versions every 2–3 months (with patch releases more frequently).
- Because the field of LLM applications is evolving rapidly, the team expects continued iteration on v1—even in GA—with users encouraged to surface feedback, file issues, and follow the migration path. (This is in line with the philosophy stated in the docs.)
Developer Callouts & Suggested Steps
Here are practical tips to get developers onboard:
- Try the new API now!
The LangChain Azure AI and Azure OpenAI integrations have migrated to LangChain v1 and are ready to test. Try out our getting started sample here, and learn more about using LangChain and Azure AI (see the Azure sketch at the end of this post).
- Join us for a live stream on Wednesday, 22 October 2025
Join Microsoft Developer Advocates Marlene Mhangami and Yohan Lasorsa for a livestream this Wednesday to see live demos and find out more about what JavaScript and Python developers need to know about v1. Register for this event here.
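As promised above, here is a minimal sketch of wiring the v1 agent API to Azure OpenAI; the deployment name, API version, and environment variables are placeholders for your own Azure resources, and the getting started sample linked above is the authoritative reference.

```python
# Minimal sketch: v1 create_agent backed by an Azure OpenAI deployment.
import os

from langchain.agents import create_agent
from langchain_openai import AzureChatOpenAI

model = AzureChatOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # your Azure OpenAI resource
    azure_deployment="gpt-4o-mini",                      # your deployment name
    api_version="2024-10-21",                            # use a version your resource supports
)

agent = create_agent(
    model=model,
    tools=[],
    system_prompt="You are a helpful assistant running on Azure OpenAI.",
)

result = agent.invoke({"messages": [{"role": "user", "content": "Hello!"}]})
print(result["messages"][-1].content)
```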