Use the Azure Postgres connector for LangChain + LangGraph as your secure persistence + retrieval layer for embeddings, chat history, and long-term memory.
AI agents are only as powerful as the data layer behind them. That’s why we’re excited to announce a native LangChain + LangGraph connector for Azure Database for PostgreSQL. With this release, Postgres becomes your single source of truth for AI agents, handling knowledge retrieval, chat history, and long-term memory all in one place.
This new connector is packed with everything you need to build secure, scalable, and enterprise-ready AI agents on Azure without the complexity. With Entra ID authentication, DiskANN acceleration, a native vector store, and a dedicated agent store, you can go from prototype to production on Azure faster than ever.
You can get started with the LangChain + LangGraph connector today:
pip install langchain-azure-postgresql
In this post, we’ll cover:
- How the Azure Postgres connector for LangChain + LangGraph can serve as the single persistence + retrieval layer for an AI agent
- The new first-class connector for LangChain + LangGraph
- A practical example to help you get started
Azure PostgreSQL as the single persistence + retrieval layer for an AI agent
When building AI agents today, developers face a fragmented stack:
- Vector storage and search require a separate library, service, or database.
- Chat history & short-term memory need yet another data source.
- Long-term memory often means bolting on yet another system.
This sprawl leads to complex integrations, higher costs, and weaker security, making it hard to scale AI agents reliably.
The Solution
The new Azure Postgres connector for LangChain + LangGraph turns your Azure Postgres database into the single persistence + retrieval layer for AI agents. Instead of juggling a fragmented stack, developers can now:
- Run embeddings + semantic search with built-in DiskANN acceleration in the same database that powers their application logic.
- Persist chat history and short-term memory, keeping agent conversations grounded through seamless context retrieval from data stored in Postgres.
- Capture, retrieve, and evolve knowledge over time with built-in long-term memory, without bolting on external systems.
All in one database: simplified, secure, and enterprise ready. Postgres becomes the persistence + retrieval data layer for your AI agent.
Built for Enterprise Readiness: LangChain + LangGraph Connector
This release unlocks several new capabilities that make it easy to build robust, production-ready agents:
- Auth with Entra ID: Enterprise-grade identity for securely connecting LangChain + LangGraph workflows to Azure Database for PostgreSQL within a centrally managed, identity-based security perimeter (see the sketch below).
- DiskANN & Extensions: First-class support for fast vector search using pgvector combined with DiskANN indexing, handling high-dimensional vectors with cost-efficient search. Additionally, helper functions ensure your favorite extensions are installed.
- Native Vector Store: Store and query embeddings, enabling semantic search and Retrieval-Augmented Generation (RAG) scenarios.
- Dedicated Agent Store: Persist agent state, memory, and chat history with structured access patterns, perfect for multi-turn conversations and long-term context.
Together, these features give developers a turnkey persistence solution for building reliable AI agents without stitching together multiple storage systems.
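For context on how the Entra ID flow works under the hood: the connector authenticates to Azure Database for PostgreSQL by presenting a short-lived Entra ID access token instead of a stored password. Here is a minimal sketch of that underlying pattern using azure-identity and psycopg; the host, user, and database values are placeholders, and the connector handles this for you automatically.
import psycopg
from azure.identity import DefaultAzureCredential
# Acquire a short-lived Entra ID access token scoped to Azure Database for PostgreSQL.
credential = DefaultAzureCredential()
token = credential.get_token("https://ossrdbms-aad.database.windows.net/.default")
# Use the token as the password; host, user, and dbname below are placeholders.
conn = psycopg.connect(
    host="<your-server>.postgres.database.azure.com",
    user="<your-entra-user>",
    password=token.token,
    dbname="postgres",
    sslmode="require",
)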
Using LangGraph on Azure Database for PostgreSQL
Using LangGraph with Azure Database for PostgreSQL is easy.
- Enable the vector & pg_diskann extensions: Allowlist the vector and pg_diskann extensions in your server configuration, then create them in your database (sketched below).
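If you prefer to create the extensions yourself (the connector's helper functions can also take care of this), here is a minimal sketch with psycopg. It assumes vector and pg_diskann are already allowlisted in the server's azure.extensions parameter and that the usual PGUSER/PGPASSWORD/PGDATABASE variables are set alongside PGHOST.
import os
import psycopg
# Create the extensions once per database; requires that they are allowlisted
# in the server's azure.extensions parameter and that you have sufficient privileges.
with psycopg.connect(host=os.environ["PGHOST"]) as conn:
    with conn.cursor() as cur:
        cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
        cur.execute("CREATE EXTENSION IF NOT EXISTS pg_diskann;")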
- Install the LangChain + LangGraph connector
pip install langchain-azure-postgresql
pip install -qU langchain-openai
pip install -qU azure-identity
- Log in to Azure with your Entra ID
Run az login in the terminal where you will also run the LangGraph code.
az login
- Set up a production-ready vector store for your agent, which takes just a few lines of code:
import os
from langchain_azure_postgresql.common import AzurePGConnectionPool, ConnectionInfo
from langchain_azure_postgresql.langchain import AzurePGVectorStore
from langchain_openai import AzureOpenAIEmbeddings
# 1. Auth: Securely connect to Azure Postgres
connection_pool = AzurePGConnectionPool(azure_conn_info=ConnectionInfo(host=os.environ["PGHOST"]))
# 2. Create embeddings
embeddings = AzureOpenAIEmbeddings(model="text-embedding-3-small")
# 3. Initialize a vector store in Postgres with DiskANN
vector_store = AzurePGVectorStore(connection=connection_pool, embedding=embeddings)
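Once the vector store is initialized, you can load documents and run semantic search. A short sketch, assuming AzurePGVectorStore follows the standard LangChain vector store interface (the sample texts are illustrative):
from langchain_core.documents import Document
# Add a few documents; embeddings are computed and stored in Postgres.
docs = [
    Document(page_content="Cats sleep for roughly two-thirds of the day."),
    Document(page_content="PostgreSQL supports vector search via the pgvector extension."),
]
vector_store.add_documents(docs)
# Semantic search over the stored embeddings.
results = vector_store.similarity_search("How long do cats sleep?", k=2)
for doc in results:
    print(doc.page_content)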
- Use LangGraph to build a sample agent. Here’s a practical example that combines vector search and a checkpointer inside Postgres (the chat model is assumed to be an Azure OpenAI deployment):
# 4. Define the tool for data retrieval.
def get_data_from_vector_store(query: str) -> str:
    """Get data from the vector store."""
    results = vector_store.similarity_search(query)
    return "\n\n".join(doc.page_content for doc in results)
from langchain_openai import AzureChatOpenAI
from langgraph.checkpoint.postgres import PostgresSaver
from langgraph.prebuilt import create_react_agent
# The chat model is assumed to be an Azure OpenAI deployment; adjust the deployment name.
model = AzureChatOpenAI(azure_deployment="gpt-4o")
# 5. Define the agent and the Postgres-backed checkpointer.
with connection_pool.getconn() as conn:
    agent = create_react_agent(
        model=model,
        tools=[get_data_from_vector_store],
        checkpointer=PostgresSaver(conn),
    )
    # 6. Run the agent and print results
    config = {"configurable": {"thread_id": "1", "user_id": "1"}}
    response = agent.invoke(
        {"messages": [{"role": "user", "content": "What does my database say about cats? Make sure you address me with my name"}]},
        config,
    )
    for msg in response["messages"][-2:]:
        msg.pretty_print()
With just a few lines of code, you can:
- Use the vector store backed by Postgres
- Enable DiskANN for semantic search
- Use checkpointers for short-term conversation history (see the follow-up sketch below)
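Because the PostgresSaver checkpointer persists conversation state in Postgres keyed by thread_id, a follow-up call on the same thread picks up the earlier context. A minimal continuation of the example above, run inside the same connection block with the same agent and config:
# Continue the same conversation: the checkpointer reloads prior messages
# for thread_id "1", so the agent remembers the earlier turn.
followup = agent.invoke(
    {"messages": [{"role": "user", "content": "Summarize what you just found in one sentence."}]},
    config,  # same thread_id as the first call
)
followup["messages"][-1].pretty_print()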
Learn More
This is just the beginning. With native LangChain + LangGraph support in Azure PostgreSQL, developers can now rely on a single, secure, high-performance data layer for building the next generation of AI agents.
👉 Ready to start? All the code is available in the Azure Postgres Agents Demo GitHub repository. See how easy it is to bring your AI agent to life on Azure.
👉 Check out the docs for more details on the LangChain + LangGraph connector.