Six agentic integration patterns that enterprises are deploying right now — because the moment Oracle AI Database@Azure and Microsoft AI share a datacenter, every excuse for not having AI evaporates
- Your organization runs Oracle on Azure. You've purchased Microsoft Foundry and OpenAI licenses. Why are your AI projects still stuck in proof-of-concept?
- Your employees spend hours every week entering or reconciling data from documents and copying information into ERP systems. What would it mean if that information could be gathered and processed automatically, in near real time?
- You have contracts, policies, SOPs, and compliance documents buried across Oracle modules that nobody can find fast enough to use. What decisions are being made on guesswork because the right answer was technically there but practically invisible?
- Your data science team built a churn model, an equipment failure predictor, a demand forecast. Those predictions live in a Python notebook. When did your executives last act on one of them before the thing it predicted had already happened?
If these feel familiar, the issue is not your models. It is the distance between your data and your AI.
Here's a scenario that plays out in enterprise after enterprise: the CIO has a mandate to deploy AI. The business runs on Oracle, on-premises or on another hyperscaler — ERP, EBS, financials, operations, the whole estate. Microsoft 365, Microsoft Fabric, and Power BI are the tools of choice for data analytics and AI. Azure OpenAI licenses are purchased. Proofs of concept perform beautifully in demos and collapse in production across cloud boundaries. Copilots return stale data. Agents time out. Dashboards refresh so slowly that executives print them out and trust the paper more than the screen.
The culprit is almost never the AI model. It's the 200–300 milliseconds of latency between your Oracle data and your Microsoft AI platform services. That gap — caused by Oracle sitting on-premises, in OCI, or in a different cloud — kills enterprise AI on Oracle workloads. It makes real-time copilots impractical, autonomous agents commercially unviable, and hourly AI-enriched dashboards technically infeasible. Until that distance is removed, Azure AI in Oracle environments remains a demo, not a production capability.
WHERE DO THESE PATTERNS COME FROM?
The six patterns in this series were not invented for this article. They were discovered — built pattern by pattern, deployment by deployment, by Oracle customers and teams working on real enterprise problems against real Oracle databases. The original implementations ran on Oracle Cloud Infrastructure: Oracle's converged database handling JSON, vectors, graphs, and relational data together; Oracle Integration Cloud orchestrating workflows; Oracle APEX delivering low-code applications; Oracle GenAI Service providing the intelligence layer. Real problems. Real Oracle customers. Real production deployments.
Then something changed. Oracle AI Database@Azure arrived — and with it, a question nobody had been able to answer cleanly before: what happens when you take these proven Oracle patterns and surround them with the full Microsoft AI stack? Not as a replacement. Not as a migration away from Oracle. But as a surrounding activation layer — Microsoft Foundry, Azure OpenAI, Power Platform, Copilot Studio, Power BI, Microsoft Fabric, as well as the Microsoft IQ intelligence layer (Work IQ + Fabric IQ + Foundry IQ) — all now living in the same Azure region, the same datacenter fabric, the same network switch as Oracle AI Database@Azure. Sub-millisecond latency away from your Oracle data. Zero egress between them. The same tenancy, effectively.
Oracle AI Database@Azure collapses the distance between your data and your models to zero. With Oracle Exadata and Microsoft AI platform sharing the same physical datacenter fabric — same building, same top-of-rack switch — the network hop becomes negligible. Latency falls to milliseconds. Cross-cloud egress charges vanish entirely.
The patterns didn't change. The physics did: Microsoft AI now sits next to Oracle data, and that shift moves AI from possibility into practicality. These patterns already worked. What changed is that they now work fast enough, cheaply enough, and simply enough to deploy at scale.
NOTE: Based on the desired degree of customization and deployment flexibility, the six patterns described in this series may be realized using Microsoft 365, Power Platform, Copilot Studio, Microsoft Fabric, and associated services. The scope of this article series is limited to illustrating the Agent‑to‑Oracle AI Database@Azure integration paths through which these patterns are enabled.
Six High‑Impact AI Use Cases for Oracle Estate Modernization on Azure with Microsoft AI
Your Oracle data answers questions in plain English, without touching a line of code or a single schema object in Oracle.
Your application runs on Oracle AI Database@Azure — the data is co-located, the latency is gone, the infrastructure is ready. But your teams are still spending the first hour of every shift logging into Oracle screens, toggling between systems, and manually looking up the numbers they need before they can start working. Relationship managers, care coordinators, and field engineers switch between multiple applications during a single interaction. The data is right there in Oracle. The friction is in how people reach it. This pattern removes that friction entirely. For the sake of illustrating the end-to-end integration flows, we will focus on Foundry, which is used to build custom agents that take requests from users and connect to Oracle.
Pattern 01 of 06: In-Place AI Enablement
THE IN-PLACE AI ENABLEMENT PATTERN: HIGH-LEVEL DESCRIPTION
Your agents connect to Oracle AI Database@Azure, treating your Oracle schema as a live service endpoint. Users ask questions in plain English, and because Microsoft AI and the database share the same datacenter at sub-millisecond latency, responses are real-time — no stale data, no timeouts.
AI doesn’t replace Oracle access. It removes the 40-minute overhead of logging in, navigating screens, and running reports — so teams focus on decisions, not data retrieval.
The question for your architecture team: how do agents reach Oracle AI Database@Azure? There are six distinct paths. All six end at the same place. The path you choose determines who builds the integration and how much control you have.
SIX PATHS AT A GLANCE
| Path | In plain language | Who builds it | Best for | Oracle requirement |
|---|---|---|---|---|
| 1. Oracle Connector | Point-and-click chatbot for Teams | Business analyst | Fastest start | Any Oracle 19c+ |
| 2. ORDS + PL/SQL | Secure web service, your own SQL | DBA + developer | Max control | Any Oracle |
| 3. ORDS + Select AI | NL2SQL inside Oracle via Azure OpenAI | DBA + developer | Production NL2SQL | Autonomous DB only |
| 4. JDBC Functions | Custom code, any language | Developer | Custom logic | Any Oracle |
| 5. Logic Apps | Low-code workflow orchestration | Citizen dev | Multi-system | Any Oracle |
| 6. MCP + AI Agent | Oracle reasons autonomously | DBA + developer | Complex reasoning | Autonomous 26ai |
Path 1 — Oracle Connector (Copilot Studio): Your Oracle data answers questions in plain English, without touching a line of code or a single schema object in Oracle.
Think of this as giving your agent the ability to look things up in your Oracle database — without anyone writing code. Microsoft Copilot Studio is a drag-and-drop tool where a business analyst points a chatbot at specific Oracle tables and says “these are your knowledge sources.” From that moment, any employee in Teams can type a question like “What’s the status of PO 4872?” and the chatbot answers instantly by reading live Oracle data.
HOW IT WORKS
Copilot Studio using the Native Oracle connector. Best separation of concerns.
- User → Teams → Copilot Studio Agent → Oracle: The user asks a plain-English question in Teams. The Copilot agent uses the native Oracle Database connector as a knowledge source to query live Oracle AI Database@Azure tables.
- Oracle → Copilot Agent: Returns live, governed data based on user context and access controls, passing structured results to the Copilot agent for response assembly and formatting.
- Copilot Agent → Teams → User: Delivers the final response back to the user in Teams, a web app, or a chatbot.
STRENGTHS
- Accelerates AI deployment with minimal engineering overhead, enabling rapid rollout of enterprise copilots and agents without custom application development.
- Democratizes agent creation across business units through low‑code conversational interfaces — expanding AI innovation beyond centralized data science teams.
- Enterprise identity governance built‑in via Microsoft Entra ID with MFA, conditional access, and centralized authentication.
- Aligned to Responsible AI and data governance mandates through Microsoft Purview, Entra ID, and RBAC‑based access enforcement.
- Native integration across Microsoft 365 and Power Platform enables AI experiences to be embedded directly into existing workforce workflows.
- Data‑in‑place architecture: Oracle data remains at the source, eliminating duplication risks and enabling real‑time AI grounded in operational truth.
- Flexible agent development paths: Both Copilot Studio (low‑code) and Microsoft Foundry Agent Builders (pro‑code) can be leveraged to build enterprise agents using standardized Oracle connectors/tools.
CONSIDERATIONS
- NL2SQL generation is mediated by Copilot Studio’s AI engine rather than Oracle Select AI — reducing direct control over SQL generation logic.
- Knowledge grounding is limited to table‑level data sources, restricting reuse of custom PL/SQL‑based business logic within agent interactions.
Key Takeaway — Oracle AI Database@Azure Deployment Compatibility
- Base DB Service: Fully supported.
- Exadata DB Service: Fully supported. Same direct table access as Base DB.
- Exascale DB Service: Fully supported.
- ADB Serverless: Fully supported. Note that NL2SQL still comes from Copilot Studio, not Oracle Select AI.
- ADB Dedicated: Fully supported.
- This is the most universally compatible path. It works on every Oracle Database@Azure deployment type because the Copilot Studio connector reads tables directly — no dependency on Select AI or any Autonomous-only feature. On-premises Oracle (11g+) is also reachable via the Power Platform on-premises data gateway.
Path 2 — ORDS + PL/SQL: You define exactly what goes back to the user — every SQL statement, every business rule, every byte of data is governed by your PL/SQL logic.
This is the path for teams that need deterministic, predictable responses from Oracle. Your Oracle DBA writes the exact stored procedures that retrieve and transform data. These procedures are exposed as secure REST endpoints via ORDS. Foundry calls them like any other API. You control the SQL, the business logic, the validation, and the shape of every response.
HOW IT WORKS
Microsoft Foundry Agents invoke ORDS REST endpoints that execute custom PL/SQL — no AI-generated SQL involved.
- User → Teams → Foundry: The user asks a plain-English question in Teams, which is submitted as a structured intent request to a Foundry agent.
- Foundry → ORDS REST Endpoint: Foundry is the authoritative control plane: it validates the intent and routes the call to a pre-defined ORDS endpoint, using OAuth 2.0 or Entra ID managed identity authentication with structured parameters. It accesses ORDS the same way it accesses any external API: through a tool connection. Common patterns are registering the ORDS OpenAPI spec directly as a tool, or wrapping ORDS behind Azure Functions/APIM.
- ORDS → PL/SQL Stored Procedure: ORDS routes the call to a PL/SQL stored procedure. The procedure executes your hand-crafted SQL with full RLS/VPD enforcement.
- Oracle → ORDS: Oracle executes the query, applies Row-Level Security, runs business validation, returns structured results.
- ORDS → Foundry: ORDS formats the result as JSON. Rate limiting and audit logging applied automatically.
- Foundry → Teams → User: Foundry assembles the response, optionally enriches with Azure OpenAI narration, delivers the answer back to user.
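From Foundry's perspective, the ORDS hop above is an ordinary authenticated REST request. The sketch below shows how such a request could be assembled in Python; the host, the `po_status` endpoint, and the `po_number` parameter are hypothetical stand-ins for whatever your DBA exposes through ORDS.

```python
import json
from urllib.request import Request

# Hypothetical ORDS base URL; substitute the one ORDS generates for your schema.
ORDS_BASE = "https://adb.example.oraclecloudapps.com/ords/hr/api/v1"

def build_ords_request(endpoint: str, params: dict, token: str) -> Request:
    """Assemble the HTTPS request a Foundry tool would send to an ORDS endpoint."""
    return Request(
        url=f"{ORDS_BASE}/{endpoint}",
        data=json.dumps(params).encode("utf-8"),
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",  # OAuth 2.0 / Entra ID access token
            "Content-Type": "application/json",
        },
    )

# Example: the "What's the status of PO 4872?" intent, already validated by Foundry.
req = build_ords_request("po_status", {"po_number": 4872}, token="<entra-id-token>")
# urllib.request.urlopen(req) would execute the call; omitted in this sketch.
```

Wrapping this behind Azure Functions or APIM, as noted in the flow, keeps token handling and rate limiting out of the agent definition itself.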
STRENGTHS
- Deterministic AI‑to‑data interaction layer: Every SQL statement is explicitly defined — eliminating autonomous query generation and enabling policy‑driven data access.
- Enterprise‑grade identity and access controls through OAuth 2.0, Microsoft Entra ID managed identities.
- Embedded data‑level governance: Enforces Row‑Level Security (RLS) and Virtual Private Database (VPD) policies on every agent‑initiated query.
- Consistent deployment across database estates: ORDS is natively integrated with Autonomous Database and deployable on Base Database and Exadata.
- Agent‑ready API abstraction: ORDS endpoints can be registered as OpenAPI tools/specs within Foundry Agent Service or Copilot Studio for controlled agent access.
- Enterprise AI control plane integration: Azure API Management (APIM) and Azure Functions introduce centralized authentication, policy enforcement, and abstraction of Oracle‑specific logic.
- Improved AI observability and governance posture across agent‑mediated database interactions.
CONSIDERATIONS
- Requires PL/SQL development expertise to build and maintain stored procedure wrappers for agent access.
- ORDS endpoints must be explicitly defined and lifecycle‑managed, including versioning.
- New data use cases typically require endpoint or stored procedure updates, increasing operational overhead.
- Does not leverage Oracle Select AI, limiting ad hoc natural language query flexibility at the database layer.
Key Takeaway — Oracle AI Database@Azure Deployment Compatibility
- Base DB Service: Fully supported. ORDS is installable. Full PL/SQL control with no Autonomous features required.
- Exadata DB Service: Fully supported. ORDS installable. Ideal for EBS/ERP workloads with custom stored procedures.
- Exascale DB Service: Fully supported.
- ADB Serverless: Fully supported. ORDS is built-in with zero setup — the best developer experience.
- ADB Dedicated: Fully supported. ORDS built-in.
- This is the most universally deployable path for controlled data access. It works on every Oracle Database@Azure service including Base DB (19c). No Autonomous-only features required. If your organization demands complete control over SQL generation and data exposure, this is the production-recommended starting point.
Path 3 — ORDS + Select AI: Users ask plain English questions — Oracle's own AI engine translates them into SQL, executes them, and returns governed answers, all within the database boundary.
Your Oracle DBA creates a secure REST endpoint that accepts a plain-English question and returns an answer. Inside Oracle, Select AI reads your table structures, asks Azure OpenAI to generate the right SQL, runs it, and returns the result — all within the database boundary. Your data never leaves Oracle.
HOW IT WORKS
Microsoft Foundry Agents invoke ORDS REST endpoints that call Oracle’s Select AI engine (DBMS_CLOUD_AI).
- User → Teams → Foundry: The user asks a plain-English question in Teams, which is submitted as a structured intent request to a Foundry agent.
- Foundry → ORDS REST Endpoint: Foundry is the authoritative control plane: it validates the intent and routes the call to a pre-defined ORDS endpoint using OAuth 2.0 / Entra ID. Payload: {"question": "Avg salary by dept?", "action": "narrate"}.
- ORDS → DBMS_CLOUD_AI.GENERATE: PL/SQL wrapper invokes Select AI. Reads schema metadata, sends augmented prompt to Azure OpenAI (sub-ms latency).
- Azure OpenAI → Select AI: Returns generated SQL. Select AI validates, executes inside Oracle with RLS/VPD. Data never leaves Oracle.
- Oracle → Foundry: Select AI narrates the result. ORDS formats as JSON.
- Foundry → Teams → User: Delivers the natural language answer with a full audit trail.
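A thin validation layer in front of the wrapper keeps agents on the curated query surface. This sketch builds the JSON body Foundry posts to the (hypothetical) ORDS endpoint and fails closed on any action outside the Select AI modes this pattern uses; the payload shape is an assumption, not an Oracle default.

```python
import json

# Select AI interaction modes used in this pattern (DBMS_CLOUD_AI.GENERATE actions).
ALLOWED_ACTIONS = {"showsql", "runsql", "narrate", "chat", "summarize"}

def make_select_ai_payload(question: str, action: str = "narrate") -> str:
    """Build the JSON body posted to the ORDS wrapper around Select AI."""
    if action not in ALLOWED_ACTIONS:
        # Fail closed: never forward an unrecognized action to the database.
        raise ValueError(f"unsupported Select AI action: {action}")
    return json.dumps({"question": question, "action": action})

payload = make_select_ai_payload("Avg salary by dept?")
```

The PL/SQL wrapper on the Oracle side would read these two fields and pass them to DBMS_CLOUD_AI.GENERATE against a curated AI profile.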
STRENGTHS
- In‑database natural language query capability through Select AI (NL→SQL engine in 26ai) enables governed, conversational access to enterprise data for ad hoc analysis and BI‑style exploration.
- Agent interaction occurs via curated query intelligence endpoints, ensuring controlled invocation from any agent orchestration layers like Microsoft Foundry or Power Platform.
- Governed data exposure model: Curated object_list definitions prevent unintended access to sensitive datasets.
- Enterprise‑grade identity and access controls via OAuth 2.0, Microsoft Entra ID, and ORDS rate limiting.
- NL2SQL execution remains fully contained within the Oracle database boundary, with Row‑Level Security (RLS) and Virtual Private Database (VPD) policies enforced at runtime.
- Native ORDS integration within Autonomous Database eliminates the need for separate deployment layers.
- Multi‑modal Select AI interaction support including showsql, runsql, narrate, chat, and summarize for agent‑mediated reasoning workflows.
CONSIDERATIONS
- Requires PL/SQL expertise to implement stored procedure wrappers for agent invocation.
- Schema changes necessitate Select AI profile updates to maintain alignment with governed query surfaces.
- Select AI is limited to Autonomous Database deployments, and is not available on Base Database or Exadata Database Service environments.
Key Takeaway — Oracle AI Database@Azure Deployment Compatibility
- Base DB Service: Not supported for Select AI. DBMS_CLOUD_AI is not available on Base DB. Use Path 2 (ORDS + PL/SQL) instead, or deploy a sidecar ADB instance to unlock NL2SQL capabilities.
- Exadata DB Service: Not supported for Select AI. Same limitation as Base DB. Deploy a sidecar ADB Serverless that reaches Exadata data via Database Links.
- Exascale DB Service: Not supported for Select AI. Same sidecar approach applies.
- ADB Serverless: Fully supported. Full Select AI, ORDS built-in, Azure OpenAI integration. This is the production-recommended deployment for Path 3.
- ADB Dedicated: Fully supported. Full Select AI and ORDS built-in.
- ORDS itself works on every deployment type — you can always expose PL/SQL as REST endpoints. But Select AI (the NL2SQL engine) is exclusively Autonomous Database. For Base DB, Exadata, or Exascale, the sidecar pattern (see Sidecar section below) unlocks Select AI without modifying your production systems.
Path 4 — JDBC Functions: Your developers write the integration code — any language, any framework, any logic — connecting directly to Oracle using traditional database drivers.
Your engineering team writes a small Azure Function that connects directly to Oracle using JDBC or Python. Maximum flexibility: custom logic, combining Oracle results with other data sources, retry policies, caching, and batching.
HOW IT WORKS
Microsoft Foundry Agents invoke Azure Functions that connect to Oracle via JDBC/python-oracledb.
- User → Teams → Foundry: The user asks a plain-English question in Teams, which is submitted to a Foundry agent.
- Foundry → Azure Function: Calls the Azure Function’s HTTP endpoint as a custom tool.
- Azure Function → Oracle (JDBC): Opens JDBC/python-oracledb connection using wallet or token auth. Calls DBMS_CLOUD_AI.GENERATE, custom PL/SQL, or direct SQL.
- Oracle → Azure Function: Executes with RLS/VPD. Function can then enrich, transform, cache, or combine with other sources.
- Azure Function → Foundry: Formats and returns response. Your code controls error handling and retries.
- Foundry → Teams → User: Delivers the final answer.
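A minimal sketch of what the Azure Function body might look like with python-oracledb. The DSN, credentials, and response shape are placeholders: `fetch_rows` needs a reachable database and the `oracledb` driver installed, while `to_agent_response` is plain formatting logic.

```python
import json

try:
    import oracledb  # python-oracledb driver; optional for this sketch
except ImportError:
    oracledb = None

DSN = "mydb_high"  # hypothetical TNS alias from the wallet; token auth also works

def fetch_rows(sql: str, binds: dict) -> list:
    """Run a parameterized query; RLS/VPD policies apply inside Oracle."""
    with oracledb.connect(user="app_user", password="***", dsn=DSN) as conn:
        with conn.cursor() as cur:
            cur.execute(sql, binds)
            cols = [d[0].lower() for d in cur.description]
            return [dict(zip(cols, row)) for row in cur.fetchall()]

def to_agent_response(rows: list) -> str:
    """Shape rows into the JSON string the Foundry tool expects back."""
    return json.dumps({"row_count": len(rows), "rows": rows})

# The HTTP-triggered Function would return something like:
# to_agent_response(fetch_rows("SELECT ... WHERE po = :po", {"po": 4872}))
```

This is also where the retry, caching, and multi-source enrichment logic this path is chosen for would live, entirely under your team's control.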
STRENGTHS
- Maximum engineering flexibility: Full developer control across any language and framework, enabling teams to standardize on existing enterprise development patterns.
- Leverages existing Azure app infrastructure: Runs cleanly on established Azure Functions and Azure App Service footprints.
- Operational resilience patterns under your control: Supports batching, retries, caching, and multi‑source aggregation to optimize performance and reliability.
- Broad driver ecosystem: Works with standard Oracle drivers including python-oracledb, JDBC (Java), and ODP.NET (.NET).
CONSIDERATIONS
- Higher build-and-run burden: More custom code to develop, test, secure, maintain, and operate compared to an ORDS-based API approach.
Key Takeaway — Oracle AI Database@Azure Deployment Compatibility
- Base DB Service: Supported for custom PL/SQL via JDBC. Not supported for Select AI — your Function would call Azure OpenAI directly for NL2SQL, then execute the generated SQL against Oracle.
- Exadata DB Service: Supported for custom PL/SQL via JDBC. Same external NL2SQL approach for Select AI.
- Exascale DB Service: Supported for custom PL/SQL. Same limitation for Select AI.
- ADB Serverless: Fully supported.
- ADB Dedicated: Fully supported.
- JDBC connectivity works with every Oracle Database@Azure service. You can always call custom PL/SQL from an Azure Function. However, calling DBMS_CLOUD_AI.GENERATE (Select AI) is Autonomous only.
Path 5 — Logic Apps: Low-code workflow orchestration that bridges Oracle into the broader Azure ecosystem — CRUD operations, stored procedures, conditional logic, and multi-system integration.
Azure Logic Apps or Power Automate provides a visual workflow designer with a pre-built Oracle connector. Ideal when Oracle is one component of a larger multi-system workflow — invoice processing, approval chains, scheduled synchronization, cross-system orchestration.
HOW IT WORKS
Microsoft Foundry Agents invoke Azure Logic Apps workflows that orchestrate Oracle alongside other enterprise systems.
- User → Teams → Foundry: The user asks a plain-English question in Microsoft 365 Copilot or Teams. Copilot interprets user intent and available Microsoft 365 context, then submits a structured intent request to a Foundry agent.
- Foundry → Logic Apps: Invokes a Logic Apps workflow as a tool — visual, low-code canvas.
- Logic Apps → Oracle (Connector): Uses pre-built Oracle connector for CRUD and stored procedures. Supports on-prem via data gateway.
- Oracle → Logic Apps: Returns results. Workflow applies conditional logic, routes to approvals, calls additional systems.
- Logic Apps → Foundry: Returns orchestrated result — may combine Oracle + other system data.
- Foundry → User: Delivers the final response or workflow status.
STRENGTHS
- Low‑code orchestration model: Enables business teams to design workflows using a visual interface without requiring PL/SQL development.
- Pre‑built Oracle connectivity: Native connector manages authentication, connection handling, and CRUD operations out of the box.
- Supports invocation of stored procedures for integration with existing business logic and transactional workflows.
- Enterprise integration layer: Integrates Oracle into cross‑platform workflows spanning SAP, Salesforce, ServiceNow, and other enterprise systems via the same connector framework.
- Hybrid deployment support: Enables access to on‑premises Oracle environments (11g+) through the on‑premises data gateway without modifying the source database.
CONSIDERATIONS
- Connector does not natively invoke Select AI or DBMS_CLOUD_AI, limiting direct access to in‑database NL2SQL capabilities.
- The on‑premises data gateway introduces an additional runtime component requiring installation, security assessment, and lifecycle management.
Key Takeaway — Oracle Database@Azure Deployment Compatibility
- Base DB Service: Fully supported. Full CRUD + stored procedure invocation via the Oracle connector.
- Exadata DB Service: Fully supported. Ideal for EBS workflow automation.
- Exascale DB Service: Fully supported.
- ADB Serverless: Fully supported. Can invoke Select AI indirectly by calling an ORDS endpoint as one workflow step.
- ADB Dedicated: Fully supported.
- Logic Apps works with every Oracle deployment type, including on-premises Oracle (11g+) via the data gateway. For NL2SQL, combine Logic Apps with an ORDS endpoint (Path 2 or 3) as one step in the workflow. ORDS gives you control; Logic Apps gives you speed and multi-system orchestration.
Path 6 — MCP + AI Agent: The newest, most advanced path — the Oracle database itself becomes an AI agent that reasons, calls tools, reflects, and chains multi-step business logic autonomously.
While Select AI is a stateless NL→SQL capability that handles a single query and returns a single result, Select AI Agent provides context-aware, multi-step agentic reasoning to support workflows — effectively acting as an agent wrapper that uses Select AI (or a data agent endpoint) as one of its tools. Oracle Private Agent Factory ("PAF") may be used to build agents that run inside the Oracle database and can query data (SQL, vector), analyze, reason, research, and execute workflows. These agents can be exposed through the Oracle MCP* Server to MCP clients, or reached by a Foundry agent (via an APIM/Functions tool wrapper) through ORDS endpoints that invoke the PAF agents.
*MCP (Model Context Protocol) is an open standard — a “USB-C port” for AI agents. Oracle now offers a built-in MCP server on Autonomous Database (GA March 2026).
HOW IT WORKS
Microsoft Foundry Agents invoke Oracle’s Select AI Agent via ORDS or the built-in ADB MCP Server.
- User → Teams → Foundry: The user asks a plain-English question in Teams, which is submitted to a Foundry agent.
- Foundry → Oracle (ORDS/MCP): Calls Oracle’s agent endpoint — a single tool call from Foundry’s perspective.
- Oracle → ReAct Loop: Select AI Agent reasons about data needs, calls NL2SQL, Vector Search, PL/SQL business rules, external APIs.
- Agent → Tools (Iterative): Calls tools, evaluates results, iterates 3–10 times. All within Oracle’s security boundary.
- Oracle → Foundry: Formulates comprehensive narrated answer with citations.
- Foundry → User: Delivers the answer with a full audit trail of every reasoning step.
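To make the ReAct loop concrete, here is a toy Python version of the reason/act/observe cycle. Both tools are hard-coded stand-ins, not Oracle APIs; in the real pattern this loop runs inside the Select AI Agent runtime, within the database boundary.

```python
def nl2sql_tool(question: str) -> str:
    """Stand-in for the NL2SQL tool; in the pattern, Select AI generates this SQL."""
    return "SELECT COUNT(*) AS open_count FROM orders WHERE status = 'OPEN'"

def run_sql_tool(sql: str) -> list:
    """Stand-in for governed SQL execution inside Oracle (RLS/VPD enforced)."""
    return [{"open_count": 42}]

def react_agent(question: str, max_steps: int = 10) -> dict:
    """Reason about what data is needed, act by calling a tool, observe, repeat."""
    trace, observation = [], None
    for _ in range(max_steps):
        if observation is None:          # no data yet: generate and run SQL
            sql = nl2sql_tool(question)
            observation = run_sql_tool(sql)
            trace.append(("nl2sql+run", sql))
        else:                            # observation is sufficient: answer
            answer = f"{observation[0]['open_count']} open orders"
            trace.append(("answer", answer))
            return {"answer": answer, "steps": trace}
    return {"answer": "no answer within step budget", "steps": trace}

result = react_agent("How many open orders do we have?")
```

A production agent adds reflection between steps, may iterate many more times, and surfaces the trace as the audit trail described in the flow above.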
STRENGTHS
- Advanced agentic execution capability: Supports multi‑step reasoning workflows where agents can invoke tools, maintain conversational context, and orchestrate actions across turns.
- Business logic executes natively within Oracle at sub‑millisecond latency — minimizing external orchestration overhead.
- Data remains within the database boundary, supporting structural compliance requirements for regulated industries.
- Select AI Agent execution path: ORDS invokes the Oracle in‑database agent runtime, enabling multi‑step reasoning workflows prior to returning results.
- Oracle Private Agent Factory agents may be exposed through ORDS endpoints or optionally surfaced via an Oracle MCP Server for agent‑based tool invocation.
CONSIDERATIONS
- Requires Oracle 26ai Autonomous Database deployments to support in‑database agent runtime capabilities.
- Configuration complexity may be higher relative to lower‑capability integration patterns.
- Agent‑based reasoning may increase token consumption, resulting in higher per‑query costs compared to direct Select AI NL2SQL execution.
- Model Context Protocol (MCP) support remains in preview, and enterprise orchestration platform integrations continue to mature.
Key Takeaway — Oracle Database@Azure Deployment Compatibility
- Base DB Service: Not supported.
- Exadata DB Service: Not supported.
- Exascale DB Service: Not supported.
- ADB Serverless: Fully supported. Built-in MCP server (GA March 2026), ReAct reasoning, Private Agent Factory, multi-turn memory. Requires 19.29+ or 26ai.
- ADB Dedicated: Fully supported. Full agent capabilities. Requires 26ai.
- Path 6 is exclusively Autonomous Database. The agentic runtime is deeply integrated into the ADB kernel. For non-Autonomous deployments, deploy a sidecar ADB instance to unlock agent capabilities.
| If your priority is… | Start with… | Oracle requirement |
|---|---|---|
| Fastest time-to-value, M365 | Path 1 — Copilot Studio | Any Oracle 19c+ |
| Complete SQL control, regulated | Path 2 — ORDS + PL/SQL | Any Oracle |
| Natural language with Oracle NL2SQL | Path 3 — ORDS + Select AI | Autonomous DB |
| Custom integration, multi-source | Path 4 — JDBC Functions | Any Oracle |
| Multi-system orchestration | Path 5 — Logic Apps | Any Oracle |
| Complex reasoning, agentic AI | Path 6 — ORDS/MCP + AI Agent | Autonomous 26ai |
Key principle
All six paths can coexist. Most enterprises will run Path 1 (Copilot Studio) for rapid Teams access, Path 2 or 3 (ORDS) for production, and Path 6 (Select AI Agent) for complex reasoning — all against the same Oracle database, same security model, same audit trail.
Now imagine your company has years of critical data sitting in Oracle databases. You want AI to answer business questions — "What were our top 10 customers last quarter?" or "Why did revenue dip in March?" — but you can't afford the risk, cost, or time of migrating your data to an Oracle Database@Azure system.
The sidecar solves this. It's a small, lightweight AI database you park next to your existing Oracle environment. Your data never moves. Your production systems are never touched. You simply get AI-powered answers on top of what you already own — in minutes, not months.
The sidecar can be an Autonomous Database instance (Serverless or Dedicated) running on Oracle Database@Azure. It does not store your production data — it connects to your existing databases (Exadata, Base DB, on-premises Oracle).
What it IS: a smart AI assistant sitting beside your data, reading and answering questions on demand. Additionally, enterprises can adopt a Fabric-centric sidecar architecture that integrates Microsoft Fabric OneLake and Oracle Database@Azure.
What it IS NOT: a replacement for your existing databases, a data warehouse, or a migration project. It does not own your data — it borrows a view of it.
Everything in this article comes down to one physical reality: when Oracle AI Database@Azure and Microsoft AI share the same datacenter, the economics of enterprise AI don't just improve — they flip entirely.
Latency disappears. Egress disappears. Complexity disappears.
What we discussed today, Pattern #01 of #06, "In-Place AI Enablement", was never the hard part. The distance was.
This only works because the AI and Oracle data now sit within milliseconds of each other. Previously, this interaction would time out or return stale results.
For years, the gap between your Oracle data and your AI ambitions wasn't a technology problem — it was a physics and geography problem. That gap is now closed. Permanently.
So now the only question left is this: "Do you want to be early — or do you want to be catching up?"
Because Frontier organizations are not experimenting. They are building the next version of how their business runs.
If you’re already running Oracle AI Database@Azure, you’re closer than you think. The only thing left is deciding where to start.
If this resonates, you already know where to start.
Pattern 1 of 6 is live, and Oracle AI Database@Azure is available across 40+ regions.
The remaining five patterns drop over the next posts in this blog series. Which one are you most curious about? We would love to know where you are headed. Drop it in the comments; we read every comment.
🔜 Next up — Pattern #02 of #06: Smart Agentic Workflow Automation and Document Intelligence.
READING REFERENCES
- Build Copilots with Copilot Studio + Oracle Database@Azure (Microsoft)
- Add Oracle as a knowledge source in Copilot Studio (Microsoft Learn)
- Use Autonomous AI Database as an AI Proxy for Select AI (Oracle)
- Use AI Proxy Database for Select AI NL2SQL
- Oracle REST Data Services documentation hub (Oracle)
- Connect Logic Apps to Azure OpenAI and AI Search (Microsoft Learn)
- ORDS POST/PUT handlers for write-back to Oracle (Oracle community)
- Oracle AI Database 26ai announcement
- Microsoft Foundry Agent Service overview (Microsoft Learn)
- ORDS download page listing Oracle MCP Servers (Oracle)
- Extend agents with REST APIs / OpenAPI specs from ORDS (Microsoft Learn)
- Copilot Studio integration strategies including MCP and connectors (Microsoft Learn)
IMPORTANT DISCLAIMER:
Before making architectural decisions or production commitments based on any pattern or paths described here, we strongly recommend validating the latest capabilities, version compatibility, regional availability, licensing, and support status directly with Oracle and Microsoft. This includes confirming Select AI feature availability, MCP server GA status, Copilot Studio connector regional rollout, Logic Apps connector capabilities, and sidecar Database compatibility for your specific Oracle version and deployment type.
The assumptions, figures, and concepts presented in this blog are directional; each requires validation through a proof-of-concept or pilot before production commitment. The intent is not to prescribe a final architecture, but to map the art of the possible: identifying feasible, high-value AI use cases worth carrying forward into deeper design and execution. This blog is for educational purposes only. It does not constitute a product commitment, service guarantee, or contractual obligation from Oracle, Microsoft, or the authors.
ACKNOWLEDGEMENTS
This work reflects a collaborative "Art of Possible" effort between Microsoft Global Black Belt, Oracle Tiger Team and Cloud Engineering teams. Special thanks to Chip Baber, Dennis Var, John Andrew Prabaharan, Don Kidwell, Heema Satapathy and Johnnie Konstantas from Oracle Corporation for their valuable input and contributions to validating the real-world implementation scenarios, sidecar AI database activation patterns, Select AI Engine, Private Agent Factory with in-database agent orchestration, and hybrid multi-agent integration approaches that helped shape several of the multi-cloud architecture patterns discussed in this blog series.