Introduction
Today, we are announcing two new built-in connectors: Azure OpenAI and Azure AI Search. As more users embrace emerging AI technologies, there is a growing need to leverage these technologies inside Logic Apps workflows. To that end, we are announcing the Public Preview of new built-in connectors that bridge the Logic Apps + AI gap. With the Azure OpenAI and Azure AI Search connectors, you can now build powerful automations inside Logic Apps by integrating your enterprise data and services with the generative AI capabilities offered by the Azure OpenAI and Azure AI Search services.
These new connectors support multiple authentication types, such as connection keys, Azure Active Directory (AAD), and managed identity, and they support connecting to Azure OpenAI and Azure AI Search endpoints behind your firewalls, enabling your workflows to securely connect to your AI resources in Azure.
Scenarios
The goal of these built-in connectors is to enable you to leverage your new or existing AI services inside your Logic Apps workflows. Your workflow can now serve as the orchestration engine for pulling data into and out of your AI services. Outlined below are just some of the scenarios these new built-in connectors enable.
Building a knowledge base from your enterprise data
With Logic Apps' ability to connect securely to almost any data source through its host of connectors (e.g., SharePoint, OneDrive, Dropbox, Salesforce, SAP, etc.), you can easily create document ingestion pipelines that build knowledge bases in Azure AI Search containing vector embeddings for those documents.
With many types of triggers available, you can make these automations run on a schedule or in response to events, such as the arrival of new documents in a SharePoint site.
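To make the scenario concrete, here is a minimal Python sketch of what such an ingestion pipeline does under the hood, calling the Azure OpenAI and Azure AI Search services directly through their SDKs (the connectors perform the equivalent steps for you inside the workflow). The endpoint, key, deployment, index, and field names are placeholders you would replace with your own.

```python
# Sketch of a document-ingestion pipeline: embed a chunk of document text with
# Azure OpenAI, then index the chunk and its vector in Azure AI Search.
# All endpoints, keys, deployment names, and field names are placeholders.
from openai import AzureOpenAI
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

openai_client = AzureOpenAI(
    azure_endpoint="https://<your-openai-resource>.openai.azure.com",
    api_key="<openai-key>",
    api_version="2024-02-01",
)
search_client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",
    index_name="enterprise-docs",               # assumes this index already exists
    credential=AzureKeyCredential("<search-admin-key>"),
)

def ingest(doc_id: str, text: str) -> None:
    # 1. Get a vector embedding for the document text.
    embedding = openai_client.embeddings.create(
        model="<embedding-deployment>", input=text
    ).data[0].embedding
    # 2. Upload the text plus its vector to the search index.
    search_client.upload_documents(documents=[
        {"id": doc_id, "content": text, "contentVector": embedding}
    ])

ingest("sharepoint-report-001", "Quarterly results for the Contoso sales team ...")
```

In a workflow, these two steps correspond to the Get single embedding and Index a document actions described later in this post.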
Generating completions
With Azure OpenAI’s completion operation, you can generate responses to questions about your data. For example, you can answer real-time questions or automate responses to emails. Because Logic Apps can accept input, your user's question can be ingested into your workflow and an answer generated with Azure OpenAI’s completion operation. You can immediately send those responses back to the client or route them to an approval workflow for verification.
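As a rough illustration of this scenario, the sketch below shows the equivalent chat-completion call in Python with the openai SDK; in a workflow, the question would arrive through a trigger (for example, an HTTP request or a new email) and the chat completions action would produce the answer. The resource name, key, deployment name, and prompt are placeholders.

```python
# Sketch: generate an answer to an incoming question with Azure OpenAI chat
# completions. Endpoint, key, and deployment names are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-openai-resource>.openai.azure.com",
    api_key="<openai-key>",
    api_version="2024-02-01",
)

def answer(question: str, context: str) -> str:
    response = client.chat.completions.create(
        model="<chat-deployment>",
        messages=[
            {"role": "system",
             "content": "Answer using only the provided enterprise data:\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

# The workflow could return this answer to the caller or route it to an approval step.
print(answer("When is the next maintenance window?",
             "Maintenance runs every Sunday at 02:00 UTC."))
```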
Design your AI Workflow
OpenAI Provider
Azure OpenAI Service provides access to OpenAI's powerful language models, including the GPT-4, GPT-4 Turbo with Vision, and GPT-3.5-Turbo model series, as well as the Embeddings model series. For more information, see the Azure OpenAI documentation. The Azure OpenAI provider enables you to connect your workflow to your Azure OpenAI service. Use this provider to get OpenAI embeddings for your data or to generate chat completions.
Get single embedding action
Get chat completions action
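For readers curious about what these two actions roughly map to behind the scenes, here is a hedged Python sketch of the underlying Azure OpenAI REST calls, shown with the requests library (the connector constructs and authenticates these requests for you, and the exact requests it issues may differ). The resource name, deployment names, key, and API version are placeholders.

```python
# Sketch of the two Azure OpenAI operations surfaced by the provider, expressed
# as direct REST calls. All values below are placeholders.
import requests

ENDPOINT = "https://<your-openai-resource>.openai.azure.com"
HEADERS = {"api-key": "<openai-key>", "Content-Type": "application/json"}
API_VERSION = "2024-02-01"

# "Get single embedding": embed one piece of text with an embeddings deployment.
embedding = requests.post(
    f"{ENDPOINT}/openai/deployments/<embedding-deployment>/embeddings",
    params={"api-version": API_VERSION},
    headers=HEADERS,
    json={"input": "Text to embed"},
).json()["data"][0]["embedding"]

# "Get chat completions": send a list of messages to a chat deployment.
completion = requests.post(
    f"{ENDPOINT}/openai/deployments/<chat-deployment>/chat/completions",
    params={"api-version": API_VERSION},
    headers=HEADERS,
    json={"messages": [{"role": "user", "content": "Summarize our returns policy."}]},
).json()["choices"][0]["message"]["content"]
```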
AI Search Provider
Azure AI Search is an AI-powered information retrieval platform that helps developers build rich search experiences and generative AI apps combining large language models with enterprise data. The Azure AI Search provider enables you to connect your workflows to your Azure AI Search service so that you can index documents and perform vector searches on your data.
Index a document
Vector search
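As a rough sketch of the same two operations outside the designer, the following Python snippet uses the azure-search-documents SDK (version 11.4 or later). The endpoint, key, index name, and vector field name are placeholders, and the index is assumed to already define a vector field.

```python
# Sketch of the two AI Search operations surfaced by the provider: indexing a
# document and running a vector search. All values below are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery

client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",
    index_name="enterprise-docs",
    credential=AzureKeyCredential("<search-admin-key>"),
)

# "Index a document": upload a document together with its embedding vector.
client.upload_documents(documents=[{
    "id": "doc-001",
    "content": "Contoso travel policy ...",
    "contentVector": [0.01, -0.02, 0.03],    # embedding produced by Azure OpenAI
}])

# "Vector search": return the documents whose vectors are closest to a query vector.
results = client.search(
    search_text=None,
    vector_queries=[VectorizedQuery(
        vector=[0.01, -0.02, 0.03],          # embedding of the user's question
        k_nearest_neighbors=3,
        fields="contentVector",
    )],
)
for doc in results:
    print(doc["id"], doc["content"])
```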
Authentication
Each connector supports multiple ways of authenticating to your AI service endpoint. With every authentication option, you will need to provide the service’s endpoint. Beyond that, you have three ways to authenticate and connect:
- Authentication Key or Administration Key: Provide the key generated by the AI service for key-based authentication.
- Azure Active Directory: Using Active Directory parameters such as tenant, client identifier, and password, you can authenticate to the connector as an Active Directory user.
- Managed Identity: Once a managed identity is created for your AI service, that identity can then be used to authenticate to the connector.
Additionally, you can connect to OpenAI and AI Search services that sit behind virtual networks (VNets). Together, these options provide a robust set of authentication mechanisms that should suit the vast majority of users’ needs.
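The sketch below illustrates the same three options in Python when constructing service clients manually; the connector collects equivalent inputs in the designer. All tenant, client, key, and endpoint values are placeholders.

```python
# Sketch of the three authentication options: key-based, Azure Active Directory,
# and managed identity. All identifiers and endpoints are placeholders.
from openai import AzureOpenAI
from azure.identity import (ClientSecretCredential, DefaultAzureCredential,
                            get_bearer_token_provider)
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

SCOPE = "https://cognitiveservices.azure.com/.default"

# 1. Authentication/administration key: pass the key generated by the service.
keyed_client = AzureOpenAI(azure_endpoint="https://<resource>.openai.azure.com",
                           api_key="<key>", api_version="2024-02-01")

# 2. Azure Active Directory: authenticate with tenant, client identifier, and secret.
aad_credential = ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>")
aad_client = AzureOpenAI(
    azure_endpoint="https://<resource>.openai.azure.com",
    azure_ad_token_provider=get_bearer_token_provider(aad_credential, SCOPE),
    api_version="2024-02-01",
)

# 3. Managed identity: DefaultAzureCredential picks up the app's identity when
#    running in Azure (for example, a Logic Apps Standard app's managed identity).
search_client = SearchClient(endpoint="https://<service>.search.windows.net",
                             index_name="enterprise-docs",
                             credential=DefaultAzureCredential())
```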
GitHub resources
If you want to learn how to ingest document data into your Azure AI Search service and chat with your data using Logic Apps, we have a GitHub project that can help you get started. The project contains the following components:
- A Logic App workflow that uses the OpenAI connector to get embeddings from an OpenAI model and uses the AI Search connector to index documents.
- A Logic App workflow that uses the AI Search connector to search indexed documents from an AI Search service and uses the OpenAI connector to answer questions about the documents using natural language.
To use this project, you will need to have an Azure subscription, an OpenAI account, and an AI Search service. You will also need to grant the connectors access to your AI Search service and OpenAI account. You can find detailed instructions on how to set up and run the project in the GitHub repository below.
logicapps/ai-sample at master · Azure/logicapps (github.com)
What’s next?
Try it out!
We’re excited for you to try the new built-in connectors that link your Azure OpenAI and AI Search services to your workflows. We want to hear from you and get your feedback, so please give them a try and let us know your thoughts!
We are constantly evaluating how to enable new scenarios inside of Logic Apps, and we want to hear from you. What AI scenarios are top of mind? Please answer a few questions to help us prioritize the next AI scenarios to enable in Logic Apps - https://aka.ms/raglogicapps