
Azure Integration Services Blog

🤖 AI Procurement assistant using prompt templates in Standard Logic Apps

shahparth (Microsoft)
Apr 15, 2025

📘 Introduction

Answering procurement-related questions doesn't have to be a manual process. With the new Chat Completions using Prompt Template action in Logic Apps (Standard), you can build an AI-powered assistant that understands context, reads structured data, and responds like a knowledgeable teammate.

🏢 Scenario: AI assistant for IT procurement

Imagine an employee wants to know:

"When did we last order laptops for new hires in IT?"

Instead of forwarding this to the procurement team, a Logic App can:

  • Accept the question
  • Look up catalog details and past orders
  • Pass all the info to a prompt template
  • Generate a polished, AI-powered response

🧠 What Are Prompt Templates?

Prompt Templates are reusable text templates that use Jinja2 syntax to dynamically inject data at runtime.

In Logic Apps, this means you can:

  • Define a prompt with placeholders like {{ customer.orders }}, as in the short example below
  • Automatically populate it with outputs from earlier actions
  • Generate consistent, structured prompts with minimal effort
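
For instance, a minimal template using the same customer shape used later in this post might look like this:

Hello {{ customer.firstName }}, here are your recent orders:
{% for item in customer.orders %}
- {{ item.name }} ({{ item.date }})
{% endfor %}

At runtime, Logic Apps substitutes the customer variable with the output of an earlier action, so with the sample data in this walkthrough the rendered text would read something like:

Hello Alex, here are your recent orders:
- Dell Latitude 5540 Laptop (2024/02/20)
- Docking Station (2024/01/10)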

✨ Benefits of Using Prompt Templates in Logic Apps

  • Consistency: Centralized prompt logic instead of embedding prompt strings in each action.
  • Reusability: Easily apply the same prompt across multiple workflows.
  • Maintainability: Tweak prompt logic in one place without editing the entire flow.
  • Dynamic control: Logic Apps inputs (e.g., values from a form, database, or API) flow right into the template.

This allows you to create powerful, adaptable AI-driven flows without duplicating effort — making it perfect for scalable enterprise automation.

💡 Try it Yourself

Grab the sample prompt template and sample inputs from our GitHub repo and follow along.

👉 Sample logic app

🧰 Prerequisites

To get started, make sure you have:

  • A Logic App (Standard) resource in Azure
  • An Azure OpenAI resource with a deployed GPT model (e.g., GPT-3.5 or GPT-4)

💡 You’ll configure your OpenAI API connection during the workflow setup.

🔧 Build the Logic App workflow

Here’s how to build the flow in Logic Apps using the Prompt Template action. This setup assumes you're simulating procurement data with test inputs.

📌 Step 0: Start by creating a Stateful Workflow in your Logic App (Standard) resource.

  • Choose "Stateful" when prompted during workflow creation.
  • This allows the run history and variables to be preserved for testing.

📸 Creating a new Stateful Logic App (Standard) workflow
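
If you prefer code view, the workflow's kind is captured in its workflow.json. A minimal sketch of an empty Stateful workflow (the trigger and action bodies get filled in by the steps below) looks roughly like this:

{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "contentVersion": "1.0.0.0",
    "triggers": {},
    "actions": {},
    "outputs": {}
  },
  "kind": "Stateful"
}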


📌 Trigger: "When an HTTP request is received"
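
In code view, the Request trigger is represented roughly as shown below (the trigger name follows the designer's default, and no request schema is needed here because the test data comes from Compose actions rather than the request body):

"triggers": {
  "When_an_HTTP_request_is_received": {
    "type": "Request",
    "kind": "Http",
    "inputs": {}
  }
}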

📌 Step 1: Add three Compose actions to store your test data.

  • documents: This stores your internal product catalog entries.

[
  {
    "id": "1",
    "title": "Dell Latitude 5540 Laptop",
    "content": "Intel i7, 16GB RAM, 512GB SSD, standard issue for IT new hire onboarding"
  },
  {
    "id": "2",
    "title": "Docking Station",
    "content": "Dell WD19S docking stations for dual monitor setup"
  }
]

 

📸 Compose action for documents input 

  • question: This holds the employee’s natural language question.

[ { "role": "user", "content": "When did we last order laptops for new hires in IT?" } ]

📸 Compose action for question input

 

 

  • customer: This includes the employee's profile and past procurement orders.

{
  "firstName": "Alex",
  "lastName": "Taylor",
  "department": "IT",
  "employeeId": "E12345",
  "orders": [
    {
      "name": "Dell Latitude 5540 Laptop",
      "description": "Ordered 15 units for Q1 IT onboarding",
      "date": "2024/02/20"
    },
    {
      "name": "Docking Station",
      "description": "Bulk purchase of 20 Dell WD19S docking stations",
      "date": "2024/01/10"
    }
  ]
}

📸 Compose action for customer input
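
For reference, each of these is just a named Compose action with the JSON above as its inputs. A trimmed code-view sketch (assuming the actions keep the names documents, question, and customer; they are shown chained with runAfter here, though the order of the three Compose actions doesn't actually matter):

"actions": {
  "documents": {
    "type": "Compose",
    "inputs": [ { "id": "1", "title": "Dell Latitude 5540 Laptop", "content": "..." } ],
    "runAfter": {}
  },
  "question": {
    "type": "Compose",
    "inputs": [ { "role": "user", "content": "When did we last order laptops for new hires in IT?" } ],
    "runAfter": { "documents": [ "Succeeded" ] }
  },
  "customer": {
    "type": "Compose",
    "inputs": { "firstName": "Alex", "lastName": "Taylor", "orders": [ "..." ] },
    "runAfter": { "question": [ "Succeeded" ] }
  }
}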

📌 Step 2: Add the "Chat Completions using Prompt Template" action
📸 OpenAI connector view 

💡Tip: Always prefer the in-app connector (built-in) over the managed version when choosing the Azure OpenAI operation. Built-in connectors allow better control over authentication and reduce latency by running natively inside the Logic App runtime.

📌 Step 3: Connect to Azure OpenAI
Navigate to your Azure OpenAI resource and open Keys and Endpoint to get the values you need for key-based authentication.

📸 Create Azure OpenAI connection 
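
Rather than pasting the key directly into the connection dialog, you can also keep the endpoint and key in app settings and reference them when creating the connection. A minimal sketch of the relevant entries in local.settings.json for local development (the setting names AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_KEY are just illustrative; use whatever names your connection references):

{
  "IsEncrypted": false,
  "Values": {
    "AZURE_OPENAI_ENDPOINT": "https://<your-openai-resource>.openai.azure.com/",
    "AZURE_OPENAI_KEY": "<key copied from Keys and Endpoint>"
  }
}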



📝 Prompt template: Building the message for chat completions

Once you've added the Get chat completions using Prompt Template action, here's how to set it up:

1. Deployment Identifier

Enter the name of your deployed Azure OpenAI model here (e.g., gpt-4o).

📌 This should match exactly with what you configured in your Azure OpenAI resource.

2. Prompt Template

This is the structured instruction that the model will use.

Here’s the full template used in the action — note that the variable names exactly match the Compose action names in your Logic App: documents, question, and customer.

system:
You are an AI assistant for Contoso's internal procurement team. You help employees get quick answers about previous orders and product catalog details. Be brief, professional, and use markdown formatting when appropriate. Include the employee's name in your response for a personal touch.

# Product Catalog
Use this documentation to guide your response. Include specific item names and any relevant descriptions.
{% for item in documents %}
Catalog Item ID: {{item.id}}
Name: {{item.title}}
Description: {{item.content}}
{% endfor %}

# Order History
Here is the employee's procurement history to use as context when answering their question.
{% for item in customer.orders %}
Order Item: {{item.name}}
Details: {{item.description}} — Ordered on {{item.date}}
{% endfor %}

# Employee Info
Name: {{customer.firstName}} {{customer.lastName}}
Department: {{customer.department}}
Employee ID: {{customer.employeeId}}

# Question
The employee has asked the following:
{% for item in question %}
{{item.role}}: {{item.content}}
{% endfor %}

Based on the product documentation and order history above, please provide a concise and helpful answer to their question. Do not fabricate information beyond the provided inputs.

 

📸 Prompt template action view

 

3. Add your prompt template variables

Scroll down to Advanced parameters → switch the dropdown to Prompt Template Variable.

Then:

  • Add a new item for each Compose action and reference it dynamically from previous outputs (a sketch of the resulting mapping follows the screenshot below):
    • documents
    • question
    • customer

📸 Prompt template variable references
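
Behind the scenes, each prompt template variable simply points at the output of the matching Compose action. Conceptually, the mapping resolves to Logic Apps expressions like these (a sketch, assuming the Compose actions keep the names used above):

{
  "documents": "@outputs('documents')",
  "question": "@outputs('question')",
  "customer": "@outputs('customer')"
}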

🔍 How the template works

  • {{ customer.firstName }} {{ customer.lastName }}: displays the employee's name
  • {{ customer.department }}: adds department context
  • {% for item in question %} ... {% endfor %}: injects the user's question from the Compose action named question
  • {% for item in documents %} ... {% endfor %}: loops through catalog data from the Compose action named documents
  • {% for item in customer.orders %} ... {% endfor %}: loops through the employee's order history from customer

Each of these values is dynamically pulled from your Logic App Compose actions — no code, no external services needed. You can apply the exact same approach to reference data from any connector, like a SharePoint list, SQL row, email body, or even AI Search results. Just map those outputs into the Prompt Template and let Logic Apps do the rest.
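
To make the substitution concrete, here is roughly what the prompt looks like once the sample documents, customer, and question values above are rendered into the template (trimmed for brevity):

system:
You are an AI assistant for Contoso's internal procurement team. ...

# Product Catalog
Catalog Item ID: 1
Name: Dell Latitude 5540 Laptop
Description: Intel i7, 16GB RAM, 512GB SSD, standard issue for IT new hire onboarding
Catalog Item ID: 2
Name: Docking Station
Description: Dell WD19S docking stations for dual monitor setup

# Order History
Order Item: Dell Latitude 5540 Laptop
Details: Ordered 15 units for Q1 IT onboarding — Ordered on 2024/02/20
Order Item: Docking Station
Details: Bulk purchase of 20 Dell WD19S docking stations — Ordered on 2024/01/10

# Employee Info
Name: Alex Taylor
Department: IT
Employee ID: E12345

# Question
user: When did we last order laptops for new hires in IT?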

✅ Final Output

When you run the flow, the model might respond with something like:

"The last order for Dell Latitude 5540 laptops was placed on February 20, 2024 — 15 units were procured for IT new hire onboarding."

This is based entirely on the structured context passed in through your Logic App — no extra fine-tuning required.

📸 Output from run history
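
Because the workflow starts with an HTTP Request trigger, you could also return the answer to the caller by adding a Response action after the chat completions step. A minimal sketch in code view (the action name and the body expression are illustrative; the exact path to the completion text depends on the action's output shape):

"Response": {
  "type": "Response",
  "kind": "Http",
  "inputs": {
    "statusCode": 200,
    "body": "@body('Get_chat_completions_using_Prompt_Template')"
  },
  "runAfter": {
    "Get_chat_completions_using_Prompt_Template": [ "Succeeded" ]
  }
}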


💬 Feedback

Let us know what other kinds of demos and content you would like to see using this form.

 

Updated Apr 16, 2025
Version 5.0
  • mikeholdorf

    I like how this is set up, but I'm having trouble getting it to work. The first time I attempted to run it, OpenAI returned: The message list in input should contain atleast one message with role 'user'.

    I added this to the Prompt Template and it now runs, but it just states that none of my data can be found. I am using gpt-4 for this request, so I cannot figure out what I am doing incorrectly. Is there a sample you can provide in GitHub so we can get this working and test? Thanks.

    • shahparth (Microsoft)

      Thanks mikeholdorf, I have added a GitHub sample at the beginning of the blog which you can try out. Let me know if you still face any issues getting it to work.

      • mikeholdorf

        Somehow I copied in some special characters. I used your sample and got it working. Thanks for following up!