New in Azure API Management: MCP in v2 SKUs + external MCP-compliant server support
Your APIs are becoming tools. Your users are becoming agents. Your platform needs to adapt. Azure API Management is becoming the secure, scalable control plane for connecting agents, tools, and APIs, with governance built in.

Today, we're announcing two major updates that bring the power of the Model Context Protocol (MCP) in Azure API Management to more environments and scenarios:

- MCP support in v2 SKUs, now in public preview
- Expose existing MCP-compliant servers through API Management

These features make it easier than ever to connect APIs and agents with enterprise-grade control, without rewriting your backends.

Why MCP?

MCP is an open protocol that enables AI agents, such as GitHub Copilot, ChatGPT, and Azure OpenAI, to discover and invoke APIs as tools. It turns traditional REST APIs into structured, secure tools that agents can call during execution, powering real-time, context-aware workflows.

Why API Management for MCP?

Azure API Management is the single, secure control plane for exposing and governing MCP capabilities, whether they come from your REST APIs, Azure-hosted services, or external MCP-compliant runtimes. It has built-in support for:

- Security using OAuth 2.1, Microsoft Entra ID, API keys, IP filtering, and rate limiting.
- Outbound token injection via Credential Manager with policy-based routing.
- Monitoring and diagnostics using Azure Monitor, Logs, and Application Insights.
- Discovery and reuse with Azure API Center integration.
- A comprehensive policy engine for request/response transformation, caching, validation, header manipulation, throttling, and more.

With that, you get end-to-end governance for both inbound and outbound agent interactions, with no new infrastructure or code rewrites.

What's New?

1. MCP support in v2 SKUs

Previously available only in the classic tiers (Basic, Standard, Premium), MCP support is now in public preview for the v2 SKUs (Basic v2, Standard v2, and Premium v2), with no prerequisites or manual enablement required. You can now:

- Expose any REST API as an MCP server in v2 SKUs
- Protect it with Microsoft Entra ID, keys, or tokens
- Register tools in Azure API Center

2. Expose existing MCP-compliant servers (pass-through scenario)

Already using tools hosted in Logic Apps, Azure Functions, LangChain, or custom runtimes? Now you can govern those external tool servers by exposing them through API Management. Use API Management to:

- Secure external MCP servers with OAuth, rate limits, and Credential Manager
- Monitor and log usage with Azure Monitor and Application Insights
- Unify discovery with internal tools via Azure API Center

You bring the tools. API Management brings the governance.

What's Next

We're actively expanding MCP capabilities in API Management:

- Tool-level access policies for granular governance
- Support for MCP resources and prompts to expand beyond tools

Get Started

- Expose APIs as MCP servers
- Connect external MCP servers
- Secure access to MCP servers
- Discover tools in API Center

Summary

Azure API Management is your single control plane for agents, tools, and APIs, whether you're building internal copilots or connecting external toolchains. This preview unlocks more flexibility, less friction, and a secure foundation for the next wave of agent-powered applications. No new infrastructure. Secure by default.
Built for the future.
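To make the agent-to-tool flow concrete, here is a minimal sketch of what a tool invocation through an API Management-fronted MCP endpoint could look like. Everything in it is an assumption for illustration: the gateway URL, tool name, and arguments are hypothetical, and a real MCP client would first perform the protocol's initialize handshake (omitted here) before listing or calling tools.

import requests

# Hypothetical APIM-fronted MCP endpoint -- substitute your own gateway URL and API path.
MCP_URL = "https://contoso.azure-api.net/orders-mcp/mcp"

# tools/call is the JSON-RPC 2.0 method MCP defines for invoking a tool;
# the tool name and arguments below are made up for illustration.
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "get_order", "arguments": {"orderId": "42"}},
}

response = requests.post(
    MCP_URL,
    json=payload,
    headers={
        # One of the auth options described above; OAuth 2.1 bearer tokens work too.
        "Ocp-Apim-Subscription-Key": "<your-apim-key>",
        # MCP's streamable HTTP transport may answer with JSON or an SSE stream.
        "Accept": "application/json, text/event-stream",
    },
)
print(response.status_code, response.text)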
Sending Messages to Confluent Cloud topic using Logic App

With Logic Apps, we can create workflows that connect to various services and systems, allowing us to automate tasks and streamline business operations. In this blog post, we will explore how to use Azure Logic Apps to send messages to a Kafka Confluent topic. Currently there is no out-of-the-box Kafka Confluent connector in Logic Apps, but Kafka Confluent provides a REST API (see the Confluent Cloud API Reference Documentation). This sample shows how to use the HTTP action in a workflow to call the Kafka Confluent API that produces records to a topic.

Prerequisites

- An Azure account and access to the Azure portal.
- Access to Confluent Cloud. Confluent Cloud is a fully managed, pay-as-you-go Kafka service; you can get a free trial.

Set up the Kafka cluster and topic

If you are new to Confluent Kafka, see their tutorial: Quick Start for Confluent Cloud | Confluent Documentation.

1. Create a new Kafka cluster on Confluent Cloud.
2. Navigate to the cluster and click Cluster settings. Note the REST endpoint; we will use it throughout this example.
3. Create a new Kafka topic called "LAtest" using the default topic settings.
4. Create a new API key and secret: navigate to the cluster, select Cluster Overview -> API Keys from the left menu, click Create key, and follow the prompts to create a Global access API key. Note down the key and secret. To communicate with the REST API, this API key ID and its secret are combined into the base64-encoded string used in the Authorization header of every REST call. To learn more, see the Authentication section of the Confluent documentation, which describes Cloud and Cluster API keys and base64 encoding.

Create the Logic App workflow

To produce a message to a topic, we provide JSON data and a base64-encoded API key and secret to the REST Produce endpoint: /kafka/v3/clusters/<cluster-id>/topics/<topic-name>/records. Below is a sample REST call (non-streaming mode); a Python equivalent appears at the end of this post:

curl \
  -X POST \
  -H "Content-Type: application/json" \
  -H "Authorization: Basic <base64-encoded API key and secret>" \
  https://xxx.gcp.confluent.cloud:443/kafka/v3/clusters/lkc-mxpx52/topics/LAtest/records \
  -d '{"value":{"type":"JSON","data":"Hello World!"}}'

In the Logic App workflow, we can add a "When a HTTP request is received" trigger to fire the workflow for testing, then add an HTTP action configured like the call above (POST method, the Produce endpoint URL, the Content-Type and Authorization headers, and the JSON record body).

Run the workflow

The body message is sent to the target topic successfully, and it can be viewed on the Messages tab of the LAtest topic in the Confluent Cloud UI.
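If you want to test the Produce endpoint outside of the Logic App, a minimal Python equivalent of the curl call above might look like the sketch below. The endpoint, cluster ID, and topic are the sample values from this post; the API key and secret are placeholders.

import requests

# Sample endpoint from this post -- substitute your own cluster's REST endpoint.
url = "https://xxx.gcp.confluent.cloud:443/kafka/v3/clusters/lkc-mxpx52/topics/LAtest/records"

# requests builds the base64-encoded Basic Authorization header from this tuple,
# which is exactly what the curl example constructs by hand.
response = requests.post(
    url,
    auth=("<api-key>", "<api-secret>"),
    json={"value": {"type": "JSON", "data": "Hello World!"}},
)
print(response.status_code, response.text)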
Deploy Logic App Standard with Application Routing Feature Based on Terraform and Azure Pipeline

Thanks to Terraform's cross-cloud compatibility, automation, and efficient execution, among many other advantages, more and more customers use it to deploy integration solutions based on Logic App Standard. However, although the community and individual contributors have provided many Terraform templates and VNET integration solutions for Logic App Standard, very few templates cover the "Application routing" and "Configuration routing" settings.

This article shares a mature plan for deploying a Logic App Standard and then configuring the mentioned routing features automatically, based on a Terraform template and an Azure DevOps pipeline.

Code reference: https://github.com/serenaliqing/LAStandardTerraformDeployment/tree/main/Terraform-Deployment-Demo

About the Terraform template

The template is in the directory Terraform/LAStandard.tf. It includes the Terraform definitions for the Logic App Standard, the backend storage account, Application Insights, the virtual network, and the VNET integration settings.

About the VNET routing configuration

Because no Terraform examples are available for VNET routing, we add the VNET settings by sending a PATCH request to the ARM REST API endpoint for the Logic App Standard site:

https://management.azure.com/subscriptions/<Your subscription id>/resourceGroups/$(deployRG)/providers/Microsoft.Web/sites/$(deployLA)?api-version=2022-03-01

We captured the required request body in a network trace; it has the following format:

{
  "properties": {
    "vnetContentShareEnabled": false,
    "vnetImagePullEnabled": true,
    "vnetRouteAllEnabled": false,
    "vnetBackupRestoreEnabled": false
  }
}

The pipeline is defined in TerraformPipeline/logicappstandard-terraform.yml. Within the YAML file, an "AzureCLI@2" task sends the PATCH request via an Azure CLI command (a standalone sketch of this request follows below).

Special tip: to use the Terraform tasks during an Azure pipeline run, you must install the Terraform extension: https://marketplace.visualstudio.com/items?itemName=ms-devlabs.custom-terraform-tasks

References:
- Deploy Logic App Standard with Terraform and Azure DevOps pipelines
- https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/app_service
- https://azure.microsoft.com/en-us/products/devops/pipelines
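For readers who want to try the routing PATCH outside of the pipeline, here is a rough Python sketch of the same request. It is an illustration under assumptions: the subscription, resource group, and site values are placeholders, and the bearer token must come from your own sign-in (inside the AzureCLI@2 task, for example, `az account get-access-token` can supply one).

import requests

# Placeholders -- in the pipeline these come from the $(deployRG) and $(deployLA) variables.
subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
site_name = "<logic-app-name>"

url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}/providers/Microsoft.Web/sites/{site_name}"
    "?api-version=2022-03-01"
)

# Request body captured from the network trace above.
body = {
    "properties": {
        "vnetContentShareEnabled": False,
        "vnetImagePullEnabled": True,
        "vnetRouteAllEnabled": False,
        "vnetBackupRestoreEnabled": False,
    }
}

token = "<arm-access-token>"  # e.g. obtained via `az account get-access-token`

response = requests.patch(url, json=body, headers={"Authorization": f"Bearer {token}"})
print(response.status_code)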
Unable to attach binary files for Azure DevOps REST API

I was trying to upload binary files using the Azure DevOps REST API.
Reference: https://docs.microsoft.com/en-us/rest/api/azure/devops/wit/attachments/create?view=azure-devops-rest-6.0#upload-a-binary-file

I was trying to upload "ATTACHMENT_TEST.zip" (ref: https://drive.google.com/file/d/15Y3IS0BWoCaMo7kjt6t1ZCdHNH_PGT65/view?usp=sharing). I converted ATTACHMENT_TEST.zip to base64:

UEsDBBQAAAAAALZIcVSXxhNiBAAAAAQAAAAIAAAAVEVTVC50eHRTSUREUEsBAhQAFAAAAAAAtkhxVJfGE2IEAAAABAAAAAgAAAAAAAAAAQAgAAAAAAAAAFRFU1QudHh0UEsFBgAAAAABAAEANgAAACoAAAAAAA==

and tried to add the base64 as JSON in the payload, but the URL produced by the output gives me an invalid zip.

Code:

import requests
import json

url = "https://dev.azure.com/{Organization}/{ProjectName}/_apis/wit/attachments?uploadType=Simple&api-version=6.0&fileName=app.zip"

payload = json.dumps("[UEsDBBQAAAAAALZIcVSXxhNiBAAAAAQAAAAIAAAAVEVTVC50eHRTSUREUEsBAhQAFAAAAAAAtkhxVJfGE2IEAAAABAAAAAgAAAAAAAAAAQAgAAAAAAAAAFRFU1QudHh0UEsFBgAAAAABAAEANgAAACoAAAAAAA==]")

headers = {
    'Content-Type': 'application/json',
    'Authorization': 'Basic $AuthKey',
    'Cookie': 'VstsSession=%7B%22PersistentSessionId%22%3A%22fe6c3302-6671-4bfc-9cbe-0d33f145a31f%22%2C%22PendingAuthenticationSessionId%22%3A%2200000000-0000-0000-0000-000000000000%22%2C%22CurrentAuthenticationSessionId%22%3A%2200000000-0000-0000-0000-000000000000%22%2C%22SignInState%22%3A%7B%7D%7D'
}

response = requests.request("POST", url, headers=headers, data=payload)
print(response.text)
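One likely cause, going by the attachments documentation linked in the question: the endpoint expects the raw binary bytes in the request body with Content-Type: application/octet-stream, not a base64 string wrapped in JSON. A hedged sketch of that variant, reusing the question's URL and credentials:

import requests

url = ("https://dev.azure.com/{Organization}/{ProjectName}/_apis/wit/attachments"
       "?uploadType=Simple&api-version=6.0&fileName=app.zip")

# Send the zip file's raw bytes -- no base64 encoding and no JSON wrapping.
with open("ATTACHMENT_TEST.zip", "rb") as f:
    data = f.read()

response = requests.post(
    url,
    headers={
        "Content-Type": "application/octet-stream",
        "Authorization": "Basic $AuthKey",  # same credentials as above
    },
    data=data,
)
print(response.text)  # on success, the response contains the attachment id and url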
Getting News articles using REST API

Hello, we have a newsroom site where we display all news articles. I need to create a REST API call to read this information and display it in our React-based app. I am unable to find a URL that provides this information. I have tried the following URLs, but I am not getting the desired results:

https://<tenant>.sharepoint.com/sites/iNewsroom/_api/Web/Lists/getByTitle('Site%20Pages')/items?$select=*
https://intermountainhealth.sharepoint.com/_api/search/query?querytext=%27IsDocument:True%20AND%20FileExtension:aspx%20AND%20PromotedState:2%27
https://<tenant>.sharepoint.com/search/_api/search/query?querytext=%27IsDocument:True%20AND%20FileExtension:aspx%20AND%20PromotedState:2%27

Is there any other URL I can try to get the information? The newsroom site has a few site p…
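One more pattern worth trying (an assumption, not something from the original post): news posts are Site Pages whose PromotedState column is 2, so filtering the Site Pages list on that column may return only the news articles. A sketch with authentication left as a placeholder:

import requests

# News posts live in the Site Pages library with PromotedState = 2.
url = ("https://<tenant>.sharepoint.com/sites/iNewsroom"
       "/_api/web/lists/getbytitle('Site%20Pages')/items")
params = {
    "$select": "Title,FileRef,FirstPublishedDate",
    "$filter": "PromotedState eq 2",
}

response = requests.get(
    url,
    params=params,
    headers={
        "Accept": "application/json;odata=nometadata",
        "Authorization": "Bearer <token>",  # acquire via Azure AD / MSAL; details omitted
    },
)
for item in response.json().get("value", []):
    print(item["Title"], item["FileRef"])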
Query web API and return JSON data

curl -X GET "https://api.server.com/v1/markets/quotes?symbols=AAPL,VXX190517P00016000&greeks=false" \
  -H 'Authorization: Bearer <TOKEN>' \
  -H 'Accept: application/json'

How do I run this REST JSON API call directly in SQL Server? I believe it is a combination of the OLE Automation procedures below, but I could not figure out the final syntax:

sp_OACreate
sp_OAMethod
sp_OAGetProperty

The Python version is here:

# Version 3.6.1
import requests

response = requests.get(
    'https://api.server.com/v1/markets/quotes',
    params={'symbols': 'AAPL,VXX190517P00016000', 'greeks': 'false'},
    headers={'Authorization': 'Bearer <TOKEN>', 'Accept': 'application/json'}
)
json_response = response.json()
print(response.status_code)
print(json_response)
DevSum Special: Skriv inte kod som "Legenden Leo" - Distributed Systems - Season 3, Ep. 42

In this week's episode, recorded live at the DevSum conference (https://devsum.se), we are delighted to welcome Dylan Beattie! Dylan has an impressive track record, with experience ranging from speaking at hundreds of conferences to teaching complex architecture courses. Perhaps the most unique touch, though, is his creative reinterpretation of the song "We didn't start the fire", reworked to be about JavaScript frameworks! In this episode we touch on everything from Dylan's view of YouTube comments to the human ego. With a hint of IT history, we dive deep into topics such as distributed systems and event-based architecture, and explore how team structure can affect workflows. But that's not all: we also look at many other exciting topics, covering everything from REST, SOAP, and gRPC to telemetry and the legend Leo (Legenden Leo - Kungliga slotten). So sit back and enjoy a discussion that offers both insight and entertainment, whether you are an experienced developer or just have a general interest in the tech world. Don't miss this episode! Listen to the episode here.