Latest Discussions
Azure function app to read files from SMB mounted file share
How can I programmatically connect an Azure Function App to multiple (50+) SMB-mounted Azure File Shares that use the same credentials, given that Logic Apps aren't suitable due to their static connection requirements?
Posted by velmars, Jul 25, 2025
Issue with Custom Domain on APIM and Cloudflare Proxying
Dear all, last week we attempted to configure a custom domain name for our Azure API Management (APIM) instance. We use Cloudflare as our DNS provider. The required CNAME record was created with the proxied attribute enabled. However, when configuring the custom hostname in Azure, we encountered the following error: Invalid parameter: CustomHostnameOwnershipCheckFailed. A CNAME record pointing from apim.ourowndomain.net to apim.azure-api.net was not found.
As a workaround, we disabled the proxied attribute in Cloudflare, retried the configuration, and it worked successfully. We then re-enabled the proxied attribute, and the custom domain continued to function correctly.
However, yesterday we discovered that the custom domain was no longer working and returned a "404 Web site not found" error page. After extensive troubleshooting, including disabling the proxied attribute on the CNAME record, we were unable to resolve the issue. To restore functionality, we removed and reconfigured the custom domain by following the same steps:
1. Disable the proxied attribute on the CNAME record.
2. Configure the custom domain in APIM.
3. Re-enable the proxied attribute.
This resolved the issue again. We suspect that Azure initially validates the CNAME record during the custom domain configuration process when the proxied attribute is disabled. However, after a few days, Azure appears to revalidate the CNAME record and expects it to resolve to *.azure-api.net. Since Cloudflare returns its own IPs when proxying is enabled, Azure may reject the custom domain configuration, leading to the issue.
Can anyone confirm whether our assumption is correct? Additionally, is there a recommended workaround for this issue? We are considering deploying a reverse proxy (Application Gateway) to handle Cloudflare requests and forward them to the APIM instance. Thank you in advance for your help. Best regards,
Posted by mkg310, Mar 21, 2025
Debug your APIs using request tracing
We are leveraging Azure API Management's tracing capabilities to monitor and log incoming traffic. The primary goal is to track traffic in APIM and attribute it to specific client applications by identifying the appid from JWT tokens included in requests. Additionally, we aim to ensure that trace logs are correctly sent to Log Analytics for debugging and further analysis. To achieve this, we implemented a test policy in a GET method of a cloned API within APIM. The policy is as follows:

<policies>
    <inbound>
        <base />
        <trace source="InboundTrace" severity="verbose">
            <message>Inbound processing started</message>
            <metadata name="User-Agent" value="@(context.Request.Headers.GetValueOrDefault("User-Agent", "unknown"))" />
        </trace>
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <trace source="OutboundTrace" severity="verbose">
            <message>Outbound response being sent</message>
            <metadata name="ResponseCode" value="@(context.Response.StatusCode.ToString())" />
        </trace>
    </outbound>
    <on-error>
        <base />
        <trace source="ErrorTrace" severity="error">
            <message>Error encountered</message>
            <metadata name="ErrorDetails" value="@(context.LastError.Message)" />
        </trace>
    </on-error>
</policies>

This approach aims to ensure the appid appears in the TraceRecords attribute of ApiManagementGatewayLogs, enabling us to identify which client applications are consuming specific APIs.

Challenges faced:
1. Trace logs: Trace logs are not appearing in Log Analytics, despite being configured in diagnostics. Using the queries suggested in the documentation, we could not find the TraceRecords field or the metadata added by the trace policy. We are unsure if the policy is being correctly applied or if additional configurations are needed.
2. Traffic attribution: While traffic is traceable, attributing requests to client applications without the appid is challenging. We want to confirm whether the approach to extract and log the appid aligns with best practices and whether there are more efficient alternatives.

Questions:
1. Are there additional configurations needed to ensure trace logs are correctly sent to Log Analytics?
2. Could you provide more detailed examples of KQL queries to check the records generated by the trace policy?
3. Does the proposed approach for extracting and logging appid align with best practices in APIM?
4. Are there any limitations or performance considerations when modifying global policies for this purpose?

References followed:
- Debug APIs in Azure API Management
- Trace Policy Documentation
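Editor's note: one practical way to check whether the trace policy output is reaching the workspace is to query it programmatically. The sketch below is a hedged example and not part of the original post; the workspace ID is a placeholder, it assumes the APIM diagnostic setting routes GatewayLogs to that Log Analytics workspace with verbosity set to verbose, and it uses the Azure.Monitor.Query and Azure.Identity packages.

using Azure;
using Azure.Identity;
using Azure.Monitor.Query;
using Azure.Monitor.Query.Models;

// Query the Log Analytics workspace that the APIM diagnostic setting writes to.
var client = new LogsQueryClient(new DefaultAzureCredential());
string workspaceId = "<log-analytics-workspace-id>"; // placeholder

// Look for gateway log entries that carry trace policy output.
string kql = @"ApiManagementGatewayLogs
| where isnotempty(TraceRecords)
| project TimeGenerated, ApiId, OperationId, TraceRecords
| take 50";

Response<LogsQueryResult> result = await client.QueryWorkspaceAsync(
    workspaceId, kql, new QueryTimeRange(TimeSpan.FromHours(24)));

foreach (LogsTableRow row in result.Value.Table.Rows)
{
    Console.WriteLine($"{row["TimeGenerated"]} | {row["ApiId"]} | {row["TraceRecords"]}");
}

If this query returns nothing while requests are flowing, the policy is most likely fine and the gap is in the diagnostic settings (GatewayLogs category not enabled, or verbosity set higher than the trace severity).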
Azure API Management Gateway - RBAC on the API level
Is it possible to grant access at the level of specific API implementations, so that users can see some APIs but not others inside the same Azure API Management gateway? For example: User1 can manage the green ones, but not the red ones. Thanks.
Posted by mkg310, Nov 21, 2024
API Guide: Resubmitting from a specific Action in Logic Apps Standard
In collaboration with Sofia Hubendick. This how-to article explains the process of resubmitting a Logic App Standard workflow from a specific action via API. If you want to resubmit the workflow from the beginning, you can use the Workflow Run Histories - Resubmit operation instead: https://learn.microsoft.com/sv-se/rest/api/appservice/workflow-trigger-histories/resubmit?view=rest-appservice-2024-04-01&tabs=HTTP

Authentication
I used a managed identity for authentication, which simplifies the process by eliminating the need to obtain a token manually. Additionally, I implemented the new Logic App Standard Operator role.

URL
The URL for resubmitting an action looks like this:
https://management.azure.com/subscriptions/[subscriptionId]/resourceGroups/[resourceGroupName]/providers/Microsoft.Web/sites/[logicAppName]/hostruntime/runtime/webhooks/workflow/api/management/workflows/[workflowName]/runs/[runId]/resubmit?api-version=2022-03-01

Mandatory URL path parameters:
- subscriptionId: The Azure subscription ID
- resourceGroupName: The name of the resource group containing the Logic App
- logicAppName: The name of the Logic App
- workflowName: The name of the workflow
- runId: The ID of the workflow run to be resubmitted

Request Body
The API request body is structured as follows; replace the placeholder with the name of the action:
{
  "actionsToResubmit": [
    {
      "name": "[action name]"
    }
  ]
}

Response
- 202 Accepted: OK
- Other status codes: Error response describing why the operation failed.
Posted by andevjen, Oct 14, 2024
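Editor's note: putting the pieces above together, here is a hedged C# sketch of the call, not taken from the original article. It assumes the identity running the code holds the Logic App Standard Operator role mentioned above, and every bracketed value is a placeholder to replace with your own.

using Azure.Core;
using Azure.Identity;
using System.Net.Http.Headers;
using System.Text;

// Get an ARM token; in Azure this resolves to the managed identity,
// locally it falls back to az login / Visual Studio credentials.
var credential = new DefaultAzureCredential();
AccessToken token = await credential.GetTokenAsync(
    new TokenRequestContext(new[] { "https://management.azure.com/.default" }));

// All bracketed segments are placeholders from the URL template above.
string url = "https://management.azure.com/subscriptions/[subscriptionId]/resourceGroups/[resourceGroupName]"
    + "/providers/Microsoft.Web/sites/[logicAppName]/hostruntime/runtime/webhooks/workflow"
    + "/api/management/workflows/[workflowName]/runs/[runId]/resubmit?api-version=2022-03-01";

string body = "{ \"actionsToResubmit\": [ { \"name\": \"[action name]\" } ] }";

using var http = new HttpClient();
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token.Token);
var response = await http.PostAsync(url, new StringContent(body, Encoding.UTF8, "application/json"));
Console.WriteLine((int)response.StatusCode); // expect 202 Accepted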
Integration with SuccessFactors
Hi Community, we have SuccessFactors and we have four different Azure tenants. SuccessFactors can only connect to one of these (it's for the SuccessFactors Recruitment integration to Outlook). Does anyone know a way to connect the three other tenants to our main company one, so that SuccessFactors can connect to our main tenant and still see and post to Outlook mailboxes that exist in the other three?
Posted by pallen330, Sep 18, 2024
Semantic Kernel: Develop your AI Integrated Web App on Azure and .NET 8.0
How to create a Smart Career Advice and Job Search Engine with Semantic Kernel

The concept
The Rise of Semantic Kernel. Semantic Kernel, an open-source development kit, has taken the .NET community by storm. With support for C#, Python, and Java, it seamlessly integrates with .NET services and applications. But what makes it truly remarkable? Let's dive into the details.
A Perfect Match: Semantic Kernel and .NET. Picture this: you're building a web app, and you want to infuse it with AI magic. Enter Semantic Kernel. It's like the secret sauce that binds your .NET services and AI capabilities into a harmonious blend. Whether you're a seasoned developer or just dipping your toes into AI waters, Semantic Kernel simplifies the process. As part of the Semantic Kernel community, I've witnessed its evolution firsthand. The collaborative spirit, the shared knowledge: it's electrifying! We're not just building software; we're shaping the future of AI-driven web applications.

The Web App
Our initial plan was simple: create a job recommendations engine. But Semantic Kernel had other ideas. It took us on an exhilarating ride. Now, our web application not only suggests career paths but also taps into third-party APIs to fetch relevant job listings. And that's not all: it even crafts personalized skilling plans and preps candidates for interviews. Talk about exceeding expectations!

Build
Since I have already created the repository on GitHub, I don't think it is critical to repost the Terraform files here. We build our main infrastructure with Terraform and also invoke an Azure CLI script to automate the container image build and push. We will have these resources at the end. Before deployment, make sure to assign the service principal the "RBAC Administrator" role and narrow down the assignments to AcrPull and AcrPush, so you can create a User Assigned Managed Identity with these roles. Since we are building and pushing the container images with local-exec and Az CLI scripts within Terraform, you will notice some explicit dependencies, so we can make sure everything builds in order. It is really amazing that we can build all the infrastructure, including the apps, with Terraform!

Architecture
Upon completion you will have a functioning React web app with an ASP.NET Core Web API, utilizing Semantic Kernel and an external job listings API, to get advice, find jobs, and get a skilling plan for a specific recommended role! The following is a reference architecture. Aside from the Private Endpoints, the same deployment is available on GitHub.

Kernel SDK
The SDK provides a simple yet powerful set of commands to configure and "set" the Semantic Kernel characteristics. Let's look at the first endpoint, where users ask for recommended career paths (a minimal registration sketch for the injected chat completion service follows the Lessons Learned section below):

[HttpPost("get-recommendations")]
public async Task<IActionResult> GetRecommendations([FromBody] UserInput userInput)
{
    _logger.LogInformation("Received user input: {Skills}, {Interests}, {Experience}",
        userInput.Skills, userInput.Interests, userInput.Experience);

    var query = $"I have the following skills: {userInput.Skills}. " +
                $"My interests are: {userInput.Interests}. " +
                $"My experience includes: {userInput.Experience}. " +
                "Based on this information, what career paths would you recommend for me?";

    var history = new ChatHistory();
    history.AddUserMessage(query);

    ChatMessageContent? result = await _chatCompletionService.GetChatMessageContentAsync(history);
    if (result == null)
    {
        _logger.LogError("Received null result from the chat completion service.");
        return StatusCode(500, "Error processing your request.");
    }

    string content = result.Content;
    _logger.LogInformation("Received content: {Content}", content);

    var recommendations = ParseRecommendations(content);
    _logger.LogInformation("Returning recommendations: {Count}", recommendations.Count);
    return Ok(new { recommendations });
}

The actual data flow is depicted below, and we can see the interaction with the local endpoints and the external endpoint as well. The user provides skills, interests, experience, and the level of their current position, and the API sends the payload to Semantic Kernel with a constructed prompt asking for position recommendations. The recommendations return with clickable buttons: one to find relevant positions from LinkedIn listings using the external API, and another to ask the Semantic Kernel again for skill-up advice!
The UI experience: Recommendations, Skill Up Plan, Job Listings.
The project can be extended to a point of automation and AI integration where users can upload their CVs and ask the Semantic Kernel to provide feedback as well as apply for a specific position! As we discussed earlier, some additional optimizations are good to have, like Private Endpoints, Azure Front Door and/or Azure Firewall, but the point is to see Semantic Kernel in action with its amazing capabilities, especially when used within the .NET SDK.
Important note: this could have been a one-shot deployment, but we cannot add the custom domain with Terraform (unless we use Azure DNS), nor the CORS settings. So we have to add these details for our solution to function properly! Once Terraform completes, add the custom domains to both Container Apps. The advantage here is that we will know the frontend and backend FQDNs, since we decide the domain name, and the React environment value is preconfigured with the backend URL. The same goes for the backend: we have set the frontend URL as the environment value for ALLOWED_ORIGINS. So we can just go to Custom Domain on each app and add the domain names after selecting the certificate, which will already be there, since we uploaded it via Terraform!

Lessons Learned
This was a real adventure, and I want to share with you important lessons learned and hopefully save you some time and effort.
- Prepare ahead with a certificate. I was having problems from the get-go, with ASP.NET refusing to build on containers, until I integrated the certificate. Local development works fine without it.
- Cross-origin is very important; do not underestimate it! Configure it correctly. In this example I went directly to custom domains, so I can have better overall control.
- This solution worked both on Azure Web Apps and Azure Container Apps. The GitHub repo has the Container Apps solution, but you can go with Web Apps.
- Don't waste your time trying to go with Dapr. React does not 'react' well with the Dapr client, and my lesson learned here is that Dapr is made for same-framework invocation, or you are going to need a middleware.
- Since we cannot create the custom domain with Terraform, there are solutions we can use, like AzApi.
- We utilized a small portion of what Semantic Kernel can really do, and I stopped when I realized that this project would never end if I kept pursuing ideas! It is much better to have it on GitHub, and we can probably come back and add some more features!
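One more practical note: the GetRecommendations endpoint shown earlier assumes an IChatCompletionService has been registered in the ASP.NET Core container at startup. The following is a minimal, hypothetical registration sketch, not code from the repository; the deployment name, endpoint, and key are placeholders for your own Azure OpenAI resource.

using Microsoft.SemanticKernel;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers();

// Register the kernel and an Azure OpenAI chat completion service so that
// IChatCompletionService can be injected into the controller.
builder.Services.AddKernel();
builder.Services.AddAzureOpenAIChatCompletion(
    deploymentName: "gpt-4o",                                   // placeholder deployment name
    endpoint: builder.Configuration["AzureOpenAI:Endpoint"]!,   // e.g. https://<resource>.openai.azure.com/
    apiKey: builder.Configuration["AzureOpenAI:ApiKey"]!);

var app = builder.Build();
app.MapControllers();
app.Run();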
Conclusion
In this journey through the intersection of technology and career guidance, we've explored the powerful capabilities of Azure Container Apps and the transformative potential of Semantic Kernel, Microsoft's open-source development kit. By seamlessly integrating AI into .NET applications, Semantic Kernel has not only simplified the development process but also opened new doors for innovation in career advice. Our adventure began with a simple idea: creating a job recommendations engine. However, with the help of Semantic Kernel, this idea evolved into a sophisticated web application that goes beyond recommendations. It connects to third-party APIs, crafts personalized skilling plans, and prepares candidates for interviews, demonstrating the true power of AI-driven solutions. By leveraging Terraform for infrastructure management and Azure CLI for automating container builds, we successfully deployed a robust architecture that includes a React web app, an ASP.NET Core Web API, and integrated AI services. This project highlights the ease and efficiency of building and deploying cloud-based applications with modern tools. The code is available on GitHub for you to explore, contribute to, and extend as much as you want!
GitHub repo: Semantic Kernel - Career Advice
Links / References:
- Intro to Semantic Kernel
- Understanding the kernel
- Chat completion
- Deep dive into Semantic Kernel
- Azure Container Apps documentation
Kernel Memory - Retrieval Augmented Generation (RAG) using Azure Open AI
Hello Community, I am seeking guidance here. I am looking at Kernel Memory for Retrieval Augmented Generation (RAG) using Azure OpenAI, so it can read a file into kernel memory, I can ask questions, and it can answer based on that memory. I want to use .NET Core for the implementation. I have referred to the repository below, but I did not find configuration related to Azure OpenAI: https://github.com/microsoft/kernel-memory/tree/main
Posted by Bhavin163884, Jun 27, 2024
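Editor's note: as a rough, hedged sketch of what the asked-for configuration might look like with the Kernel Memory .NET package in serverless mode. The endpoint, API key, and deployment names below are placeholders, and exact property names may differ between package versions; check the repository samples before relying on it.

using Microsoft.KernelMemory;

// Placeholder Azure OpenAI settings - replace with your own resource values.
var azureOpenAIText = new AzureOpenAIConfig
{
    Auth = AzureOpenAIConfig.AuthTypes.APIKey,
    Endpoint = "https://<your-resource>.openai.azure.com/",
    APIKey = "<api-key>",
    Deployment = "<chat-deployment-name>"
};
var azureOpenAIEmbedding = new AzureOpenAIConfig
{
    Auth = AzureOpenAIConfig.AuthTypes.APIKey,
    Endpoint = "https://<your-resource>.openai.azure.com/",
    APIKey = "<api-key>",
    Deployment = "<embedding-deployment-name>"
};

// Build an in-process (serverless) memory instance backed by Azure OpenAI.
var memory = new KernelMemoryBuilder()
    .WithAzureOpenAITextGeneration(azureOpenAIText)
    .WithAzureOpenAITextEmbeddingGeneration(azureOpenAIEmbedding)
    .Build<MemoryServerless>();

// Ingest a document, then ask a question grounded on its content.
await memory.ImportDocumentAsync("employee-handbook.pdf", documentId: "doc001");
var answer = await memory.AskAsync("How many days of annual leave do new employees get?");
Console.WriteLine(answer.Result);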
Third Party NVA in Azure VMware Solution
Hi all, I am following the link below to get more information on how to deploy a third-party NVA; however, I would like to know if you have any other detailed documentation and considerations that I can follow during my initial discussions with customers: https://vuptime.io/post/2023-07-24-third-party-nva-in-avs-nsxt/#:~:text=In%20order%20to%20deploy%20a,and%20to%20the%20NVA%20uplink. Appreciate your support!
Posted by pravesh_kaushal, Jun 26, 2024
Tags
- logic apps (11 Topics)
- azure api management (4 Topics)
- Event Grid (3 Topics)
- Biztalk 2020 (2 Topics)
- biztalk (2 Topics)
- azure (2 Topics)
- biztalk server (1 Topic)
- Visual Studio 2019 (1 Topic)
- azure devops (1 Topic)