<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Azure Federal Developer Connect articles</title>
    <link>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/bg-p/AzureFederalDeveloperConnect</link>
    <description>Azure Federal Developer Connect articles</description>
    <pubDate>Fri, 24 Apr 2026 13:05:49 GMT</pubDate>
    <dc:creator>AzureFederalDeveloperConnect</dc:creator>
    <dc:date>2026-04-24T13:05:49Z</dc:date>
    <item>
      <title>Calling API Management using Entra ID authentication and testing with PowerShell</title>
      <link>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/calling-api-management-using-entra-id-authentication-and-testing/ba-p/4418938</link>
      <description>&lt;P&gt;Integrating Entra ID or other compatible identity providers with Azure API Management is both easy and a great way to enhance the security of your APIs.&amp;nbsp; However, once you enforce authentication with the Validate JWT policy in API Management, you have the extra step of obtaining a JWT token from your identity provider and supplying it with each request. If you are writing code, this is fairly straightforward with the Azure Identity libraries, and API testing tools such as Postman can integrate with an identity provider to obtain a token and present it for authentication. But what if you are in a restricted environment where tools like Postman, or even VS Code, are not available and you need to test an API?&amp;nbsp; The good news is that with just a few short lines of PowerShell we can achieve the same results.&lt;/P&gt;
&lt;H2&gt;Setting up the App Registration&lt;/H2&gt;
&lt;P&gt;The first step in enabling Entra ID authentication for your app is creating an App Registration in Entra ID.&amp;nbsp; There is an excellent Learn article &lt;A class="lia-external-url" href="https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-protect-backend-with-aad" target="_blank" rel="noopener"&gt;here&lt;/A&gt; describing the process of setting up an App Registration and enabling the JWT validation policy in API Management, but I'll go over the rough steps here:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Open the Azure Portal&lt;/LI&gt;
&lt;LI&gt;Navigate to the Entra ID blade&lt;/LI&gt;
&lt;LI&gt;Go to App registrations and select New registration&lt;/LI&gt;
&lt;LI&gt;Enter a name for the app registration and click register&lt;/LI&gt;
&lt;LI&gt;Go to Certificates &amp;amp; secrets and create a new client secret&lt;/LI&gt;
&lt;LI&gt;Make note of the client secret, client ID, and tenant ID&lt;/LI&gt;
&lt;LI&gt;Click on the "Expose an API" blade&lt;/LI&gt;
&lt;LI&gt;Click "Add" next to "Application ID URI"&lt;/LI&gt;
&lt;LI&gt;Take the default URI and save the Application ID URI.&lt;/LI&gt;
&lt;/OL&gt;
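&lt;P&gt;As a sketch, the same registration steps can be scripted with the Azure CLI (the display name here is hypothetical; run this while signed in to your target cloud):&lt;/P&gt;

```powershell
# Create the app registration and capture its appId (the client ID)
$appId = az ad app create --display-name "apim-test-client" --query appId -o tsv

# Create a client secret -- the password is shown only once, so store it securely
az ad app credential reset --id $appId --query password -o tsv

# Set the default Application ID URI, which follows the api://{client id} pattern
az ad app update --id $appId --identifier-uris "api://$appId"
```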
&lt;P&gt;The Learn article discusses setting up scopes in the Expose an API blade but we will use the default scope in the interests of simplicity.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;H2&gt;Setting up the validate-jwt policy&lt;/H2&gt;
&lt;P&gt;In API Management, set up the validate-jwt policy by adding the policy expression to the Inbound policies section at the appropriate scope, e.g. global, workspace, product, API, or operation. While there are many options for &lt;A class="lia-external-url" href="https://learn.microsoft.com/en-us/azure/api-management/validate-jwt-policy" target="_blank" rel="noopener"&gt;JWT validation&lt;/A&gt;, e.g. validating claims, for the purposes of this example we'll evaluate the issuer and audience.&amp;nbsp; The validate-jwt policy will look like this:&lt;/P&gt;
&lt;LI-CODE lang="xml-doc"&gt;&amp;lt;validate-jwt header-name="Authorization" failed-validation-httpcode="401" failed-validation-error-message="Unauthorized. Access token is missing or invalid."&amp;gt;
            &amp;lt;openid-config url="https://login.microsoftonline.us/{your-tenant-id}/v2.0/.well-known/openid-configuration" /&amp;gt;
            &amp;lt;audiences&amp;gt;
                &amp;lt;audience&amp;gt;{your-client-id}&amp;lt;/audience&amp;gt;
            &amp;lt;/audiences&amp;gt;
            &amp;lt;issuers&amp;gt;
                &amp;lt;issuer&amp;gt;https://sts.windows.net/{your-tenant-id}/&amp;lt;/issuer&amp;gt;
            &amp;lt;/issuers&amp;gt;
&amp;lt;/validate-jwt&amp;gt;&lt;/LI-CODE&gt;
&lt;P&gt;It's important to note two things here:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Even though we are in Azure Government, the issuer is still sts.windows.net (that took me down a rabbit hole once upon a time).&lt;/LI&gt;
&lt;LI&gt;The "/" at the end of the issuer string is important. Failure to include the "/" will result in your validation to fail because the issuer does not match.&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;After you save your policy, you can test that it's working by trying an Invoke-WebRequest to your API endpoint. You should receive a 401 Unauthorized response.&lt;/P&gt;
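&lt;P&gt;For example (the endpoint is hypothetical, and -SkipHttpErrorCheck requires PowerShell 7 or later):&lt;/P&gt;

```powershell
# With no Authorization header, the validate-jwt policy should reject the call
$response = Invoke-WebRequest -Uri "https://apim.yourdomain.com/apiName/operationName" -SkipHttpErrorCheck
$response.StatusCode   # expect 401
```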
&lt;H2&gt;Testing with PowerShell&lt;/H2&gt;
&lt;P&gt;The PowerShell script essentially has two parts. The first part obtains the JWT token from Entra ID.&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;$tenantId = "your-tenant-id"
$clientId = "your-client-id"
$clientSecret = "your-client-secret"
$scope = "api://$clientId/.default"
$subscriptionKey = "your-subscription-key"
$tokenUrl = "https://login.microsoftonline.us/$tenantId/oauth2/v2.0/token"

$body = @{
    client_id = $clientId
    scope = $scope
    client_secret = $clientSecret
    grant_type = "client_credentials"
}
$response = Invoke-RestMethod -Method Post -Uri $tokenUrl -Body $body -ContentType "application/x-www-form-urlencoded"
$token = $response.access_token&lt;/LI-CODE&gt;
&lt;P&gt;The second part builds the request to include the token in the Authorization header.&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;$headers = @{
    Authorization = "Bearer $token"
    "Ocp-Apim-Subscription-Key" = $subscriptionKey
}
$outputValue = Invoke-RestMethod -Uri "https://apim.yourdomain.com/apiName/operationName" -Headers $headers -Method Get&lt;/LI-CODE&gt;
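&lt;P&gt;If validation fails unexpectedly, it can help to decode the token and confirm its claims match the policy. A quick sketch that works on any JWT (note the Base64 padding fix-up for the URL-safe encoding):&lt;/P&gt;

```powershell
# Decode the JWT payload segment and show the audience and issuer claims
$payload = $token.Split('.')[1].Replace('-', '+').Replace('_', '/')
switch ($payload.Length % 4) { 2 { $payload += '==' } 3 { $payload += '=' } }
[Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($payload)) |
    ConvertFrom-Json | Select-Object aud, iss
```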
&lt;P&gt;And that's it: a simple script that lets you grab a token and test your APIs when they are protected by Entra ID or another identity provider.&lt;BR /&gt;&lt;BR /&gt;Link to the script &lt;A class="lia-external-url" href="https://github.com/Azure/AzureFedDevBlog/tree/main/blogs/a-powershell-alternative-to-postman" target="_blank" rel="noopener"&gt;here&lt;/A&gt;.&lt;/P&gt;</description>
      <pubDate>Wed, 05 Nov 2025 23:17:52 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/calling-api-management-using-entra-id-authentication-and-testing/ba-p/4418938</guid>
      <dc:creator>JohnScott</dc:creator>
      <dc:date>2025-11-05T23:17:52Z</dc:date>
    </item>
    <item>
      <title>Enabling the Logic Apps PowerShell Connector for performing Azure tasks</title>
      <link>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/enabling-the-logic-apps-powershell-connector-for-performing/ba-p/4367295</link>
      <description>&lt;P&gt;This is a fairly short post, but in my &lt;A class="lia-internal-link lia-internal-url lia-internal-url-content-type-blog" href="https://techcommunity.microsoft.com/blog/azurefederaldeveloperconnect/deploying-logic-apps-standard-with-managed-identity-and-private-networking/4367184" data-lia-auto-title="last post" data-lia-auto-title-active="0" target="_blank"&gt;last post&lt;/A&gt; about deploying a Logic Apps Standard app using Managed Identity and private networking, I mentioned the customer needed to run some automation tasks. Some of these tasks pertained to maintenance within their application, but some involved cleaning up unused resources in Azure, specifically removing stale containers in blob storage.&lt;BR /&gt;&lt;BR /&gt;However, when trying to run the Remove-AzStorageContainer PowerShell cmdlet, we got the following error:&amp;nbsp; "The term 'Remove-AzStorageContainer' is not recognized as a name of a cmdlet, function, script file, or executable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again."&lt;BR /&gt;&lt;BR /&gt;By default, the PowerShell Connector in Logic Apps does not include the Az PowerShell module. To enable it (or any other PowerShell module you want to use):&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Open Kudu and open a CMD or PowerShell console&lt;/LI&gt;
&lt;LI&gt;Navigate to site/wwwroot&lt;/LI&gt;
&lt;LI&gt;Open the host.json file and ensure that you see a property "managedDependency":{"enabled":true} (this is the default and should be present).&lt;/LI&gt;
&lt;LI&gt;Open the requirements.psd1 file and add any PowerShell modules and versions you want installed, e.g. 'Az' = '12.*'&lt;/LI&gt;
&lt;LI&gt;Save the file and restart the web service&lt;/LI&gt;
&lt;/UL&gt;
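&lt;P&gt;For reference, requirements.psd1 is just a PowerShell data file, something along these lines (the full Az module is large; pinning only the submodules you need, such as Az.Storage, can shorten cold starts):&lt;/P&gt;

```powershell
# requirements.psd1 -- modules the managed dependency feature will install
@{
    'Az' = '12.*'
}
```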
&lt;P&gt;After that, you can add the Execute PowerShell Code connector. To use a Managed Identity as the user context for performing tasks in Azure, add the following line at the beginning of your script:&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;Connect-AzAccount -Identity -Environment AzureUSGovernment&lt;/LI-CODE&gt;
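&lt;P&gt;Putting it together, the stale-container cleanup might look like the sketch below (the account name and age threshold are hypothetical, and it assumes the identity holds the appropriate Storage data roles):&lt;/P&gt;

```powershell
# Sign in as the Managed Identity against Azure Government
Connect-AzAccount -Identity -Environment AzureUSGovernment

# Build a storage context that uses the signed-in identity rather than account keys
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -UseConnectedAccount

# Remove containers that have not been modified in the last 30 days
$cutoff = (Get-Date).AddDays(-30)
Get-AzStorageContainer -Context $ctx |
    Where-Object { $_.LastModified -lt $cutoff } |
    ForEach-Object { Remove-AzStorageContainer -Name $_.Name -Context $ctx -Force }
```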
&lt;P&gt;Enjoy!&lt;/P&gt;</description>
      <pubDate>Fri, 24 Jan 2025 17:05:22 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/enabling-the-logic-apps-powershell-connector-for-performing/ba-p/4367295</guid>
      <dc:creator>JohnScott</dc:creator>
      <dc:date>2025-01-24T17:05:22Z</dc:date>
    </item>
    <item>
      <title>Deploying Logic Apps Standard with Managed Identity and private networking</title>
      <link>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/deploying-logic-apps-standard-with-managed-identity-and-private/ba-p/4367184</link>
      <description>&lt;P&gt;I was working with a customer who needed to implement some automation tasks to support their application. The automation tasks would be driven on data in their Azure SQL database. As most of their developers were busy tackling their backlog, I thought "What if we could use Logic Apps to do a no code solution with their operations team?"&lt;/P&gt;
&lt;P&gt;The first step was of course to deploy the Logic App. As the customer runs fully private networking for all services (Azure SQL, Storage, etc.), we would deploy Logic Apps Standard, which runs on the Azure App Service runtime, similar to Function Apps. Like a Function App, the Logic App uses a storage account behind the scenes. However, when deploying through the portal, the deployment failed with a 403 (Forbidden) error. Of course! The customer had &lt;A class="lia-external-url" href="https://learn.microsoft.com/en-us/azure/storage/common/shared-key-authorization-prevent?tabs=portal" target="_blank" rel="noopener"&gt;disabled shared key access &lt;/A&gt;to storage accounts, utilizing Entra ID exclusively. As in our post about &lt;A class="lia-internal-link lia-internal-url lia-internal-url-content-type-blog" href="https://techcommunity.microsoft.com/blog/azurefederaldeveloperconnect/creating-a-containerized-build-agent-for-azure-devops-and-azure-devops-server/4248075" target="_blank" rel="noopener" data-lia-auto-title="setting up an Azure Container Instance" data-lia-auto-title-active="0"&gt;setting up an Azure Container Instance&lt;/A&gt; to use Managed Identity to connect to an Azure Container Registry, we'd have to write a bit of Bicep to utilize a User Assigned Managed Identity that has rights to Azure Storage and SQL.&lt;/P&gt;
&lt;P&gt;I created the User Assigned Managed Identity and granted it the following roles on the storage account:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Storage Account Contributor&lt;/LI&gt;
&lt;LI&gt;Storage Blob Data Contributor&lt;/LI&gt;
&lt;LI&gt;Storage Queue Data Contributor&lt;/LI&gt;
&lt;LI&gt;Storage Table Data Contributor&lt;/LI&gt;
&lt;/UL&gt;
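&lt;P&gt;As a sketch, those role assignments can be scripted like so (the resource group, identity, and account names are hypothetical):&lt;/P&gt;

```powershell
# Look up the identity's principal ID and the storage account's resource ID
$principalId = az identity show -g myRg -n myLogicAppIdentity --query principalId -o tsv
$scope = az storage account show -g myRg -n mystorageaccount --query id -o tsv

# Grant each of the four storage roles at the storage account scope
foreach ($role in 'Storage Account Contributor', 'Storage Blob Data Contributor',
                  'Storage Queue Data Contributor', 'Storage Table Data Contributor') {
    az role assignment create --assignee $principalId --role $role --scope $scope
}
```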
&lt;P&gt;While that solved the access issue and allowed the deployment to complete, the Logic App was showing a runtime error in the portal, stating that it was "Unable to load the proper Managed Identity."&amp;nbsp;&lt;/P&gt;
&lt;P&gt;As it turns out, we need to explicitly tell the Logic App in the App Service configuration that we are using Managed Identity as our authentication mechanism, and which Managed Identity we want to use.&amp;nbsp;&lt;/P&gt;
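&lt;P&gt;A sketch of those settings, following the Azure Functions identity-based connection conventions (resource names are hypothetical, and $identityResourceId is assumed to hold the User Assigned identity's full resource ID):&lt;/P&gt;

```powershell
# Point AzureWebJobsStorage at the account via the identity instead of a connection string
az webapp config appsettings set -g myRg -n myLogicApp --settings `
    AzureWebJobsStorage__accountName=mystorageaccount `
    AzureWebJobsStorage__credential=managedidentity `
    AzureWebJobsStorage__managedIdentityResourceId=$identityResourceId
```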
&lt;P&gt;Once I added that to the configuration, my Managed Identity error went away, but I was still getting a runtime error. Looking at the log stream, I could see many errors trying to reach the storage account queue and table endpoints. Because the customer is using fully private networking, we needed to set up a private endpoint and associated private DNS entry for each of the three storage account endpoints: blob, queue, and table.&amp;nbsp; Once I created those private endpoints and added them to my App Service configuration, my Logic App deployed and ran successfully.&lt;/P&gt;
&lt;P&gt;I've added the Bicep code for the Logic App service &lt;A class="lia-external-url" href="https://github.com/Azure/AzureFedDevBlog/tree/main/blogs/Deploying_Logic_Apps_Standard" target="_blank" rel="noopener"&gt;here&lt;/A&gt;.&lt;/P&gt;
&lt;P&gt;One final "Gotcha": if you look in my Bicep, you will note that I am specifying both the User Assigned Managed Identity as well as a System Assigned Managed Identity. The reason for this is, when using a SQL connector, Managed Identity was not listed as an option for authentication. I was stumped by this at first, but then I noticed that it was an option in a portal deployed Logic App. The difference was that the portal deployment adds a System Assigned Managed Identity. Once I added this to my Bicep, the Managed Identity option showed up on the SQL connector. It appears that the connector is looking for the presence of a System Assigned Managed Identity to toggle that authentication option, but you can still use your User Assinged Managed Identity for SQL authentication.&lt;/P&gt;</description>
      <pubDate>Thu, 16 Jan 2025 19:15:21 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/deploying-logic-apps-standard-with-managed-identity-and-private/ba-p/4367184</guid>
      <dc:creator>JohnScott</dc:creator>
      <dc:date>2025-01-16T19:15:21Z</dc:date>
    </item>
    <item>
      <title>Creating a containerized build agent for Azure DevOps and Azure DevOps Server</title>
      <link>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/creating-a-containerized-build-agent-for-azure-devops-and-azure/ba-p/4248075</link>
      <description>&lt;P&gt;In this article, we'll go over creating a containerized build agent for Azure DevOps and Azure DevOps Server. This ask came from a customer who was looking to retire their VM based build agents in favor of something that required less manual patching and maintenance. The build agent needed to be injected into a VNet, so it could communicate with the customer's Azure DevOps Server (though this works perfectly well with the Azure DevOps service) and deploy into an App Service on their VNet. The build agent needed to have the ability to build both the customer's Dotnet and JavaScript projects and then deploy them to Azure. The customer was also using an Artifacts feed in Azure DevOps for their NuGet and npm packages, so the build agent needed access to these feeds.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H2&gt;Attempt #1: Windows Container&lt;/H2&gt;
&lt;P&gt;Because the customer was more familiar with Windows, we decided to use a Windows-based container image, specifically Windows 2022 LTSC. To this base image, in the dockerfile I added the Dotnet 8 SDK, PowerShell 7, the Az PowerShell module, Node/npm, and AzCopy.&amp;nbsp; My first observation was that the Windows 2022 container image starts at 3.24 GB, and by the time we added the various packages it had ballooned to 8.5 GB.&amp;nbsp;&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;The next step was to upload this image to an Azure Container Registry, which took quite some time since, as previously noted, the image was so large.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;*NOTE:&amp;nbsp; If you do not have Docker installed, you can use the "az acr build" task to build the container image from your dockerfile and push the image to your Azure Container Registry, as I'll show in a later step.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I chose to host this in an Azure App Service, as it supports VNet Integration and is a fairly simple hosting platform for my container.&amp;nbsp; I added the following four environment variables:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;AZP_URL - the URL of the Azure DevOps Server plus the project collection (or organization for Azure DevOps service), e.g. &lt;A href="https://devops.contoso.com/myProjectCollection" target="_blank" rel="noopener"&gt;https://devops.contoso.com/myProjectCollection&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;AZP_POOL - the Agent Pool name where the build agent will live&lt;/LI&gt;
&lt;LI&gt;AZP_TOKEN - a PAT token that authorizes your build agent to interact with Azure DevOps. Be very careful to treat this value as a secret (consider storing it in Azure KeyVault) as it has full access to your DevOps org or collection.&lt;/LI&gt;
&lt;LI&gt;AZP_AGENT_NAME - a friendly name which will identify this build agent.&lt;/LI&gt;
&lt;/UL&gt;
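&lt;P&gt;For an App Service host, the four variables can be supplied as app settings, e.g. (values are illustrative; ideally AZP_TOKEN is a Key Vault reference rather than a plain-text setting):&lt;/P&gt;

```powershell
# Set the agent's environment variables as App Service app settings
az webapp config appsettings set -g myRg -n myBuildAgentApp --settings `
    AZP_URL=https://devops.contoso.com/myProjectCollection `
    AZP_POOL=Default `
    AZP_TOKEN=$patToken `
    AZP_AGENT_NAME=container-agent-01
```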
&lt;P&gt;I restarted the App Service so my container could pick up the environment variables, and checking in Azure DevOps Server, I could see my build agent was registered in my agent pool.&amp;nbsp; I created a sample dotnet application and a sample Node application to test the build pipelines. Both applications built successfully with my new containerized build agent.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Success!!!&amp;nbsp; Or so I thought...&lt;/H3&gt;
&lt;P&gt;I turned the build agent over to my customer and they tried building their (larger and more complex) projects with the containerized build agent. The dotnet project restored and built without issue, but their Node application was dying on the "npm install" step with the following error:&amp;nbsp; "&lt;SPAN&gt;FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory." I tried several things to fix this. &lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;SPAN&gt;Many articles recommended adjusting Node's max-old-space-size parameter (i.e. how much memory to allocate to old objects on the heap before garbage collecting).&amp;nbsp; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;There's also a default memory limit for Windows Containers running on Azure App Service which is tied to the App Service Plan SKU.&amp;nbsp; You can &lt;A href="https://learn.microsoft.com/en-us/azure/app-service/configure-custom-container?pivots=container-windows&amp;amp;tabs=debian#customize-container-memory" target="_self"&gt;update this limit&lt;/A&gt; with the WEBSITE_MEMORY_LIMIT_MB app setting, up to the limit of the App Service Plan.&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;Finally, when all else fails, scale up the App Service Plan to the maximum (these go to eleven).&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;SPAN&gt;While these steps seemed to lessen the effects of the problem, we were still having intermittent pipeline failures with the "JavaScript heap out of memory" exception. Plus, running on the highest SKU available cost more than the customer really wanted to spend.&amp;nbsp; Back to the drawing board.&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H2&gt;Attempt #2: Linux Container&lt;/H2&gt;
&lt;P&gt;My next thought was to do this in Linux. The Ubuntu 22.04 image is only 77.86 MB, a fraction of the Windows size, and even after we install PowerShell Core, the Dotnet 8 SDK, the Azure CLI and Node.js, the whole package is still barely 2 GB, about 25% of the size the Windows container had ballooned to.&lt;BR /&gt;&lt;BR /&gt;After I'd built my dockerfile and pushed the container image to my Azure Container Registry, I tried running it in an Azure App Service, but the container kept failing shortly after start with an error indicating the service was not responding to health probes. That made a certain amount of sense: the container has no front end; rather, it's an agent listening for a signal from Azure DevOps.&amp;nbsp; Luckily, Azure has plenty of container hosting options, so I switched over to Azure Container Instances instead.&lt;/P&gt;
&lt;H3&gt;Networking and Container Instances&lt;/H3&gt;
&lt;P&gt;One thing I immediately noticed, however, was that while my test container running against the Azure DevOps service worked just fine, my network-injected container was throwing a DNS lookup error while trying to resolve the name of the Azure DevOps Server. Azure services injected into a VNet typically inherit the DNS settings of the VNet itself. I verified the DNS settings and found the VNet had custom DNS servers specified, so what in the container is going on here?&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;It turns out that in order for Container Instances to use custom DNS, those custom DNS servers have to be specified at the time the Container Instance is created.&amp;nbsp; Unfortunately, the portal is somewhat limited in what you can specify during creation, so I wrote a little bicep script to build the Container Instance. In addition to setting custom DNS, I was also able to create and assign a User Assigned Managed Identity to the Container Instance for accessing our Container Registry securely.&lt;BR /&gt;&lt;BR /&gt;*As an aside, you MUST use a User Assigned rather than a System Assigned Managed Identity here if you are restricting access to your Container Registry. The reason is a bit of a "chicken/egg" problem: with a User Assigned identity, you can create it and grant it access BEFORE the Container Instance is created. With a System Assigned identity, the Container Instance attempts to pull the image as part of the deployment process and fails before the Container Instance and its associated System Assigned identity can be created.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;Once the Container Instance was deployed and running the build agent code, we were able to successfully run our build pipelines. We initially started out very small, with a single CPU and 1.5GB of RAM and did occasionally hit a "JavaScript heap out of memory" exception, but increasing the RAM eliminated this issue altogether.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H2&gt;Microsoft Defender for Containers and self-updating the container registry&lt;/H2&gt;
&lt;P&gt;One nice thing about having our build agents as containers is that we can configure Microsoft Defender to scan the Container Registry for vulnerabilities with Defender for Containers. While we can also scan our VM based build agents with Microsoft Defender for Servers, running in a containerized fashion gives us the opportunity to actually "self-heal" our container images by periodically re-running our dockerfile and pulling updated versions of the base OS and various packages (assuming we're not pulling specific versions of software).&amp;nbsp; This can be accomplished with a couple of simple az cli commands in a pipeline.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;az acr build . -r registryname -t imagename:latest --platform linux --file dockerfile
#Trim the number of manifests to 1 to clean up Defender for Containers results
az acr run --registry registryname --cmd 'acr purge --filter "imagename:.*" --keep 1 --untagged --ago 1d' /dev/null&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H2&gt;Wrapping Up&lt;/H2&gt;
&lt;P&gt;I have placed the scripts and dockerfiles used in this blog in our GitHub repo &lt;A href="https://github.com/Azure/AzureFedDevBlog/tree/main/blogs/creating-azdo-container-build-agents" target="_blank" rel="noopener"&gt;here&lt;/A&gt;. This includes the dockerfile to build the Linux Agent, the bash script which installs the Azure DevOps agent code, my (failed) Windows version of the container, as well as the Container Instance bicep code to deploy a Container Instance with custom DNS and a Managed Identity. I hope this is helpful and please let me know if you run into any issues.&lt;/P&gt;</description>
      <pubDate>Mon, 21 Oct 2024 18:51:11 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/creating-a-containerized-build-agent-for-azure-devops-and-azure/ba-p/4248075</guid>
      <dc:creator>JohnScott</dc:creator>
      <dc:date>2024-10-21T18:51:11Z</dc:date>
    </item>
    <item>
      <title>Connecting to Azure Cache for Redis with Entra ID in Azure Government</title>
      <link>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/connecting-to-azure-cache-for-redis-with-entra-id-in-azure/ba-p/4262944</link>
      <description>&lt;P&gt;I was working with a customer who was trying to connect their ASP.NET application to Azure Cache for Redis, and in particular wanted to be able to connect from their developer workstation to the resource in Azure Government.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;There are a couple of ways to connect to Azure Cache for Redis: either by using Access Keys or via Entra ID.&amp;nbsp; As with storage accounts and Azure SQL Database, static access keys or username/password authentication present potential vulnerabilities, while using Entra ID via either Service Principals or Managed Identities provides a more robust, manageable authentication and authorization mechanism.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The Azure.Identity library provides a class called DefaultAzureCredential, which does some interesting things around chained credentials.&amp;nbsp; You can read the full article &lt;A href="https://learn.microsoft.com/en-us/dotnet/azure/sdk/authentication/credential-chains?tabs=dac#defaultazurecredential-overview" target="_blank" rel="noopener"&gt;here&lt;/A&gt;, but put simply, when using DefaultAzureCredential it will try several authentication mechanisms in order until it is able to successfully obtain a token. The order of the chain is as follows:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Environment (essentially an Entra service principal client ID/secret or certificate supplied via environment variables)&lt;/LI&gt;
&lt;LI&gt;Workload Identity&lt;/LI&gt;
&lt;LI&gt;Managed Identity&lt;/LI&gt;
&lt;LI&gt;Visual Studio&lt;/LI&gt;
&lt;LI&gt;Azure CLI&lt;/LI&gt;
&lt;LI&gt;Azure PowerShell&lt;/LI&gt;
&lt;LI&gt;Azure Developer CLI&lt;/LI&gt;
&lt;LI&gt;Interactive browser&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;What this means is that, using the same authentication code, I can authenticate in an App Service using a Managed Identity, or locally in my Visual Studio development environment using an account configured in Azure Service Authentication.&amp;nbsp; I don't have to necessarily worry about how my app will authenticate in my environment so long as one of the options above is available.&lt;BR /&gt;&lt;BR /&gt;Following the guidance in the &lt;A href="https://github.com/Azure-Samples/azure-cache-redis-samples/tree/main" target="_blank" rel="noopener"&gt;Azure Cache for Redis Samples repo&lt;/A&gt;, the customer configured their Azure Cache for Redis connection as follows:&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="csharp"&gt;var configurationOptions = await ConfigurationOptions.Parse($"{_redisHostName}:6380").ConfigureForAzureWithTokenCredentialAsync(new DefaultAzureCredential());&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;However, when they tried stepping through the code, they would get the following error:&amp;nbsp;&amp;nbsp;&lt;SPAN&gt;"Failed to acquire token' - CredentialUnavailableException: EnvironmentCredential authentication unavailable. Environment variables are not fully configured. See the troubleshooting guide for more information.&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://aka.ms/azsdk/net/identity/environmentcredential/" target="_blank" rel="nofollow noopener"&gt;https://aka.ms/azsdk/net/identity/environmentcredential/&lt;/A&gt;&lt;SPAN&gt;"&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I took a look, and the first thing that jumped out was that we probably needed to specify the AuthorityHost value, i.e. point the credential at the Azure Government cloud, like so:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="csharp"&gt;var configurationOptions = await ConfigurationOptions.Parse($"{_redisHostName}:6380").ConfigureForAzureWithTokenCredentialAsync(new DefaultAzureCredential(new DefaultAzureCredentialOptions() { AuthorityHost=AzureAuthorityHosts.AzureGovernment}));&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;However, this did not change my error at all. So, what's going on?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;It turns out that the ConfigureForAzureWithTokenCredentialAsync method in the Microsoft.Azure.StackExchangeRedis library does not yet have a way to specify a sovereign cloud endpoint (and, if I'm reading the code correctly, it does not allow a Managed Identity to specify a sovereign cloud either).&amp;nbsp; So, what now?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;As it turns out, the option to use a Service Principal DOES allow you to specify a sovereign cloud to authenticate against.&amp;nbsp; Creating a service principal in Entra is well documented, either in the portal as documented &lt;A href="https://learn.microsoft.com/en-us/entra/identity-platform/howto-create-service-principal-portal" target="_blank" rel="noopener"&gt;here&lt;/A&gt;, or via a simple az cli command:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;az ad sp create-for-rbac --name "myredissp"&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Once you have the service principal created, you can create a Redis user with the service principal in the Redis resource and connect to it in code with the following:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="csharp"&gt;var configurationOptions = await ConfigurationOptions.Parse($"{_redisHostName}:6380").ConfigureForAzureWithServicePrincipalAsync(clientId, tenantId, clientSecret, null, Microsoft.Identity.Client.AzureCloudInstance.AzureUsGovernment, null);&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Hopefully the option to specify a target sovereign cloud with DefaultAzureCredential will be added in the future, but for now we can use a Service Principal.&lt;/P&gt;</description>
      <pubDate>Fri, 04 Oct 2024 20:00:00 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/connecting-to-azure-cache-for-redis-with-entra-id-in-azure/ba-p/4262944</guid>
      <dc:creator>JohnScott</dc:creator>
      <dc:date>2024-10-04T20:00:00Z</dc:date>
    </item>
    <item>
      <title>Setting up a Point-to-Site VPN connection in Azure Government (the quick version)</title>
      <link>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/setting-up-a-point-to-site-vpn-connection-in-azure-government/ba-p/4255032</link>
      <description>&lt;P&gt;I spent way too much time setting up a point-to-site VPN connection in Azure Government yesterday, so I figured I would do a quick blog post in the hopes this saves someone some time.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I typically set up a VPN connection between my on-prem environment and my sandbox in Azure to test network-injected scenarios such as running applications on an App Service Environment, testing a SQL migration to an Azure SQL Managed Instance, etc. A point-to-site VPN is a very simple way to achieve this, but I had not yet deployed it to my government subscription.&amp;nbsp; The pre-reqs are as follows:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Create a virtual network&lt;/LI&gt;
&lt;LI&gt;Create a GatewaySubnet on the virtual network&lt;/LI&gt;
&lt;LI&gt;Create a Virtual Network Gateway&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;There is a tutorial of the steps for creating these resources&amp;nbsp;&lt;A href="https://learn.microsoft.com/en-us/azure/vpn-gateway/tutorial-create-gateway-portal" target="_blank" rel="noopener"&gt;here&lt;/A&gt;.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I chose the&amp;nbsp;VpnGw1 SKU for my Virtual Network Gateway (VNG) as this is just for testing and I do not require a great deal of bandwidth or a large number of connections, but you can size your VNG to your needs.&lt;BR /&gt;&lt;BR /&gt;The next step is to set up the point-to-site connection. To do this, open the Virtual Network Gateway and choose the "Point-to-site configuration" blade.&amp;nbsp; The setup here is fairly simple - we add an address pool (I typically use an RFC 1918 space that does not conflict with my VNet space), the type of tunnel (we're using OpenVPN as we're doing Microsoft Entra ID authentication), the type of authentication (which is still branded "Azure Active Directory"), and the public IP you want to use.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The next step is to add the Azure Active Directory (Microsoft Entra ID) values. Being in Azure Government, we need to consider that our endpoints are different than those in the commercial environment.&amp;nbsp; The document &lt;A href="https://learn.microsoft.com/en-us/azure/vpn-gateway/openvpn-azure-ad-tenant" target="_blank" rel="noopener"&gt;here&lt;/A&gt; has a list of steps to do this, including the mapping between Azure Commercial and Azure Government endpoints. The endpoint mapping is as follows:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&amp;nbsp;Tenant ID
&lt;UL&gt;
&lt;LI&gt;&lt;A href="https://login.microsoftonline.com/{TenantID" target="_blank" rel="noopener"&gt;https://login.microsoftonline.com/{TenantID&lt;/A&gt;}&amp;nbsp;-&amp;gt;&lt;SPAN&gt;&lt;A href="https://login.microsoftonline.us/{TenantID" target="_blank" rel="noopener"&gt;https://login.microsoftonline.us/{TenantID&lt;/A&gt;}&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;Audience&lt;/SPAN&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;SPAN&gt;41b23e61-6c1e-4545-b367-cd054e0ed4b4 -&amp;gt;&amp;nbsp;51bb15d4-3a4f-4ebf-9dca-40096fe32426&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;Issuer&lt;/SPAN&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;A href="https://sts.windows.net/{TenantID}/" target="_blank" rel="noopener"&gt;https://sts.windows.net/{TenantID}/&lt;/A&gt;&amp;nbsp;(unchanged)&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;Wait.&amp;nbsp; Why doesn't the issuer change? Shouldn't that go to a government endpoint too? Well, the short answer is "no." I was severely overthinking this and tried to guess at this endpoint, putting in "login.microsoftonline.us" in place of the "sts.windows.net" value, but I got the following error when I tried to connect:&lt;BR /&gt;[Error] Dialing VPN connection ase-vnet, Status = Server did not respond properly to VPN Control Packets. Session State: Key Material sent&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;That didn't help me too much, so after spending an embarrassing amount of time digging through Microsoft Entra ID, I finally had the idea to actually grab a JWT token from my tenant and inspect it. You can grab a token (I'll write a future article on how to do this) and decode it in a tool like &lt;A href="http://jwt.ms" target="_blank" rel="noopener"&gt;jwt.ms&lt;/A&gt;. Sure enough, looking at the token, the issuer is still &lt;A href="https://sts.windows.net/{TenantID}/" target="_blank" rel="noopener"&gt;https://sts.windows.net/{TenantID}/&lt;/A&gt;. Once I corrected that, my VPN client was able to connect.&lt;/P&gt;
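&lt;P&gt;Until that future article, one quick way to grab a token for inspection is the Azure CLI. This assumes you are logged into your Azure Government tenant; the access token returned is for the ARM resource, but the issuer claim comes from the same tenant, so it is good enough to check:&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;az cloud set --name AzureUSGovernment
az login

# Print a token you can paste into jwt.ms to inspect the "iss" claim
az account get-access-token --query accessToken --output tsv&lt;/LI-CODE&gt;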
&lt;P&gt;There are many places in our learn.microsoft.com documentation describing the point-to-site VPN configuration process, but I feel that&amp;nbsp;&lt;A href="https://learn.microsoft.com/en-us/azure/vpn-gateway/openvpn-azure-ad-tenant" target="_blank" rel="noopener"&gt;this documentation&lt;/A&gt;&amp;nbsp;best describes the process for clouds other than Azure Commercial, such as Azure Government. It will walk you through authorizing the Azure VPN application (which requires the Global Administrator role in Entra ID) and how to configure the Azure VPN Client as well. Just don't do as I did and overthink the Issuer!&lt;/P&gt;</description>
      <pubDate>Wed, 25 Sep 2024 15:51:44 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/setting-up-a-point-to-site-vpn-connection-in-azure-government/ba-p/4255032</guid>
      <dc:creator>JohnScott</dc:creator>
      <dc:date>2024-09-25T15:51:44Z</dc:date>
    </item>
    <item>
      <title>Using Azure Service Bus in Azure US Government</title>
      <link>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/using-azure-service-bus-in-azure-us-government/ba-p/4193313</link>
      <description>&lt;P&gt;In our recent &lt;A href="https://aka.ms/AzureFedDevConnect" target="_blank" rel="noopener"&gt;Azure Federal Developer Connect&lt;/A&gt; series, we covered connecting to Azure Service Bus using Azure Functions in Azure US Government. The process for connecting to Azure Service Bus doesn't change much for the US Government cloud, but there are a couple of nuances I will call out here.&amp;nbsp; This post will walk you through the code step by step, but if you're in a hurry and just want to download the completed sample, the code is available &lt;A href="https://github.com/Azure/AzureFedDevBlog/tree/main/blogs/using-azure-service-bus-in-azure-us-government" target="_blank" rel="noopener"&gt;here&lt;/A&gt;.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Azure Service Bus is a fully managed enterprise message broker with message queues and publish-subscribe topics. Service Bus is used to decouple applications and services from each other, providing the following benefits:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Load-balancing work across competing workers&lt;/LI&gt;
&lt;LI&gt;Safely routing and transferring data and control across service and application boundaries&lt;/LI&gt;
&lt;LI&gt;Coordinating transactional work that requires a high degree of reliability&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;What this means from a practical standpoint is that you will have some application which publishes messages to a queue or a topic, and one or many subscribing applications. Currently the Azure Service Bus client libraries support Python, C#, Java, and JavaScript, so the application communicating with Service Bus can take many forms - whether it's a C# or Python web app hosted on an App Service, a JavaScript app running in a Static Web App, a containerized Java application, or a no-code/low-code solution like Logic Apps. Literally any application hosting mechanism that can leverage the Azure Service Bus client libraries can be used to communicate with Azure Service Bus.&lt;BR /&gt;&lt;BR /&gt;For this demo I chose to use Azure Functions running C#.&amp;nbsp; C# is of course a popular language, and the Azure Functions runtime allows for a lightweight, scalable means of communicating with Service Bus.&amp;nbsp; Furthermore, using the Azure Functions Core Tools, we can test the functions without actually needing to deploy them into Azure, and we can debug from our local machine, even though Azure Service Bus is running in the cloud.&lt;BR /&gt;&lt;BR /&gt;The publisher application is an Azure Function using an HttpTrigger, which takes the request body of the POST and sends that message to a queue or topic. The subscriber applications use the Service Bus triggers - a ServiceBusQueueTrigger and a ServiceBusTopicTrigger - to read from the queues and topics, respectively.&amp;nbsp; These allow us to seamlessly connect with Service Bus and instantly pick up messages as they arrive on the Queue or Topic Subscription.&lt;/P&gt;
&lt;H3&gt;Running this demo&lt;/H3&gt;
&lt;P&gt;To run this demo, you will need the following components:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;An Azure Subscription&lt;/LI&gt;
&lt;LI&gt;&lt;A href="https://learn.microsoft.com/en-us/cli/azure/install-azure-cli" target="_self"&gt;Azure CLI&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;A href="https://dotnet.microsoft.com/en-us/download/dotnet/8.0" target="_self"&gt;dotnet SDK 8&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/azure-functions/functions-run-local?tabs=windows%2Cisolated-process%2Cnode-v4%2Cpython-v2%2Chttp-trigger%2Ccontainer-apps&amp;amp;pivots=programming-language-csharp#install-the-azure-functions-core-tools" target="_self"&gt;Azure Function Core Tools&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azurite?tabs=visual-studio-code%2Cblob-storage#install-azurite" target="_self"&gt;Azurite&lt;/A&gt; storage emulator*&lt;/LI&gt;
&lt;LI&gt;A code editor such as &lt;A href="https://code.visualstudio.com/download" target="_self"&gt;Visual Studio Code&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;A REST client testing tool such as &lt;A href="https://www.postman.com/downloads/" target="_self"&gt;Postman&lt;/A&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;*If you are using Visual Studio 2022, Azurite is available automatically. All steps described in this post can be accomplished within Visual Studio or Visual Studio Code, or from the command line using any code editor of your choice.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Create the Azure Service Bus resource&lt;/H3&gt;
&lt;P&gt;Go into the Azure Portal and create a Service Bus resource.&amp;nbsp; Azure Service Bus is available in three pricing tiers: &lt;STRONG&gt;Basic, Standard and Premium&lt;/STRONG&gt;. For the purposes of this demo, we will want to create a &lt;STRONG&gt;Standard&lt;/STRONG&gt; tier instance, as it will support Topics.&amp;nbsp; You will need to pick a globally unique namespace, and you will want to make note of your fully qualified namespace in the form of servicebusnamespace.servicebus.usgovcloudapi.net.&amp;nbsp; You can accept the defaults for the rest of the configuration.&lt;BR /&gt;&lt;BR /&gt;Once your Service Bus resource is provisioned, go into the resource and select the "Access Control (IAM)" blade.&amp;nbsp; Add your user to the Azure Service Bus Data Owner role, which will allow your user identity to read and write from the Service Bus.&amp;nbsp; If you were to deploy the functions to Azure, you would create a managed identity for the Azure Function and assign it Azure Service Bus Data Owner, Azure Service Bus Data Reader, and/or Azure Service Bus Data Writer roles as appropriate.&amp;nbsp; However, for the purposes of this demo we will be testing locally and will use your Azure credentials to authenticate to Service Bus.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Finally, find the Queue blade and create a queue (in the demo this is called 'testq' but you can name it whatever you'd like).&amp;nbsp; Then find the Topic blade and create a new topic ('testTopic') and one or more subscriptions on the topic ('testSubscription1','testSubscription2', etc.).&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Create the Azure Functions app locally&lt;/H3&gt;
&lt;P&gt;To build the scaffolding for the Azure Function Apps, we will use the Azure Function Core Tools to first initialize the Function App, then create the skeleton for the functions themselves.&lt;BR /&gt;&lt;BR /&gt;Open a Windows Terminal session and change directories to where you'd like your code to reside.&amp;nbsp; Enter the following command:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;func init ServiceBusQueueFunc --worker-runtime dotnet-isolated --target-framework net8.0&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Change directories to the "ServiceBusQueueFunc" directory (or whatever you called your function).&amp;nbsp; Then add these functions:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;func new --name ServiceBusQueueWriter --template "HTTP trigger"
func new --name ServiceBusQueueReader --template "ServiceBusQueueTrigger"
func new --name ServiceBusTopicReader --template "ServiceBusTopicTrigger"&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;And finally, we need to add the Azure.Messaging.ServiceBus and Azure.Identity packages so we can communicate with Service Bus.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;dotnet add package Azure.Messaging.ServiceBus
dotnet add package Azure.Identity&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Authentication to Service Bus&lt;/H3&gt;
&lt;P&gt;When it comes to authenticating with Azure Service Bus, we have a couple of choices. A simple way to authenticate is via a connection string, but this requires using a shared access key credential that you have to store and maintain somewhere, and which could potentially be compromised by a bad actor. To mitigate this, we can use the Azure.Identity library and a managed identity for our function. Of course, because we're testing locally, we do not have a managed identity, but the Azure.Identity library will fall back to our Azure CLI credentials instead.&amp;nbsp; To log in to the Azure CLI:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;az cloud set --name AzureUSGovernment
az login --use-device-code #You may not need the use-device-code flag depending on your configuration&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Configuring the Service Bus Writer Function&lt;/H3&gt;
&lt;P&gt;Now that we have the scaffolding of the Function App complete, it's time to write a little code.&amp;nbsp; If you open the ServiceBusQueueWriter.cs file in your code editor, you'll see the scaffolding code created by the Azure Functions Core Tools. As the Azure.Messaging.ServiceBus library uses async methods, we will need to change the method to be async. Also, by default the function will accept both GET and POST requests, but as we're reading from the request body, we will eliminate the GET.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="csharp"&gt;public async Task&amp;lt;IActionResult&amp;gt; Run([HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Next, we need to get the values for the Service Bus namespace and the name of the queue or topic we will be writing to.&amp;nbsp; You can either hard-code these values, or retrieve them from your environment settings, e.g. a local.settings.json file.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="csharp"&gt;string? fullyQualifiedNamespace = Environment.GetEnvironmentVariable("ServiceBusConnection__fullyQualifiedNamespace");
string? queueName = Environment.GetEnvironmentVariable("ServiceBusQueueName");&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In the case of the ServiceBusConnection value, this will be the fully qualified namespace you gave your Service Bus resource, e.g. myservicebus.servicebus.usgovcloudapi.net.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Next, we'll add a bit of code to parse the body of the HttpRequest so we can send this to the Service Bus queue or topic.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="csharp"&gt;string requestBody = await new StreamReader(req.Body).ReadToEndAsync();&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;And finally, we will create the Service Bus client, create a sender attached to our queue or topic, and send the message. Note that in the first two lines below, I am specifying that the AuthorityHost points at AzureGovernment. By default, authentication will be attempted against Azure Commercial.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="csharp"&gt;DefaultAzureCredentialOptions options = new DefaultAzureCredentialOptions();
options.AuthorityHost = AzureAuthorityHosts.AzureGovernment;

await using var client = new ServiceBusClient(fullyQualifiedNamespace, new DefaultAzureCredential(options));

var sender = client.CreateSender(queueName);
await sender.SendMessageAsync(new ServiceBusMessage(requestBody));&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-SPOILER&gt;If you get an authentication error when sending the message, you may need to add the following to your environment variables (e.g. local.settings.json)&amp;nbsp;&lt;LI-CODE lang="json"&gt;"AZURE_TENANT_ID": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"​&lt;/LI-CODE&gt;&lt;/LI-SPOILER&gt;
&lt;H3&gt;Configure the Service Bus Queue and Topic Reader Functions&lt;/H3&gt;
&lt;P&gt;Configuring the Service Bus Queue and Topic Reader Functions is a bit simpler. You simply need to add the queue/topic name and subscription (in the case of the Topic Reader) and the ServiceBusConnection.&lt;/P&gt;
&lt;P&gt;There is a bit of a "gotcha" here. This value will be read in from your environment variables (e.g. local.settings.json) but utilizes a "magic" word in the configuration.&amp;nbsp; For example, if you title your Connection "ServiceBusConnection", the value in your environment variables needs to be:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="json"&gt;"ServiceBusConnection__fullyQualifiedNamespace": "myservicebus.servicebus.usgovcloudapi.net",&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
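&lt;P&gt;Putting these settings together, a complete local.settings.json for this demo might look like the following. The namespace and queue names are the sample values used above, and the storage and runtime entries are the defaults generated by the Core Tools - substitute your own values:&lt;/P&gt;
&lt;LI-CODE lang="json"&gt;{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "ServiceBusConnection__fullyQualifiedNamespace": "myservicebus.servicebus.usgovcloudapi.net",
    "ServiceBusQueueName": "testq"
  }
}&lt;/LI-CODE&gt;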
&lt;P&gt;For the Queue Reader:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="csharp"&gt;public async Task Run(
            [ServiceBusTrigger("testq", Connection = "ServiceBusConnection")]
            ServiceBusReceivedMessage message,
            ServiceBusMessageActions messageActions)&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;For the Topic Reader:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="csharp"&gt;public async Task Run(
            [ServiceBusTrigger("testTopic", "testSubscription", Connection = "ServiceBusConnection")]
            ServiceBusReceivedMessage message,
            ServiceBusMessageActions messageActions)&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Testing the Service Bus Queue Writer and Reader Functions&lt;/H3&gt;
&lt;P&gt;Open a new terminal window and start the Azurite storage emulator - this will run and stay persistent as long as this terminal window is open.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;azurite&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;If you are using the command line only, you can now build the function app and start the local Functions runtime with the following command:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;func host start&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;If you are using Visual Studio Code, you can enter debug mode (which will support breakpoints in code) by selecting Run-&amp;gt;Start Debugging.&lt;/P&gt;
&lt;P&gt;When the function host starts, you should see your three functions listed in the console output, including the local URL for the ServiceBusQueueWriter HTTP trigger.&lt;/P&gt;
&lt;P&gt;If you are using Visual Studio Code, set a breakpoint in both the Writer and Queue Reader code.&amp;nbsp; Open your REST client tool (e.g. Postman) and copy the URL from the Writer function into a new POST request.&amp;nbsp; Add some content to the request body (you can use JSON, XML or plain text; just make sure the Content-Type header matches your content type).&lt;/P&gt;
&lt;P&gt;At this point, you should be able to send your request.&amp;nbsp; If you are in Visual Studio Code and have your breakpoints set, you should expect to see first the Writer function light up - go ahead and step through or hit F5 to let the code continue.&amp;nbsp; If your send is successful, you should next expect to see the Queue Reader function light up. Because we are using a Service Bus trigger, the function will execute immediately when an item lands in the queue.&lt;/P&gt;
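&lt;P&gt;If a REST client like Postman is not available, the same test can be run from PowerShell. The port and route below assume the defaults printed by the function host, so substitute the URL from your own console output:&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;$body = '{ "message": "Hello from Service Bus!" }'

# POST the message to the Writer function running on the local host
Invoke-RestMethod -Method Post -Uri "http://localhost:7071/api/ServiceBusQueueWriter" -Body $body -ContentType "application/json"&lt;/LI-CODE&gt;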
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Testing the Topic Trigger&lt;/H3&gt;
&lt;P&gt;Writing to a topic does not require changing any code in the Queue Writer; we just need to change the destination from the name of the queue to the name of the topic.&amp;nbsp; When you run your test again against the topic, this time we expect the Topic Reader function to execute.&amp;nbsp; If you look at your topic and related subscriptions in the Azure portal, you should expect to see 0 messages in the subscription you are reading from, and 1 message each in the other subscriptions.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Conclusions&lt;/H3&gt;
&lt;P&gt;Hopefully this has shown how easy it is to implement messaging for your applications and solutions using Azure Service Bus, and highlighted the small nuances of using it in Azure Government.&amp;nbsp; Future posts may cover more advanced Service Bus topics such as ordered message delivery and multi-region resiliency, but please suggest topics you would like to see in a future post, and please bookmark&amp;nbsp;&lt;A href="https://aka.ms/AzureFedDevConnect" target="_blank" rel="noopener"&gt;Azure Federal Developer Connect&lt;/A&gt;&amp;nbsp;and register for our upcoming sessions!&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 25 Jul 2024 22:23:28 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/using-azure-service-bus-in-azure-us-government/ba-p/4193313</guid>
      <dc:creator>JohnScott</dc:creator>
      <dc:date>2024-07-25T22:23:28Z</dc:date>
    </item>
    <item>
      <title>Upcoming Azure Federal Developer Connect Webinar</title>
      <link>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/upcoming-azure-federal-developer-connect-webinar/ba-p/4125168</link>
      <description>&lt;P&gt;Join us on April 30th, 2024, for the Azure Federal Developer Connect Webinar &lt;SPAN&gt;with our speaker&amp;nbsp;&lt;/SPAN&gt;Haitham Shahin! The topic for this installment will be&amp;nbsp;&lt;STRONG&gt;Container Orchestration: Managing a large number of containers and how they interact&lt;/STRONG&gt;. Register today at&amp;nbsp;&lt;A href="https://aka.ms/AzureFedDevConnect" target="_self"&gt;https://aka.ms/AzureFedDevConnect!&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Follow the Azure Federal Developer Connect blog at aka.ms/AzureFedDevConnect&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 29 Apr 2024 15:11:21 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/upcoming-azure-federal-developer-connect-webinar/ba-p/4125168</guid>
      <dc:creator>JohnScott</dc:creator>
      <dc:date>2024-04-29T15:11:21Z</dc:date>
    </item>
    <item>
      <title>Welcome to the Azure Federal Developer Connect Blog!</title>
      <link>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/welcome-to-the-azure-federal-developer-connect-blog/ba-p/4113159</link>
      <description>&lt;P&gt;Greetings and welcome to the&amp;nbsp;Azure Federal Developer Connect Blog!&amp;nbsp; Our goal is to accelerate cloud adoption and empower every developer to innovate and build software with our platform. While the topics in this blog will reflect industry best practices and may be of interest to all Azure users worldwide, &lt;SPAN&gt;&lt;SPAN class="ui-provider a b c d e f g h i j k l m n o p q r s t u v w x y z ab ac ae af ag ah ai aj ak"&gt;our target audience is the Federal Government developer community.&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;This blog is a companion to our Azure Federal Developer Connect webinar series.&amp;nbsp; Please join us for our next Azure Federal Developer Connect session on April 16th.&amp;nbsp; This session will cover &lt;SPAN&gt;API Management - an Azure managed service for creating, publishing, and managing API's.&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;Register today at &lt;A href="https://aka.ms/AzureFedDevConnect" target="_blank" rel="noopener"&gt;https://aka.ms/AzureFedDevConnect&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 15 Apr 2024 15:27:30 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-federal-developer-connect/welcome-to-the-azure-federal-developer-connect-blog/ba-p/4113159</guid>
      <dc:creator>JohnScott</dc:creator>
      <dc:date>2024-04-15T15:27:30Z</dc:date>
    </item>
  </channel>
</rss>

