# Azure App Service

## Powering Observability: Dynatrace Integration with Linux App Service through Sidecars
In this blog we continue to dive into the world of observability with Azure App Service. If you've been following our recent updates, you'll know that we announced the Public Preview of the Sidecar Pattern for Linux App Service. Building on this architectural pattern, we're going to demonstrate how you can use it to integrate Dynatrace, an Azure Native ISV Services partner, with your .NET custom container application. We'll guide you through the process of harnessing Dynatrace's powerful monitoring capabilities, allowing you to gain invaluable insights into your application's metrics and traces.

### Setting up your .NET application

To get started, you'll need to containerize your .NET application. This tutorial walks you through the process step by step. Here is a sample Dockerfile for a .NET 8 application:

```dockerfile
# Stage 1: Build the application
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
WORKDIR /app

# Copy the project file and restore dependencies
COPY *.csproj ./
RUN dotnet restore

# Copy the remaining source code
COPY . .

# Build the application
RUN dotnet publish -c Release -o out

# Stage 2: Create a runtime image
FROM mcr.microsoft.com/dotnet/aspnet:8.0 AS runtime
WORKDIR /app

# Copy the build output from stage 1
COPY --from=build /app/out ./

# Set the entry point for the application
ENTRYPOINT ["dotnet", "<your app>.dll"]
```

You're now ready to build the image and push it to your preferred container registry, be it Azure Container Registry, Docker Hub, or a private registry.

### Create your Linux Web App

Create a new Linux Web App from the portal and choose the options for Container and Linux. On the Container tab, make sure that Sidecar support is Enabled. Specify the details of your application image.

Note: .NET typically uses port 8080, but you can change it in your project.

### Set up your Dynatrace account

If you don't have a Dynatrace account, you can create an instance of Dynatrace in the Azure portal by following this Marketplace link. You can choose the Free Trial plan to get a 30-day subscription.

### AppSettings for Dynatrace Integration

You need to set the following app settings. You can get more details about the Dynatrace-related settings here.

- DT_TENANT – The environment ID
- DT_TENANTTOKEN – Same as DT_API_TOKEN. This is the PaaS token for your environment.
- DT_CONNECTIONPOINT
- DT_HOME – /home/dynatrace
- LD_PRELOAD – /home/dynatrace/oneagent/agent/lib64/liboneagentproc.so
- DT_LOGSTREAM – stdout
- DT_LOGLEVELCON – INFO

We encourage you to store sensitive information like DT_TENANTTOKEN in Azure Key Vault and reference it from App Service (see Use Key Vault references - Azure App Service | Microsoft Learn).

### Add the Dynatrace Sidecar

Go to the Deployment Center for your application and add a sidecar container with the following settings:

- Image source: Docker Hub and other registries
- Image type: Public
- Registry server URL: mcr.microsoft.com
- Image and tag: appsvc/docs/sidecars/sample-experiment:dynatrace-dotnet
- Port: <any port other than your main container port>

Once you have added the sidecar, restart your website to see data start flowing to the Dynatrace backend.

Please note that this is an experimental container image for Dynatrace. We will be updating this blog with a new image soon.

Disclaimer: Dynatrace Image Usage. It's important to note that the Dynatrace image used here is sourced directly from Dynatrace and is provided 'as-is.' Microsoft does not own or maintain this image. Therefore, its usage is subject to the terms of use outlined by Dynatrace.
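If you prefer to script the app settings above rather than set them in the portal, the sketch below shows one way to do it with the Azure CLI. The resource group, web app, and Key Vault names and the placeholder values are illustrative only; substitute your own, and keep the real PaaS token in Key Vault rather than in plain app settings.

```bash
# Store the Dynatrace PaaS token in Key Vault (illustrative resource names)
az keyvault secret set \
  --vault-name my-kv \
  --name DtTenantToken \
  --value "<your-dynatrace-paas-token>"

# Apply the Dynatrace app settings to the web app; DT_TENANTTOKEN is resolved
# through a Key Vault reference instead of being stored in plain text
az webapp config appsettings set \
  --resource-group my-rg \
  --name my-webapp \
  --settings \
    DT_TENANT="<environment-id>" \
    DT_TENANTTOKEN="@Microsoft.KeyVault(SecretUri=https://my-kv.vault.azure.net/secrets/DtTenantToken/)" \
    DT_CONNECTIONPOINT="<connection-endpoint>" \
    DT_HOME="/home/dynatrace" \
    LD_PRELOAD="/home/dynatrace/oneagent/agent/lib64/liboneagentproc.so" \
    DT_LOGSTREAM="stdout" \
    DT_LOGLEVELCON="INFO"

# Restart so the main container and the sidecar pick up the new settings
az webapp restart --resource-group my-rg --name my-webapp
```

For the Key Vault reference to resolve, the web app's managed identity also needs permission to read secrets from the vault, as described in the Key Vault references documentation linked above.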
### Visualizing your Observability data in Dynatrace

You are all set! You can now see your observability data flow to the Dynatrace backend.

The Hosts tab gives you metrics about the VM that is hosting the application. Dynatrace also has a Services view which lets you look at application-specific information like response time, failed requests, and application traces.

You can learn more about Dynatrace's observability capabilities in the documentation: Observe and explore - Dynatrace Docs.

### Next Steps

As you've seen, the Sidecar Pattern for Linux App Service opens a world of possibilities for integrating powerful tools like Dynatrace into your Linux App Service-hosted applications. With Dynatrace being an Azure Native ISV Services partner, this integration marks just the beginning of a journey towards a closer and more simplified experience for Azure users.

This is just the start. We're committed to providing even more guidance and resources to help you seamlessly integrate Dynatrace with your code-based Linux web applications and other language stacks. Stay tuned for upcoming updates and tutorials as we continue to empower you to make the most of your Azure environment. In the meantime, don't hesitate to explore further, experiment with different configurations, and leverage the full potential of observability with Dynatrace and Azure App Service.
## Build Multi-Agent AI Systems on Azure App Service

### Introduction: The Evolution of AI-Powered App Service Applications

Over the past few months, we've been exploring how to supercharge existing Azure App Service applications with AI capabilities. If you've been following along with this series, you've seen how we can quickly integrate AI Foundry agents with MCP servers and host remote MCP servers directly on App Service.

Today, we're taking the next leap forward by demonstrating how to build sophisticated multi-agent systems that leverage connected agents, Model Context Protocol (MCP), and OpenAPI tools - all running on Azure App Service's Premium v4 tier with .NET Aspire for enhanced observability and a cloud-native development experience.

💡 Want the full technical details? This blog provides an overview of the key concepts and capabilities. For comprehensive setup instructions, architecture deep-dives, performance considerations, debugging guidance, and detailed technical documentation, check out the complete README on GitHub.

### What Makes This Sample Special?

This fashion e-commerce demo showcases several cutting-edge technologies working together.

🤖 Multi-Agent Architecture with Connected Agents

Unlike single-agent systems, this sample implements an orchestration pattern where specialized agents work together:

- Main Orchestrator: Coordinates the workflow and handles inventory queries via MCP tools
- Cart Manager: Specialized in shopping cart operations via OpenAPI tools
- Fashion Advisor: Provides expert styling recommendations
- Content Moderator: Ensures safe, professional interactions

🔧 Advanced Tool Integration

- MCP Tools: Real-time connection to external inventory systems using the Model Context Protocol
- OpenAPI Tools: Direct agent integration with your existing App Service APIs
- Connected Agent Tools: Seamless agent-to-agent communication with automatic orchestration

⚡ .NET Aspire Integration

- Enhanced development experience with built-in observability
- Simplified cloud-native application patterns
- Real-time monitoring and telemetry (when developing locally)

🚀 Premium v4 App Service Tier

- Latest App Service performance capabilities
- Optimized for modern cloud-native workloads
- Enhanced scalability for AI-powered applications

### Key Technical Innovations

Connected Agent Orchestration: Your application communicates with a single main agent, which automatically coordinates with specialist agents as needed. No changes to your existing App Service code are required.

Dual Tool Integration: This sample demonstrates both MCP tools for external system connectivity and OpenAPI tools for direct API integration.

Zero-Infrastructure Overhead: Agents work directly with your existing App Service APIs and external endpoints - no additional infrastructure deployment needed.

### Why These Technologies Matter for Real Applications

The combination of these technologies isn't just about showcasing the latest features - it's about solving real business challenges. Let's explore how each component contributes to building production-ready AI applications.

.NET Aspire: Enhancing the Development Experience

This sample leverages .NET Aspire to provide enhanced observability and simplified cloud-native development patterns. While .NET Aspire is still in preview on App Service, we encourage you to start exploring its capabilities and keep an eye out for future updates planned for later this year. What's particularly exciting about Aspire is how it maintains the core principle we've emphasized throughout this series: making AI integration as simple as possible.
You don't need to completely restructure your application to benefit from enhanced observability and modern development patterns.

Premium v4 App Service: Built for Modern AI Workloads

This sample is designed to run on Azure App Service Premium v4, which we recently announced is Generally Available. Premium v4 is the latest offering in the Azure App Service family, delivering enhanced performance, scalability, and cost efficiency.

### From Concept to Implementation: Staying True to Our Core Promise

Throughout this blog series, we've consistently demonstrated that adding AI capabilities to existing applications doesn't require massive rewrites or complex architectural changes. This multi-agent sample continues that tradition - what might seem like a complex system is actually built using the same principles we've established:

- ✅ Incremental Enhancement: Build on your existing App Service infrastructure
- ✅ Simple Integration: Use familiar tools like azd up for deployment
- ✅ Production-Ready: Leverage mature Azure services you already trust
- ✅ Future-Proof: Easy to extend as new capabilities become available

### Looking Forward: What's Coming Next

This sample represents just the beginning of what's possible with AI-powered App Service applications. Here's what we're working on next:

- 🔐 MCP Authentication Integration: Enhanced security patterns for production MCP server deployments, including Azure Entra ID integration.
- 🚀 New Azure AI Foundry Features: As Azure AI Foundry continues to evolve, we'll be updating this sample to showcase new agent capabilities, enhanced tool integrations, performance optimizations, and additional model support.
- 📊 Advanced Analytics and Monitoring: Deeper integration with Azure Monitor for agent performance analytics and business intelligence from agent interactions.
- 🔧 Additional Programming Language Support: Following our multi-language MCP server samples, we'll be adding support for other languages in samples that will be added to the App Service documentation.

### Getting Started Today

Ready to add multi-agent capabilities to your existing App Service application? The process follows the same streamlined approach we've used throughout this series.

Quick overview:

1. Clone and Deploy: Use azd up for one-command infrastructure deployment
2. Create Your Agents: Run a Python setup script to configure the multi-agent system
3. Connect Everything: Add one environment variable to link your agents
4. Test and Explore: Try the sample conversations and see agent interactions

📚 For detailed step-by-step instructions, including prerequisites, troubleshooting tips, environment setup, and comprehensive configuration guidance, see the complete setup guide in the README.

### Learning Resources

If you're new to this ecosystem, we recommend starting with these foundational resources:

- Integrate AI into your Azure App Service applications - Comprehensive guide with language-specific tutorials for building intelligent applications on App Service
- Supercharge Your App Service Apps with AI Foundry Agents Connected to MCP Servers - Learn the basics of integrating AI Foundry agents with MCP servers
- Host Remote MCP Servers on App Service - Deploy and manage MCP servers on Azure App Service

### Conclusion: The Future of AI-Powered Applications

This multi-agent sample represents the natural evolution of our App Service AI integration journey.
We started with basic agent integration, progressed through MCP server hosting, and now we're showcasing sophisticated multi-agent orchestration - all while maintaining our core principle that AI integration should enhance, not complicate, your existing applications.

Whether you're just getting started with AI agents or ready to implement complex multi-agent workflows, the path forward is clear and incremental. As Azure AI Foundry adds new capabilities and App Service continues to evolve, we'll keep updating these samples and sharing new patterns.

Stay tuned - the future of AI-powered applications is being built today, one agent at a time.

### Additional Resources

🚀 Start Building

- GitHub repository for this sample - Comprehensive setup guide, architecture details, troubleshooting, and technical deep-dives

📚 Learn More

- Azure AI Foundry Documentation: Connected Agents Guide
- MCP Tools Setup: Model Context Protocol Integration
- .NET Aspire on App Service: Deployment Guide
- Premium v4 App Service: General Availability Announcement

Have questions or want to share how you're using multi-agent systems in your applications? Join the conversation in the comments below. We'd love to hear about your AI-powered App Service success stories!
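If you'd like a feel for the end-to-end flow before diving into the README, a terminal session might look roughly like the sketch below. The setup script name and the app setting used to link the agents are placeholders here (the repository README documents the real ones), so treat this as an outline rather than copy-paste instructions.

```bash
# Log in and provision the sample infrastructure plus the web app in one step
azd auth login
azd up

# Run the repo's Python setup script to create the orchestrator and specialist
# agents (script name is a placeholder; see the README for the actual file)
python create_agents.py

# Point the web app at the main orchestrator agent via a single app setting
# (setting name is a placeholder; the README lists the exact variable to use)
az webapp config appsettings set \
  --resource-group <your-rg> \
  --name <your-webapp> \
  --settings MAIN_AGENT_ID="<agent-id-from-setup-script>"
```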
## 🚀 Bring Your Own License (BYOL) Support for JBoss EAP on Azure App Service

We're excited to announce that Azure App Service now supports Bring Your Own License (BYOL) for JBoss Enterprise Application Platform (EAP), enabling enterprise customers to deploy Java workloads with greater flexibility and cost efficiency.

If you've evaluated Azure App Service in the past, now is the perfect time to take another look. With BYOL support, you can leverage your existing Red Hat subscriptions to optimize costs and align with your enterprise licensing strategy.
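As a quick way to explore the JBoss EAP offering from the CLI, you can list the available runtimes and create a site against one of them. The resource names and the runtime string below are illustrative assumptions; check the output of the list command and the App Service documentation for the exact values and for how BYOL licensing is selected for your subscription.

```bash
# See which JBoss EAP runtime versions App Service currently offers
az webapp list-runtimes | grep -i jboss

# Create a Linux web app on an existing Premium plan using one of the listed
# JBoss EAP runtimes (the runtime string shown here is an example value)
az webapp create \
  --resource-group my-rg \
  --plan my-premium-plan \
  --name my-jboss-app \
  --runtime "JBOSSEAP:8-java17"
```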
## Announcing the Public Preview of the New App Service Quota Self-Service Experience

### What's New?

The updated experience introduces a dedicated App Service Quota blade in the Azure portal, offering a streamlined and intuitive interface to:

- View current usage and limits across the various SKUs
- Set custom quotas tailored to your App Service plan needs

This new experience empowers developers and IT admins to proactively manage resources, avoid service disruptions, and optimize performance.

### Quick Reference - Start here!

- If your deployment requires quota for ten or more subscriptions, then file a support ticket with problem type Quota.
- If any subscription included in your request requires zone redundancy, then file a support ticket with problem type Quota.
- Otherwise, leverage the new self-service experience to increase your quota automatically.

### Self-service Quota Requests

For non-zone-redundant needs, quota alone is sufficient to enable App Service deployment or scale-out. Follow the steps below to place your request.

1. Navigate to the Quotas resource provider in the Azure portal

2. Select App Service

Navigating the primary interface:

- Each App Service VM size is represented as a separate SKU. If the intention is to be able to scale up or down within a specific offering (e.g., Premium v3), then an equivalent number of VMs needs to be requested for each applicable size of that offering (e.g., request 5 instances for both P1v3 and P3v3).
- As with other quotas, you can filter by region, subscription, provider, or usage. You can also group the results by usage, quota (App Service VM type), or location (region).
- Current usage is represented as App Service VMs. This allows you to quickly identify which SKUs are nearing their quota limits.
- Adjustments can be made inline: no need to visit another page. This is covered in detail in the next section.

3. Request quota adjustments

Clicking the pen icon opens a flyout window to capture the quota request. The quota type (App Service SKU) is already populated, along with current usage. Note that your request is not incremental: you must specify the new limit that you wish to see reflected in the portal. For example, to request two additional instances of P1v2 VMs, you would request a new limit equal to your current limit plus two. Click Submit to send the request for automatic processing.

How quota approvals work:

- Immediately upon submitting a quota request, you will see a processing dialog.
- If the quota request can be automatically fulfilled, then no support request is needed. You should receive confirmation within a few minutes of submission.
- If the request cannot be automatically fulfilled, for example when the requested new limit exceeds what can be automatically granted for the region, you will be given the option to file a support request with the same information.

4. If applicable, create a support ticket

When creating a support ticket, you will need to repopulate the Region and App Service plan details; the new limit has already been populated for you. If you forget the region or SKU that was requested, you can reference them in your notifications pane.

If you choose to create a support ticket, then you will interact with the capacity management team for that region. This is a 24x7 service, so requests may be created at any time. Once you have filed the support request, you can track its status via the Help + support dashboard.

### Known issues

The self-service quota request experience for App Service is in public preview.
Here are some caveats worth mentioning while the team finalizes the release for general availability:

- Closing the quota request flyout window will stop meaningful notifications for that request. You can still view the outcome of your quota requests by checking actual quota, but if you want to rely on notifications for alerts, we recommend leaving the quota request window open for the few minutes that it is processing.
- Some SKUs are not yet represented in the quota dashboard. These will be added later in the public preview.
- The Activity Log does not currently provide a meaningful summary of previous quota requests and their outcomes. This will also be addressed during the public preview.
- As noted in the walkthrough, the new experience does not enable zone-redundant deployments. Quota is an inherently regional construct, and zone-redundant enablement requires a separate step that can only be taken in response to a support ticket being filed.
- Quota API documentation is being drafted to enable bulk non-zone-redundant quota requests without requiring you to file a support ticket.

### Filing a Support Ticket

If your deployment requires zone redundancy or contains many subscriptions, then we recommend filing a support ticket with issue type "Technical" and problem type "Quota".

### We want your feedback!

If you notice any aspect of the experience that does not work as expected, or you have feedback on how to make it better, please use the comments below to share your thoughts!
## Build an AI Image-Caption Generator on Azure App Service with Streamlit and GPT-4o-mini

This tiny app just does one thing: upload an image → get a natural one-line caption. Under the hood:

- Azure AI Vision extracts high-confidence tags from the image.
- Azure OpenAI (GPT-4o-mini) turns those tags into a fluent caption.
- Streamlit provides a lightweight, Python-native UI so you can ship fast.

All code + infra templates: image_caption_app in the App Service AI Samples repo: https://github.com/Azure-Samples/appservice-ai-samples/tree/main/image_caption_app

### What are these components?

What is Streamlit? An open-source Python framework to build interactive data/AI apps with just a few lines of code - perfect for quick, clean UIs.

What is Azure AI Vision (Vision API)? A cloud service that analyzes images and returns rich signals like tags with confidence scores, which we use as grounded inputs for captioning.

### How it works (at a glance)

1. User uploads a photo in Streamlit.
2. The app calls Azure AI Vision → gets a list of tags (keeps only high-confidence ones).
3. The app sends those tags to GPT-4o-mini → generates a one-line caption.
4. The caption is shown instantly in the browser.

### Prerequisites

- Azure subscription — https://azure.microsoft.com/en-us/pricing/purchase-options/azure-account
- Azure CLI — https://learn.microsoft.com/azure/cli/azure/install-azure-cli-linux
- Azure Developer CLI (azd) — https://learn.microsoft.com/azure/developer/azure-developer-cli/install-azd
- Python 3.10+ — https://www.python.org/downloads/
- Visual Studio Code (optional) — https://code.visualstudio.com/download
- Streamlit (optional for local runs) — https://docs.streamlit.io/get-started/installation
- Managed Identity on App Service (recommended) — https://learn.microsoft.com/azure/app-service/overview-managed-identity

### Resources you'll deploy

You can create everything manually or with the provided azd template. What you need:

- Azure App Service (Linux) to host the Streamlit app.
- Azure AI Foundry/OpenAI with a gpt-4o-mini deployment for caption generation.
- Azure AI Vision (Computer Vision) for image tagging.
- Managed Identity enabled on the Web App, with RBAC grants so the app can call Vision and OpenAI without secrets.

### One-command deploy with azd (recommended)

The sample includes infra under image_caption_app/infra so azd up can provision + deploy in one go.

```bash
# 1) Clone and move into the sample
git clone https://github.com/Azure-Samples/appservice-ai-samples
cd appservice-ai-samples/image_caption_app

# 2) Log in and provision + deploy
azd auth login
azd up
```

### Manual path (if you prefer doing it yourself)

1. Create Azure AI Vision, note the endpoint (custom subdomain).
2. Create Azure AI Foundry/OpenAI and deploy gpt-4o-mini.
3. Create App Service (Linux, Python) and enable System-Assigned Managed Identity.
4. Assign roles to the Web App's Managed Identity:
   - Cognitive Services OpenAI User on your OpenAI resource.
   - Cognitive Services User on your Vision resource.
5. Add app settings for endpoints and deployment names (see repo), deploy the code, and run.

(See the CLI sketch at the end of this section for one way to script steps 3 and 4.)

Startup command (manual setting): If you're configuring the Web App yourself (instead of using the Bicep), set the Startup Command to:

```bash
streamlit run app.py --server.port 8000 --server.address 0.0.0.0
```

Portal path: App Service → Configuration → General settings → Startup Command.

CLI example:

```bash
az webapp config set \
  --name <your-webapp-name> \
  --resource-group <your-rg> \
  --startup-file "streamlit run app.py --server.port 8000 --server.address 0.0.0.0"
```

(The provided Bicep template already sets this for you.)
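For reference, here is one way steps 3 and 4 of the manual path might look from the Azure CLI. The resource names are placeholders; the role names match those called out above, and the principal ID comes from the identity assignment.

```bash
# Enable a system-assigned managed identity on the web app and capture its principal ID
PRINCIPAL_ID=$(az webapp identity assign \
  --resource-group my-rg \
  --name my-caption-app \
  --query principalId -o tsv)

# Look up the resource IDs of the OpenAI and Vision accounts (placeholder names)
OPENAI_ID=$(az cognitiveservices account show -g my-rg -n my-openai --query id -o tsv)
VISION_ID=$(az cognitiveservices account show -g my-rg -n my-vision --query id -o tsv)

# Grant the web app's identity access to each service - no keys required
az role assignment create --assignee "$PRINCIPAL_ID" \
  --role "Cognitive Services OpenAI User" --scope "$OPENAI_ID"
az role assignment create --assignee "$PRINCIPAL_ID" \
  --role "Cognitive Services User" --scope "$VISION_ID"
```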
### Code tour (the important bits)

Top-level flow (app.py). First we get tags from Vision, then ask GPT-4o-mini for a one-liner:

```python
tags = extract_tags(image_bytes)
caption = generate_caption(tags)
```

Vision call (utils/vision.py). Call the Vision REST API, parse the JSON, and keep high-confidence tags (> 0.6):

```python
response = requests.post(
    VISION_API_URL,
    headers=headers,
    params=PARAMS,
    data=image_bytes,
    timeout=30,
)
response.raise_for_status()
analysis = response.json()

tags = [
    t.get('name')
    for t in analysis.get('tags', [])
    if t.get('name') and t.get('confidence', 0) > 0.6
]
```

Caption generation (utils/openai_caption.py). Join the tags and ask GPT-4o-mini for a natural caption:

```python
tag_text = ", ".join(tags)
prompt = f"""
You are an assistant that generates vivid, natural-sounding captions for images.
Create a one-line caption for an image that contains the following: {tag_text}.
"""

response = client.chat.completions.create(
    model=DEPLOYMENT_NAME,
    messages=[
        {"role": "system", "content": "You are a helpful AI assistant."},
        {"role": "user", "content": prompt.strip()}
    ],
    max_tokens=60,
    temperature=0.7
)
return response.choices[0].message.content.strip()
```

### Security & auth: Managed Identity by default (recommended)

This sample ships to use Managed Identity on App Service - no keys in config. The Web App's Managed Identity authenticates to Vision and Azure OpenAI via Microsoft Entra ID. Prefer Managed Identity in production; if you need to test locally, you can switch to key-based auth by supplying the service keys in your environment.

### Run it locally (optional)

```bash
# From the sample folder
python -m venv .venv && source .venv/bin/activate   # Windows: .venv\Scripts\activate
pip install -r requirements.txt

# Set env vars for endpoints + deployment (and keys if not using MI locally)
streamlit run app.py
```

### Repo map

- App + Streamlit UI + helpers: image_caption_app/
- Bicep infrastructure (used by azd up): image_caption_app/infra/

### What's next — ways to extend this sample

- Richer vision signals: Add object detection, OCR, or brand detection; blend those into the prompt for sharper captions.
- Persistence & gallery: Save images to Blob Storage and captions/metadata to Cosmos DB or SQLite; add a Streamlit gallery.
- Performance & cost: Cache tags by image hash; cap image size; track tokens/latency.
- Observability: Wire up Application Insights with custom events (e.g., caption_generated).

Looking for more Python samples? Check out the repo: https://github.com/Azure-Samples/appservice-ai-samples/tree/main

For more Azure App Service AI samples and best practices, check out the Azure App Service AI integration documentation.
## Azure App Service Premium v4 plan is now in public preview

The Azure App Service Premium v4 plan is the latest offering in the Azure App Service family, designed to deliver enhanced performance, scalability, and cost efficiency. We are excited to announce the public preview of this major upgrade to one of our most popular services.

Key benefits:

- Fully managed platform-as-a-service (PaaS) to run your favorite web stack, on both Windows and Linux.
- Built using next-gen Azure hardware for higher performance and reliability.
- Lower total cost of ownership with new pricing tailored for large-scale app modernization projects.
- and more to come!

[Note: As of September 1st, 2025 Premium v4 is Generally Available on Azure App Service! See the launch blog for more details!]

### Fully managed platform-as-a-service (PaaS)

As the next generation of one of the leading PaaS solutions, Premium v4 abstracts infrastructure management, allowing businesses to focus on application development rather than server maintenance. This reduces operational overhead, as tasks like patching, load balancing, and auto-scaling are handled automatically by Azure, saving time and IT resources. App Service's auto-scaling optimizes costs by adjusting resources based on demand and saves you the cost and overhead of under- or over-provisioning. Modernizing applications with PaaS delivers a compelling economic impact by helping you eliminate legacy inefficiencies, decrease long-term costs, and increase your competitive agility through seamless cloud integration, CI/CD pipelines, and support for multiple languages.

### Higher performance and reliability

Built on the latest Dadsv6 / Eadsv6 series virtual machines and NVMe-based temporary storage, the App Service Premium v4 plan offers higher performance compared to previous Premium generations. According to preliminary measurements during private preview, you may expect to see:

- >25% performance improvement using Pmv4 plans, relative to the prior generation of memory-optimized Pmv3 plans
- >50% performance improvement using Pv4 plans, relative to the prior generation of non-memory-optimized Pv3 plans

Please note that these features and metrics are preliminary and subject to change during public preview.

Premium v4 provides a similar line-up to Premium v3, with four non-memory-optimized options (P0v4 through P3v4) and five memory-optimized options (P1mv4 through P5mv4). Features like deployment slots, integrated monitoring, and enhanced global zone resilience further enhance the reliability and user experience, improving customer satisfaction.

### Lower total cost of ownership (TCO)

Driven by the urgency to adopt generative AI and to stay competitive, application modernization has rapidly become one of the top priorities in boardrooms everywhere. Whether you are a large enterprise or a small shop running your web apps in the cloud, you will find App Service Premium v4 is designed to offer you the most compelling performance-per-dollar compared to previous generations, making it an ideal managed solution to run high-demand applications.

Using the agentic AI GitHub Copilot app modernization tooling announced in preview at Microsoft Build 2025, you can save up to 24% when you upgrade and modernize your .NET web apps running on Windows Server to Azure App Service for Windows on Premium v4 compared with Premium v3. You will also be able to use deeper commitment-based discounts such as reserved instances and savings plan for Premium v4 when the service is generally available (GA).
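If you'd like to try Premium v4 from the command line once it's available in your region, a minimal sketch might look like the following. The resource names are placeholders, and the SKU string is an assumption based on the naming used for earlier Premium tiers (P1V3 and so on); confirm the accepted values with `az appservice plan create --help` and the regional availability list in the docs.

```bash
# Create a Linux Premium v4 plan (SKU name assumed to follow the P1V3-style convention)
az appservice plan create \
  --resource-group my-rg \
  --name my-pv4-plan \
  --is-linux \
  --sku P1V4

# Create a web app on the new plan (runtime string shown is an example value)
az webapp create \
  --resource-group my-rg \
  --plan my-pv4-plan \
  --name my-pv4-app \
  --runtime "DOTNETCORE:8.0"
```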
For more detailed pricing on the various CPU and memory options, see the pricing pages for Windows and Linux as well as the Azure Pricing Calculator.

### Get started

The preview will roll out globally over the next few weeks. Premium v4 is currently available in the following regions [updated 08/22/2025]:

Australia East, Canada Central, Central US, East US, East US 2, France Central, Japan East, Korea Central, North Central US, North Europe, Norway East, Poland Central, Southeast Asia, Sweden Central, Switzerland North, UK South, West Central US, West Europe, West US, West US 3

App Service is continuing to expand the Premium v4 footprint, with many additional regions planned to come online over the coming weeks and months. Customers can reference the product documentation for details on how to configure Premium v4 as well as a regularly updated list of regional availability.

We encourage you to start assessing your apps using the partners and tools for Azure App Modernization, start using Premium v4 to better understand the benefits and capabilities, and build a plan to hit the ground running when the service is generally available. Watch this space for more information on GA!

### Key Resources

- Microsoft Build 2025 on-demand session: https://aka.ms/Build25/BRK200
- Azure App Service documentation: https://aka.ms/AppService/Pv4docs
- Azure App Service web page: https://www.azure.com/appservice

### Join the Community

- Standups: https://aka.ms/video/AppService/community
- Follow us on X: @AzAppService
## Announcing the Public Preview of the New Hybrid Connection Manager (HCM)

Update May 28, 2025: The new Hybrid Connection Manager is now Generally Available. The download links shared in this post will give you the latest Generally Available version. Learn more

### Key Features and Improvements

The new version of HCM introduces several enhancements aimed at improving usability, performance, and security:

- Cross-Platform Compatibility: The new HCM is now supported on both Windows and Linux clients, allowing for seamless management of hybrid connections across different platforms and providing users with greater flexibility and control.
- Enhanced User Interface: We have redesigned the GUI to offer a more intuitive and efficient user experience. In addition to a new and more accessible GUI, we have also introduced a CLI that includes all the functionality needed to manage connections, especially for our Linux customers who may solely use a CLI to manage their workloads.
- Improved Visibility: The new version offers enhanced logging and connection testing, which provides greater insight into connections and simplifies debugging.

### Getting Started

To get started with the new Hybrid Connection Manager, follow these steps:

Requirements:

- Windows clients must have ports 4999-5001 available
- Linux clients must have port 5001 available

Download and Install:

The new HCM can be downloaded from the following links. Ensure you download the version that corresponds to your client. If you are new to the HCM, check out the existing documentation to learn more about the product and how to get started.

If you are an existing Windows user, installing the new Windows version will automatically upgrade your existing version to the new version, and all your existing connections will be automatically ported over. There is no automated migration path from the Windows to the Linux version at this time.

Windows download: download the MSI package and follow the installation instructions.

Linux download: from your terminal running as administrator, follow these steps:

```bash
sudo apt update
sudo apt install tar gzip build-essential
sudo wget "https://download.microsoft.com/download/HybridConnectionManager-Linux.tar.gz"
sudo tar -xf HybridConnectionManager-Linux.tar.gz
cd HybridConnectionManager/
sudo chmod 755 setup.sh
sudo ./setup.sh
```

Once that is finished, your HCM is ready to be used:

- Run `hcm help` to see the available commands.
- For interactive mode, you will need to install and log in to the Azure CLI. Authentication from the HCM to Azure is done using this credential.
  - Install the Azure CLI: `curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash`
  - Run `az login` and follow the prompts.
- Add your first connection by running `hcm add`.

Configure Your Connections: Use the GUI or the CLI to add hybrid connections to your local machine.

Manage Your Connections: Use the GUI or the CLI with the `hcm list` and `hcm remove` commands to manage your hybrid connections efficiently. Detailed help texts are available for each command to assist you.

### Join the Preview

We invite you to join the public preview and provide your valuable feedback. Your insights will help us refine and improve the Hybrid Connection Manager to better meet your needs.

### Feedback and Support

If you encounter any issues or have suggestions, please reach out to hcmsupport@service.microsoft.com or leave a comment on this post. We are committed to ensuring a smooth and productive experience with the new HCM. Detailed documentation and guidance will be available in the coming weeks as we get closer to General Availability (GA).
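On the Azure side, the hybrid connection itself still lives in Azure Relay and is attached to your web app the same way as before. If you prefer scripting that part too, a sketch with the Azure CLI might look like the following (the names, endpoint host, port, and the metadata format are placeholders and assumptions; the portal flow sets the endpoint for you). After this, `hcm add` on the on-premises machine completes the link.

```bash
# Create a Relay namespace and a hybrid connection; the endpoint host:port is
# stored as user metadata (format shown follows the portal's convention - verify
# against an existing connection in your namespace)
az relay namespace create \
  --resource-group my-rg \
  --name my-relay-ns
az relay hyco create \
  --resource-group my-rg \
  --namespace-name my-relay-ns \
  --name my-sql-hc \
  --user-metadata '[{"key":"endpoint","value":"onprem-sql:1433"}]'

# Attach the hybrid connection to the web app
az webapp hybrid-connection add \
  --resource-group my-rg \
  --name my-webapp \
  --namespace my-relay-ns \
  --hybrid-connection my-sql-hc
```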
Thank you for your continued support and collaboration. We look forward to hearing your thoughts and feedback on this exciting new release.