Recent Discussions
Guidance on using Azure Virtual Desktop
We are looking for a VDI solution and chose AVD. I have gone through the MS documentation, but things are quite confusing and complicated. We have a team of 10 developers who mainly work on BizTalk and another 10 developers who do Salesforce and other development work. They use Visual Studio, VS Code, BizTalk Server, and various other applications and services for their regular work. Initially we considered creating Azure VMs (20 VMs, one for each developer: a Windows Server 2019 image for the BizTalk devs and Windows 11 for the other devs), generalizing a VM, and spinning up new VMs from the captured generalized image. But the question was raised: why can't we use Azure Virtual Desktop? As I learned, there are two types of host pools: Pooled (multi-session) and Personal (a host VM directly assigned to each individual developer). These AVD host pool VMs should be able to work with external SQL Servers, and all 20 VMs in AVD should be able to access them. How is this possible, and what should the connection string be for granting access to external SQL Servers? As I understand it, if I opt for pooled host pools, all of the developers can use the multi-session hosts (whatever number of VMs we choose, e.g. 5 VMs, the sessions will be distributed) and users will be assigned to session hosts by the load-balancing algorithm. Similarly, if I choose personal (10 Windows Server 2019 plus another 10 Windows 11), I need to create 20 VMs and assign them individually to each of the 20 developers. Is my understanding correct? Should I create 2 different workspaces and 2 application groups? I also learned that if I use FSLogix along with Azure Virtual Desktop (AVD) I can manage users with profile management; will this actually help developers who use VS Code, BizTalk Server, and other Azure services for their daily work? Based on all the facts above, please help me understand which option suits the requirement best. If I choose Azure Virtual Desktop over Azure VMs or vice versa, what are the benefits and disadvantages of AVD? Is it more cost effective to use pooled or personal AVD host pool VMs or plain Azure VMs, and which one suits us better? Experts, please help me with all this confusion. Kind Regards
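A side note on the external SQL Server question above: AVD session hosts reach an external SQL Server the same way any other Windows client does, so the connection string itself is the standard SQL Server format; what is AVD-specific is only the network path (VNet peering/routing, NSG and SQL firewall rules, port 1433). A minimal sketch, assuming the Microsoft.Data.SqlClient package and purely hypothetical server, database, and login names:

Code sketch (C#):

using System;
using Microsoft.Data.SqlClient;

// Hypothetical values: replace server, database, and credentials with your own.
// Prefer Entra ID authentication or a secrets store over inline passwords.
var connectionString =
    "Server=tcp:sql01.contoso.com,1433;" +
    "Database=AppDb;" +
    "User ID=biztalk_svc;Password=<secret>;" +
    "Encrypt=True;TrustServerCertificate=False;";

using var connection = new SqlConnection(connectionString);
connection.Open();  // succeeds only if the session host has network line of sight to the SQL Server
Console.WriteLine($"Connected. SQL Server version: {connection.ServerVersion}");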
API Management service secure configuration for Standard v2 SKU
Hi all, I am transitioning an API Management gateway from the Developer SKU to something production-ready. The Standard v2 SKU is the first tier that supports VNet integration, which we require. The OWASP API security framework used by MS recommends that external connectivity to service configuration endpoints be disabled. However, direct Management API access is not supported in this tier (and the PowerShell cmdlets are just wrappers for API calls), so it seems it is not possible to disable public access to these endpoints in this SKU. Is this the case? And if so, how is it possible to safely configure an API Management gateway using this SKU? Any pointers appreciated... Chris
How can I monitor Entra Connect Health Sync?
Hello, how can I monitor Entra Connect Health sync events and get alerts on failures? I have set it up to be alerted to events in the Entra portal, but I only get a summary email and not instant notifications. I want to be informed if there is a loss of sync between on-premises and Entra, or with SSPR. Is this possible beyond what MS gives us in the portal? Thanks
Microsoft Entra ID App not accessible to other organisations
Hi all, I have an external PHP-based application that allows users to send emails via the Microsoft Graph API instead of SMTP, which will be deprecated soon. For this, I registered an application under Entra App registrations with the delegated permission scopes Mail.Send and offline_access. The app is configured to allow all types of Microsoft accounts, and during testing with personal and internal company Microsoft accounts everything works perfectly. I'm using the authorization code flow for authentication:
1. The user logs in.
2. The user consents to the required permissions (prompt=consent).
3. The user is redirected back to the application with an access and refresh token.
However, when a user with an external company (organizational) account tries to authorize the app, they encounter the following error after logging in: AADSTS650053: The application 'My App Name' requested scope 'offline_access, Mail.Send' which doesn't exist on the resource '00000003-0000-0000-c000-000000000000'. Contact the app vendor. I've tried various configurations but haven't been able to resolve the issue. My question is: does the external company tenant require any specific configuration, or am I missing something on my end? How can I get this working for organizational accounts if it's already functioning for personal Microsoft accounts? Any help or suggestions would be greatly appreciated. Best regards, Ricardas Kauneckas
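One thing worth double-checking here (an assumption on my part, since the actual request isn't shown): Mail.Send and offline_access do exist on Microsoft Graph (resource 00000003-0000-0000-c000-000000000000), and the error text quotes the scope as 'offline_access, Mail.Send' with a comma, which often points to a malformed scope parameter rather than missing permissions. Scopes must be space-separated, and resource-specific scopes are usually sent fully qualified. A minimal sketch of building the authorize URL that way; tenant, client ID and redirect URI are placeholders:

Code sketch (C#):

using System;

// Placeholder values: substitute your own app registration details.
var tenant = "common";                         // multi-tenant + personal Microsoft accounts
var clientId = "<your-application-client-id>";
var redirectUri = Uri.EscapeDataString("https://yourapp.example.com/callback");

// Space-separated scopes; Mail.Send is qualified with the Graph resource,
// offline_access is an OpenID Connect scope and needs no prefix.
var scope = Uri.EscapeDataString("https://graph.microsoft.com/Mail.Send offline_access");

var authorizeUrl =
    $"https://login.microsoftonline.com/{tenant}/oauth2/v2.0/authorize" +
    $"?client_id={clientId}&response_type=code&redirect_uri={redirectUri}" +
    $"&response_mode=query&scope={scope}&prompt=consent";

Console.WriteLine(authorizeUrl);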
Azure Arc - SQL Server instances - Error SqlServerDatabases_Update
Hi, I've been getting intermittent errors on the SQL section of Arc for a few days now. The section also becomes inactive for me; then for a few minutes it seems to come back up, and then it becomes inactive again. Do any of you have similar problems? For the machine management part via Arc, on the other hand, there is no problem.
Microsoft Defender External Attack Surface Management (EASM)
Good day everyone, I need assistance with Microsoft Defender EASM. We are looking to pull executive or custom reports that highlight all vulnerabilities, etc. Can anyone point me in the right direction on how to achieve this? Thanking you in advance.
Azure Stack HCI - adding NIC to Network ATC Intent
Hello fellow Azure Stack HCI admins, I currently have a compute_management intent for my two-node switchless cluster with only one pNIC per node (port1) assigned. I now want to add port2 of each node to the intent for redundancy. (See image) I found this command set on Learn: https://learn.microsoft.com/en-us/azure-stack/hci/manage/manage-network-atc?tabs=21H2&pivots=azure-stack-hci

Get-NetIntentStatus -ClusterName <YourClusterName>
Update-NetIntentAdapter -IntentName <YourIntentName> -AdapterName pNIC2
Get-NetIntentStatus -ClusterName <YourClusterName>

Is that the correct command set to achieve that, or is there another way to do so? I am a bit anxious about running those commands because I already have workloads deployed on that cluster and I don't want to break it. Thanks in advance! :) PS: I chose only one NIC per node because at the time of deployment I only had one ToR switch available.
Unused Enterprise applications
I inherited an Azure/Entra AD domain with a crazy number of Enterprise applications configured, some going back 5, 6 or more years. Practically all of them are configured to not require user assignment, so I have no idea who might be using these, if they are being used at all. Is there a way to determine the last time any of these were actually used? I want to get rid of anything that doesn't need to be there.
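One way to approximate "last used" (a sketch, not a complete answer, and it assumes sign-in logs are retained or exported for the window you care about): query the Entra sign-in logs through Microsoft Graph filtered on the application's appId and look at the most recent entries. Requires a token with AuditLog.Read.All; the appId below is a placeholder:

Code sketch (C#):

using System;
using System.Net.Http;
using System.Net.Http.Headers;

// Assumes you have already acquired a Microsoft Graph access token with AuditLog.Read.All.
var accessToken = "<access-token>";
var appId = "00000000-0000-0000-0000-000000000000"; // placeholder: the enterprise app's appId

using var http = new HttpClient();
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

// Recent interactive user sign-ins to this application, if any exist in the retained logs.
var filter = Uri.EscapeDataString($"appId eq '{appId}'");
var url = $"https://graph.microsoft.com/v1.0/auditLogs/signIns?$filter={filter}&$top=5";

var response = await http.GetAsync(url);
Console.WriteLine(await response.Content.ReadAsStringAsync());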
Golden image VM fails Intune enrolment. AVD host pool VMs not enrolling in Intune
Hi Team, I need some assistance. I'm trying to create a golden image for AVD host pool VMs, and I observed that VMs provisioned from this image are not enrolling in Intune. Here are the steps I followed:
1. Created an Azure VM
2. Installed and prepared the required software
3. Disabled BitLocker (as recommended for Sysprep)
4. Ran Sysprep
5. Captured the VM image, saved it, and deleted the VM
The VMs created using this image are successfully joined to Entra ID, and I am able to log in. However, the host pool VMs are not enrolling in Intune when the host pool and its VMs are created. Am I missing any Group Policy settings or registry configurations related to Intune auto-enrollment before running Sysprep? Do I need to install any extensions, add-ons, or tools before running Sysprep? Thank you! VC
Audit user accessing enterprise App by SPN sign-in
I'm in a hybrid Entra ID environment. Some users can use an "Enterprise Application" by utilizing its IDs and a certificate. In the activity or sign-in logs, I can find the access entries, but I don't have information on which user used the app registration or which certificate was used. I would like logs that allow me to identify WHO is using an SPN/app registration. Do you have any ideas? Thank you. Here is an example: in this screenshot, I can see access made to an app using, for example, an appid + secret/certificate connection. So it's "logical" not to see a username, since one isn't required for this type of connection. However, I would really like to have this information or some indicator to identify which of my users accessed it. Currently, I only have the machine's IP address, but I would like more information. Maybe in Purview or with another service, but I haven't found anything.
Azure GIT - Looking for a REST API to retrieve commits/PRs between two tags
Hi, I am looking for a REST API to retrieve the list of commits/PRs between two given tags. For example, assume I have one tag named release-202410 from the main branch and another tag named release-202411 from the dev branch. I want to get the list of commits that happened between release-202410 and release-202411; basically, the idea is to get the list of new changes in release-202411, commit- and PR-wise. A workaround would also be fine if there is no out-of-the-box feature for this.
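For what it's worth, the closest out-of-the-box fit I'm aware of is the Azure Repos "Get Commits" REST API, which accepts a searchCriteria.compareVersion alongside searchCriteria.itemVersion, and both can be typed as tags; I haven't verified the exact "between" semantics for your branching layout, so treat this as a starting point rather than a confirmed answer. A sketch with placeholder organization, project, repository, and PAT values:

Code sketch (C#):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

// Placeholder values: substitute your organization, project, repository, and PAT.
var org = "myorg";
var project = "myproject";
var repo = "myrepo";
var pat = "<personal-access-token>";

using var http = new HttpClient();
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
    "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes($":{pat}")));

// Commits between the two tags (itemVersion vs. compareVersion, both typed as tags).
var url = $"https://dev.azure.com/{org}/{project}/_apis/git/repositories/{repo}/commits" +
          "?searchCriteria.itemVersion.versionType=tag&searchCriteria.itemVersion.version=release-202410" +
          "&searchCriteria.compareVersion.versionType=tag&searchCriteria.compareVersion.version=release-202411" +
          "&api-version=7.1";

var json = await http.GetStringAsync(url);
Console.WriteLine(json); // commit IDs from the response can then be correlated to PRs separately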
Issue with Media Playback in Azure Communication Services Using Python
Context: We are building a bot using Azure Communication Services (ACS) and Azure Speech Services to handle phone calls. The bot uses text-to-speech (TTS) to play questions during calls and captures user responses.
What We've Done: Created an ACS instance and acquired an active phone number. Set up an event subscription to handle the callback for incoming calls. Integrated Azure Speech Services for TTS using Python.
Achievements: Successfully connected calls using ACS. Generated TTS audio files for trial questions.
Challenges: The converted TTS audio files are not playing during the call. The playback method does not raise errors, but no audio is heard on the call.
Help Needed: Are there specific requirements for media playback using the ACS SDK for Python? How can we debug why the audio is not playing despite being hosted on a public URL?
Additional Context: Using Python 3.12.6 and the Azure Communication Services Python SDK. The audio files are hosted on a local server and accessible via public URLs.
Steps Followed:
1. Caller initiates a call: someone calls the phone number linked to my ACS resource.
2. ACS sends an incoming call event: ACS sends a Microsoft.Communication.IncomingCall event to my /calling-events endpoint.
3. Application answers the call: my Flask app receives the event and answers the call using the incomingCallContext.
4. Call connected event: once the call is established, ACS sends a Microsoft.Communication.CallConnected event.
5. Start interaction: I start the conversation by playing a welcome message to the caller.
6. Play audio messages: the Excel question text gets converted to speech using the Azure text-to-speech API from Azure Speech Service. The converted speech is stored as .wav files. These .wav files need to be hosted on a publicly accessible URL so that ACS can access them and play them on the call.
7. Handle user input: after the question is played, if speech recognition is implemented, the bot listens for and processes the caller's speech input.
8. End the call: after the conversation, the bot plays a goodbye message and hangs up.
9. Clean up: the bot handles the CallDisconnected event to clean up any resources or state.
Code Snippet (Python):

from azure.communication.callautomation import FileSource

# call_automation_client is assumed to be an initialized CallAutomationClient (created elsewhere).
def play_audio(call_connection_id, audio_file_path):
    try:
        # The URL must be reachable by the ACS service over the public internet,
        # and the file should be a WAV format ACS supports (typically 16 kHz, 16-bit, mono PCM).
        audio_url = f"http://example.com/{audio_file_path}"  # publicly accessible URL
        call_connection = call_automation_client.get_call_connection(call_connection_id)
        file_source = FileSource(url=audio_url)
        # play_to expects participant identifiers, not a boolean; to play to everyone
        # on the call, play_media_to_all is the simpler call.
        call_connection.play_media_to_all(file_source)
        print(f"Playing audio: {audio_url}")
    except Exception as e:
        print(f"Error playing audio: {e}")
Issue with Speech-to-Text Integration in Azure Communication Services Using C#
Context: We are building a bot using Azure Communication Services (ACS) and Azure Speech Services to handle phone calls. The bot asks questions (via TTS) and captures user responses using speech-to-text (STT).
What We've Done: Created an ACS instance and acquired an active phone number. Set up an event subscription to handle callbacks for incoming calls. Integrated Azure Speech Services for STT in C#.
Achievements: Successfully connected calls using ACS. Played TTS prompts generated from an Excel file.
Challenges: User responses are not being captured. Despite setting InitialSilenceTimeout to 10 seconds, the bot skips to the next question after 1-2 seconds without recognizing speech. The bot does not reprompt the user even when no response is detected.
Help Needed: How can we ensure accurate real-time speech-to-text capture during ACS telephony calls? Are there better configurations or alternate approaches for speech recognition in ACS?
Additional Context: Following the official ACS C# sample. Using Azure Speech Services and ACS SDKs.
Code Snippet (C#):

// Recognize user speech
async Task<string> RecognizeSpeechAsync(CallMedia callConnectionMedia, string callerId, ILogger logger)
{
    // Configure recognition options
    var recognizeOptions = new CallMediaRecognizeSpeechOptions(
        targetParticipant: CommunicationIdentifier.FromRawId(callerId))
    {
        InitialSilenceTimeout = TimeSpan.FromSeconds(10), // wait up to 10 seconds for the user to start speaking
        EndSilenceTimeout = TimeSpan.FromSeconds(5),      // wait up to 5 seconds of silence before considering the response complete
        OperationContext = "SpeechRecognition"
    };

    try
    {
        // Start speech recognition. Note: this only STARTS the operation; the recognized
        // text arrives later as a RecognizeCompleted (or RecognizeFailed) event on the
        // callback endpoint, which is likely why this method appears to finish after 1-2 seconds.
        var result = await callConnectionMedia.StartRecognizingAsync(recognizeOptions);

        // Handle recognition start success
        if (result is Response<StartRecognizingCallMediaResult>)
        {
            logger.LogInformation($"Result: {result}");
            logger.LogInformation("Recognition started successfully.");

            // Placeholder: the real transcript must be read from the RecognizeCompleted event payload.
            return "User response captured";
        }

        logger.LogWarning("Recognition failed or timed out.");
        return string.Empty; // return empty if recognition fails
    }
    catch (Exception ex)
    {
        logger.LogError($"Error during speech recognition: {ex.Message}");
        return string.Empty;
    }
}
6 resets due to hacks from an azure user with the last name of Jenkins this week. Please help.
I am just a home user. Five days ago I went to upgrade my computer to W11 and it took about two hours. When it was done I noticed it looked wrong; I had seen 11 before, and I checked for certain features that were not on it. I wiped my disk, did a factory HP cloud recovery, and went to upgrade again. It still did not work. I got my daughter's computer (this one) and downloaded a copy of W11 to do a fresh install. I had to wait until morning because I had to sleep. When I awoke, my daughter's system had been affected. I noticed because my computer would never accept a USB copy after four separate downloads. Anyway, someone keeps setting my system to domain user, changing the firewall, making settings not work, and using Remote Desktop, and believe it or not they have somehow managed to access it over Bluetooth that does not belong to them. They have hijacked my printing operations and I am now determined to press charges, but I need to fix these two systems. I am old and not super skilled at this. I managed to activate and hide a set of logs, which is how I found their name and methods.
Azure API Management Gateway - RBAC on the API level
Is it possible to grant access at the level of individual APIs, making users able to see some APIs but not others inside the same Azure API Management gateway? For example: User1 can manage the green ones, but not the red ones. Thanks.
Azure Course Blueprints
Overview
The Course Blueprint is a comprehensive visual guide to the Azure ecosystem, integrating all the resources, tools, structures, and connections covered in the course into one inclusive diagram. It enables students to map out and understand the elements they have studied, providing a clear picture of their place within the larger Azure ecosystem. It serves as a 1:1 representation of all the topics officially covered in the instructor-led training.
Links: Each icon in the blueprint has a hyperlink to the pertinent document in the learning path on Learn.
Layers: You have the capability to filter layers to concentrate on segments of the course by modules, e.g. just day 1 of AZ-104, using filters in Visio and selecting modules 1-3.
Enhanced Integration: The Visio Template+ for expert courses such as SC-100 and AZ-305 now features an additional layer that allows you to compare SC-100, AZ-500, and SC-300 within the same diagram. Similarly, you can compare AZ-305, AZ-204, and AZ-104 to identify differences and study gaps. Since SC-300 and AZ-500 are potential prerequisites for SC-100, and AZ-204 or AZ-104 for AZ-305, this comparison is particularly useful for understanding the extra knowledge or skills required to advance to the next level.
Advantages for Students
Defined Goals: The blueprint presents learners with a clear vision of what they are expected to master and achieve by the course's end.
Focused Learning: By spotlighting the course content and learning targets, it steers learners' efforts towards essential areas, leading to more productive learning.
Progress Tracking: The blueprint allows learners to track their advancement and assess their command of the course material.
New Feature: A comprehensive list of topics for each slide deck is now available in a downloadable .xlsx file. Each entry includes a link to Learn and its dependencies.
Download links
Associate Level | PDF | Visio | Released | Updated | Contents
AZ-104 Azure Administrator Associate | Blueprint [PDF] | Template | 12/14/2023 | 10/28/2024 | Contents
AZ-204 Azure Developer Associate | Blueprint [PDF] | Template | 11/05/2024 | 11/11/2024 | Contents
AZ-500 Azure Security Engineer Associate | Blueprint [PDF] | Template+ | 01/09/2024 | 10/10/2024 | Contents
AZ-700 Azure Network Engineer Associate | Blueprint [PDF] | Template | 01/25/2024 | 11/04/2024 | Contents
SC-300 Identity and Access Administrator Associate | Blueprint [PDF] | Template | 10/10/2024 | | Contents
Specialty | PDF | Visio | Released | Updated
AZ-140 Azure Virtual Desktop Specialty | Blueprint [PDF] | Template | 01/03/2024 | 02/05/2024
Expert level | PDF | Visio | Released | Updated | Contents
AZ-305 Designing Microsoft Azure Infrastructure Solutions | Blueprint [PDF] | Template+ AZ-104 AZ-204 AZ-700 | 05/07/2024 | 11/18/2024 | Contents
SC-100 Microsoft Cybersecurity Architect | Blueprint [PDF] | Template+ AZ-500 SC-300 | 10/10/2024 | | Contents
Skill based Credentialing | PDF | Visio | Released | Updated | Contents
AZ-1002 Configure secure access to your workloads using Azure virtual networking | Blueprint [PDF] | Template | 05/27/2024 | | Contents
AZ-1003 Secure storage for Azure Files and Azure Blob Storage | Blueprint [PDF] | Template | 02/07/2024 | 02/05/2024 | Contents
Benefits for Trainers: Trainers can follow this plan to design a tailored diagram for their course, filled with notes. They can construct this comprehensive diagram during class on a whiteboard and continuously add to it in each session. This evolving visual aid can be shared with students to enhance their grasp of the subject matter.
Introduction to Course Blueprint for Trainers [10 minutes + comments]
Real life demo: AZ-104 Advanced Networking section [3 minutes]
Visio stencils: Azure icons - Azure Architecture Center | Microsoft Learn
Subscribe if you want to get notified of any update, like new releases or updates.
My email: ilan.nyska@microsoft.com
LinkedIn: https://www.linkedin.com/in/ilan-nyska/
Celebrating 30,000 Downloads! Please consider sharing your anonymous feedback <-- [~ 40 seconds to complete]
Azure VMWare (AVS) Cost Optimization Using Azure Migrate Tool
What is AVS?
Azure VMware Solution provides private clouds that contain VMware vSphere clusters built from dedicated bare-metal Azure infrastructure. Azure VMware Solution is available in Azure Commercial and Azure Government. The minimum initial deployment is three hosts, with the option to add more hosts, up to a maximum of 16 hosts per cluster. All provisioned private clouds have VMware vCenter Server, VMware vSAN, VMware vSphere, and VMware NSX. As a result, you can migrate workloads from your on-premises environments, deploy new virtual machines (VMs), and consume Azure services from your private clouds.
Learn More: https://learn.microsoft.com/en-us/azure/azure-vmware/introduction
What is the Azure Migrate Tool?
Azure Migrate is a comprehensive service designed to help you plan and execute your migration to Azure. It provides a unified platform to discover, assess, and migrate your on-premises resources, including servers, databases, web apps, and virtual desktops, to Azure. The tool offers features like dependency analysis, cost estimation, and readiness assessments to ensure a smooth and efficient migration process.
Learn More: https://learn.microsoft.com/en-us/azure/migrate/migrate-services-overview
How can Azure Migrate be used to discover and assess AVS?
Azure Migrate enables the discovery and assessment of Azure VMware Solution (AVS) environments by collecting inventory and performance data from on-premises VMware environments, either through direct integration with vCenter (via the appliance) or by importing data from tools like RVTools. Using Azure Migrate, organizations can analyze the compatibility of their VMware workloads for migration to AVS, assess costs, and evaluate performance requirements. The process involves creating an Azure Migrate project, discovering VMware VMs, and generating assessments that provide insights into resource utilization, right-sizing recommendations, and estimated costs in AVS. This streamlined approach helps plan and execute migrations effectively while ensuring workloads are optimized for the target AVS environment.
Note: We will be using the RVTools import method in this article.
What Is RVTools?
RVTools is a lightweight, free utility designed for VMware administrators to collect, analyze, and export detailed inventory and performance data from VMware vSphere environments. Developed by Rob de Veij, RVTools connects to vCenter or ESXi hosts using VMware's vSphere Management SDK to retrieve comprehensive information about the virtual infrastructure.
Key Features of RVTools:
Inventory Management: Provides detailed information about virtual machines (VMs), hosts, clusters, datastores, networks, and snapshots, including details like VM names, operating systems, IP addresses, resource allocations (CPU, memory, storage), and more.
Performance Insights: Offers visibility into resource utilization, including CPU and memory usage, disk space, and VM states (e.g., powered on/off).
Snapshot Analysis: Identifies unused or orphaned snapshots, helping to optimize storage and reduce overhead.
Export to Excel: Allows users to export all collected data into an Excel spreadsheet (.xlsx) for analysis, reporting, and integration with tools like Azure Migrate.
Health Checks: Identifies configuration issues, such as disconnected hosts, orphaned VMs, or outdated VMware Tools versions.
User-Friendly Interface: Displays information in tabular form across multiple tabs, making it easy to navigate and analyze specific components of the VMware environment.
Hands-on Lab
Disclaimer: The data used for this lab has no relationship with real-world scenarios. This sample data is self-created by the author and is purely for understanding the concept.
To discover and assess your Azure VMware Solution (AVS) environment using an RVTools extract report in the Azure Migrate tool, follow these steps:
Prerequisites
RVTools Setup: Download and install RVTools from the official website. Ensure connectivity to your vCenter server. Extract the data by running RVTools and saving the output as an Excel (.xlsx) file.
Permissions: You need at least the Contributor role on the Azure Migrate project. Ensure that you have appropriate permissions in your vCenter environment to collect inventory and performance data.
File Requirements: The RVTools file must be saved in .xlsx format without renaming or modifying the tabs or column headers.
Procedure
Step 1: Export Data from RVTools
Follow the steps provided on the official website to get the RVTools extract. Sample sheet: please check the attachment included with this article. Note that this is not the complete format; some tabs and columns have been removed for simplicity. During the actual discovery and assessment process, please do not modify the tabs or columns.
Step 2: Discover
Log in to the Azure portal. Navigate to Azure Migrate and select your project or create a new project. Under Migration goals, select Servers, databases and web apps. On the Azure Migrate | Servers, databases and web apps page, under Assessment tools, select Discover and then select Using import. On the Discover page, in File type, select VMware inventory (RVTools XLSX). In the Step 1: Import the file section, select the RVTools XLSX file and then select Import. Wait for the import to complete. Once the import has completed, check for any error messages, rectify them, and re-upload; otherwise wait 10-15 minutes for the imported VMs to appear in the discovery.
Post discovery
Reference Link: https://learn.microsoft.com/en-us/azure/migrate/vmware/tutorial-import-vmware-using-rvtools-xlsx?context=%2Fazure%2Fmigrate%2Fcontext%2Fvmware-context
Step 3: Assess
After the upload is complete, navigate to the Servers tab. Click Assess --> Azure VMware Solution to assess the discovered machines. Edit the assessment settings based on your requirements and save:
Target region: Select the Azure region for the migration.
Node type: Specify the Azure VMware Solution series (e.g., AV36, AV36P).
Pricing model: Select pay-as-you-go or reserved instance pricing.
Discount: Specify any available discounts.
Note: We will be explaining all the parameters in the Optimize section. For now, just review them and leave the parameters as they are.
In Assess Servers, select Next. In Select servers to assess > Assessment name, specify a name for the assessment. In Select or create a group, select Create New and specify a group name. Select the appliance and select the servers you want to add to the group, then select Next. In Review + create assessment, review the assessment details and select Create Assessment to create the group and run the assessment.
Step 4: Review the Assessment
View an assessment: In Windows, Linux and SQL Server > Azure Migrate: Discovery and assessment, select the number next to Azure VMware Solution. In Assessments, select an assessment to open it.
As an example (estimations and costs shown are examples only):
Review the assessment summary. You can select Sizing assumptions to understand the assumptions that went into node sizing and resource utilization calculations. You can also edit the assessment properties or recalculate the assessment.
Step 5: Optimize
We have received a report without any optimization in the previous steps. Now we can follow the steps below to optimize the cost and node count even further.
High-level steps: find the limiting factor; find which setting is mapped to that limiting factor; adjust the mapped setting according to your scenario and comfort.
Find the limiting factor: First understand which component (CPU, memory or storage) is deciding your ESXi node count. This is highlighted in the report. The limiting factor shown in assessments could be CPU, memory or storage, based on the utilization on nodes. It is the resource which limits, or determines, the number of hosts/nodes required to accommodate the workloads. For example, if an assessment finds that after migrating 8 VMware VMs to Azure VMware Solution, 50% of CPU resources, 14% of memory and 18% of storage will be utilized on the 3 AV36 nodes, then CPU is the limiting factor.
Find which option in the settings can be used to optimize: This depends on the limiting factor. For example, if the limiting factor is CPU, meaning you have a high CPU requirement, CPU oversubscription can be used to optimize the ESXi node count. Likewise, if storage is the limiting factor, editing FTT or RAID, or introducing external storage like ANF, will help you reduce the node count. Even reducing the node count by one creates a huge impact in dollar value.
Let's understand how overcommitment (oversubscription) works with a simple example. Suppose I have two VMs with the following specification:
Name  | CPU     | Memory | Storage
VM1   | 9 vCPU  | 200 GB | 500 GB
VM2   | 4 vCPU  | 200 GB | 500 GB
Total | 13 vCPU | 400 GB | 1000 GB
We have an ESXi node with the following capacity: vCPU 10, Memory 500 GB, Storage 1024 GB.
Without optimization, I need two ESXi nodes to accommodate the total requirement of 13 vCPU. But suppose VM1 and VM2 don't consume their entire capacity all the time, and the total usage at any point in time never goes beyond 10 vCPU; then I can accommodate both VMs on the same ESXi node and reduce my node count and cost, because the VMs can share resources. A small sketch of this arithmetic follows below.
(Diagrams in the original post: Without optimization vs. With optimization)
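The node-count arithmetic behind that example can be sketched in a few lines. This is an illustration only, assuming the sizing is CPU-limited; the real assessment also weighs memory, storage, FTT/RAID overhead and the comfort factor.

Code sketch (C#):

using System;

// Simplified CPU-limited sizing from the example above (illustration only).
double requiredVCpus = 13;   // VM1 (9 vCPU) + VM2 (4 vCPU)
double coresPerNode = 10;    // capacity of the example node

foreach (var ratio in new[] { 1.0, 2.0, 4.0 })
{
    double effectiveVCpusPerNode = coresPerNode * ratio;                   // oversubscribed capacity
    int nodes = (int)Math.Ceiling(requiredVCpus / effectiveVCpusPerNode);  // round up to whole nodes
    Console.WriteLine($"{ratio}:1 oversubscription -> {nodes} node(s)");
}
// Prints 2 nodes at 1:1 and 1 node once the ratio reaches 2:1 or higher.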
Parameters affecting sizing and pricing:
CPU oversubscription: Specifies the ratio of the number of virtual cores tied to one physical core in the Azure VMware Solution node. The default value in the calculations is 4 vCPU : 1 physical core. API users can set this value as an integer. Note that vCPU oversubscription greater than 4:1 may impact workloads depending on their CPU usage.
Memory overcommit factor: Specifies the ratio of memory overcommit on the cluster. A value of 1 represents 100% memory use, 0.5 is 50%, and 2 would be using 200% of available memory. You can only add values from 0.5 to 10, up to one decimal place.
Deduplication and compression factor: Specifies the anticipated deduplication and compression factor for your workloads. The actual value can be obtained from on-premises vSAN or storage configurations; these vary by workload. A value of 3 would mean 3x, so for a 300 GB disk only 100 GB of storage would be used. A value of 1 would mean no deduplication or compression. You can only add values from 1 to 10, up to one decimal place.
FTT: How many device failures can be tolerated for a VM.
RAID: Redundant Array of Independent Disks; describes how data is stored for redundancy.
Mirroring: Data is duplicated as-is to another disk. For example, to protect a 100 GB VM object by using RAID-1 (mirroring) with an FTT of 1, you consume 200 GB.
Erasure coding: Erasure coding divides data into chunks and calculates parity information (redundant data) across multiple storage devices. This allows data reconstruction even if some chunks are lost, similar to RAID, but it is typically more space-efficient. For example, to protect a 100 GB VM object by using RAID-5 (erasure coding) with an FTT of 1, you consume 133.33 GB.
Comfort factor: Azure Migrate considers a buffer (comfort factor) during assessment. This buffer is applied on top of server utilization data for VMs (CPU, memory and disk). The comfort factor accounts for issues such as seasonal usage, short performance history, and likely increases in future usage. For example, a 10-core VM with 20% utilization normally results in a 2-core VM; with a comfort factor of 2.0x, the result is a 4-core VM instead.
AVS SKU sizes: (see the SKU table in the original post)
Optimization result: In this example we found that CPU was the limiting factor, so I adjusted the CPU oversubscription value from 4:1 to 8:1. This reduced the node count from 6 (3 AV36P + 3 AV64) to 5 AV36P and reduced the cost by 31%.
Note: Over-provisioning or over-committing can put your VMs at risk. However, in the Azure cloud you can create alerts to warn you of unexpected demand increases and add new ESXi nodes on demand. This is the beauty of the cloud: if your resources are under-provisioned, you can scale up or down at any time. Running your resources in an optimized environment not only saves your budget but also allows you to allocate funds to more innovative ideas.
Triggering deployment settings validation failed
I am trying to set up a test environment with two Lenovo servers. I have been running several test validations on the cluster and trying to resolve the requirements one by one. But now even my validation won't start: I press the "Start validation" button and after a while I receive: Could not complete the operation. 200: OperationTimeout, No updates received from device for operation: [providers/microsoft.azurestackhci/locations/WESTEUROPE/operationStatuses/xxx?api-version=2024-04-01] beyond timeout of [600000] ms. Looking in the Activity log I see: Update DeploymentSettings resources Create failed, and Update DeploymentSettings resources Create started. I have no clue what is wrong. Am I out of resources? What should I do? Anyone?
How to backup your Azure VMs without Azure Guest Agent?
Hi all, I have some old servers (with 32-bit operating systems) that I have migrated to Azure. Now I can't create backups of them because the Azure guest agent is not running on them; the agent is not compatible with old 32-bit systems. I am also not able to back them up using a Recovery Services vault (that failed as well). Kindly provide a robust and practical solution to create backups of my old VMs on Azure.
Issues with Azure Stack HCI 23H2 and NVMe Drives 5520 Series
We recently purchased a Lenovo MX630 V3 Integrated System for an Azure Stack HCI 23H2 deployment with 16 x 5520 series NVMe drives (the Lenovo NVMe part number is 4XB7A13943/SSDPF2KX076T1O and the server model is 7D6U). We are using version 10.2408.0.29 25398.1085 of Azure Stack HCI. In the past few days, we have encountered some weird issues. The storage cluster degraded because the physical disks keep going to "Lost Communication," causing the storage volume to go into detached mode. We are using firmware version Lenovo NVMe 9CV10450, which is approved by Lenovo. I see that the newer firmware version is 9CV10490, and I was wondering if anyone has had a similar experience with these drives and whether the August update fixed it. The reason I ask is that using the Solidigm driver/firmware is not supported by Lenovo. So far, we have replaced one disk, but when I observed the last problem, it seemed to fix itself.
Events
Recent Blogs
- 4 MIN READ: We are excited to share important new enhancements to Oracle Database@Azure that unlock data governance, analytics, security and AI capabilities for mission-critical enterprise data workloads: Ne... Nov 26, 2024
- Microsoft Ignite 2024 brought with it groundbreaking announcements, and Red Hat stood at the forefront, unveiling a series of innovations designed to empower businesses across industries. These annou... Nov 25, 2024