How does Azure Key Vault help in securing application secrets and managing encryption keys?
Azure Key Vault is a cloud service that helps secure cryptographic keys and secrets (such as API keys, passwords, and certificates) and manage encryption keys for cloud applications. It provides a centralized, secure storage mechanism that enables fine-grained access control and compliance with security best practices. Key features of Azure Key Vault include:

- Secure secret storage: protects sensitive application data by storing it in a secure vault, reducing exposure risks.
- Access control with Azure AD: uses Azure Active Directory (Azure AD) authentication to control access, ensuring only authorized applications and users can retrieve secrets.
- Automated key rotation: supports automatic rotation of keys and certificates, minimizing security risks from stale credentials.
- Encryption key management: allows applications to use Hardware Security Modules (HSMs) or software-based keys for encrypting and decrypting data.
- Logging and monitoring: integrates with Azure Monitor and Microsoft Defender for Cloud to track usage and detect anomalies.

By using Azure Key Vault, organizations can enhance security, avoid embedding secrets in code, and simplify the management of cryptographic assets across their cloud environments.
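As a minimal sketch of the workflow described above (resource names, region, and the secret value are placeholder examples; vault names must be globally unique), creating a vault and storing/retrieving a secret with the Azure CLI looks like:

```shell
# Create a resource group and a Key Vault (names and region are examples)
az group create --name demo-rg --location eastus
az keyvault create --name demo-kv-12345 --resource-group demo-rg --location eastus

# Store a secret, then retrieve only its value
az keyvault secret set --vault-name demo-kv-12345 --name DbPassword --value "S3cr3t!"
az keyvault secret show --vault-name demo-kv-12345 --name DbPassword --query value -o tsv
```

In practice an application would typically fetch the secret at runtime using a managed identity rather than embedding it in code or configuration.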
AI Learning Hub

Explore the Microsoft AI Learning Hub, your trusted source to help you learn AI skills and get ready to power AI transformation with the Microsoft Cloud. Begin your AI learning journey with our curated plans and resources, designed to support business and technical roles, individuals, and organizations in building AI skills. Discover top topics: explore a curated selection of the latest noteworthy content to inspire your AI learning journey.

https://learn.microsoft.com/en-us/training/modules/prepare-azure-ai-development/?wt.mc_id=aihub_prepareAIsolutions_webpage_wwl
https://aka.ms/rag-time?wt.mc_id=aihub_RAGTime_webpage_devrelstudios
https://learn.microsoft.com/en-us/azure/architecture/ai-ml/architecture/baseline-openai-e2e-chat?wt.mc_id=aihub_baselineopenai_webpage_cnl
https://techcommunity.microsoft.com/blog/azure-ai-services-blog/introducing-azure-ai-agent-service/4298357?wt.mc_id=aihub_aiagent_webpage_cnl

For more details: https://learn.microsoft.com/en-us/ai/?tabs=developer

Lost in Employment Verification
Long story short: our app needs to connect to the Bing Ads API via OAuth. The OAuth UI tells us to get verified as a publisher. We follow the steps to do that, and I find myself in a strange situation where, at the Employment Verification step, there is no way to proceed and no way to get any answers. It's not even clear to me whether we really need this verification, since the app seems to work fine without it.

We've provided a domain invoice (Cloudflare) including:
- Domain name
- Exact company name
- My name
- Exact company address
- Dated within 12 months

I have used "Fix Now" (exhausted my goes at that), and I've been back and forth with a real person (who seems to only send half-filled-out email templates) for the past few weeks too. The last time, I sent the ultimate helpful email with all the info laid out in the most obvious and clear way possible, with 3 email attachments containing all the requested info and more. I wait a week and I get back:

    Issues with employment verification rejection a... - TrackingID#2501140040006597
    Anjali S 04:49 (4 hours ago) to me
    Dear ,
    This application is insufficient for the requirements of the program.
    Thank you,
    Vetting Operations Support

I never received any further information or assistance in any of the prior emails; I'm 100% in the dark. Can anyone help me with these questions?

1. Do I need to be a verified publisher for my use case? I just don't want "unverified" to show on the OAuth consent screen.
2. What exactly is employment verification, and what is the magic formula to get past this step?
3. Why can a $3.3T market cap company not manage to provide basic customer service and communication?

Thanks for reading; I really hope I can get some help. John.

How can I avoid over-consumption in Azure cloud, especially AKS?
Over-consumption in Azure Kubernetes Service (AKS) can lead to unnecessary costs and resource wastage if resources are not managed properly. Here are some key ways to avoid it:

1. Right-size your cluster:
   a. Optimise node pools: use appropriate VM sizes for worker nodes based on workload requirements; do not provision more than what is required.
   b. Use the autoscaling feature to scale nodes up/down.
   c. Set requests and limits: define CPU and memory requests/limits for pods to prevent over-allocation.
2. Optimise workload scaling.
3. Enable monitoring to identify cost-optimisation opportunities.
4. Use managed AKS instead of shared AKS.
5. Use Azure Files or managed disks efficiently to optimise storage.
6. Clean up unused resources: delete idle workloads and unused namespaces.
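Point 1c above can be sketched as a pod spec with explicit requests and limits (the name, image, and values are illustrative examples, not recommendations):

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: web-app            # example name
spec:
  containers:
    - name: web
      image: nginx:1.27    # example image
      resources:
        requests:          # what the scheduler reserves for the pod
          cpu: "250m"
          memory: "256Mi"
        limits:            # hard cap, prevents one pod hogging the node
          cpu: "500m"
          memory: "512Mi"
```

Setting requests close to actual usage (observed via monitoring) keeps the cluster autoscaler from provisioning nodes for capacity that is reserved but never used.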
VM Login issues

Issue: The virtual machine has been created, but login is not possible. What could be the reason?

Root cause: Multiple stale IP addresses were attached to the virtual machine, possibly due to repeated recreation using the same name.

Solution: Recreate the virtual machine with a unique name in another subnet. The issue is resolved.
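Before recreating the VM, it can help to confirm the stale-IP diagnosis by listing the NICs and IP configurations attached to it; a sketch with the Azure CLI (resource group, VM name, and the NIC id are placeholders):

```shell
# List the NICs attached to the VM
az vm show --resource-group my-rg --name my-vm \
  --query "networkProfile.networkInterfaces[].id" -o tsv

# Inspect a NIC's IP configurations to spot stale or unexpected addresses
az network nic show --ids <nic-id> \
  --query "ipConfigurations[].{name:name, privateIp:privateIPAddress}" -o table
```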
Blueprint opportunity for Designing and Implementing a Microsoft Azure AI Solution

Greetings! Microsoft is updating a certification for Designing and Implementing a Microsoft Azure AI Solution, and we need your input through our exam blueprinting survey. The blueprint determines how many questions each skill in the exam will be assigned. Please complete the online survey by February 24th, 2025. Please also feel free to forward the survey to any colleagues you consider subject matter experts for this certification. If you have any questions, feel free to contact John Sowles at josowles@microsoft.com or Rohan Mahadevan at rmahadevan@microsoft.com. Designing and Implementing a Microsoft Azure AI Solution blueprint survey link: https://microsoftlearning.co1.qualtrics.com/jfe/form/SV_9tUvZ4THa0dhDU2 Thank you!

What are the best practices for data governance in Azure across hybrid and multi-cloud environments?
Best practices for data governance in Azure start with a unified strategy that leverages Microsoft Purview for data cataloging, classification, and lineage tracking. Purview enables organizations to gain end-to-end visibility across their data estate, even in hybrid and multi-cloud environments.

Azure Policy plays a crucial role in enforcing governance by defining compliance rules that automatically apply across subscriptions and services. Combining it with Azure Blueprints can help ensure that governance frameworks are consistently deployed at scale.

For enhanced security, integrating Microsoft Defender for Cloud allows continuous monitoring and risk assessment of data assets. Additionally, organizations should implement role-based access control (RBAC) and encryption mechanisms to safeguard sensitive information.

One challenge in multi-cloud governance is achieving real-time data classification and policy enforcement. Has anyone successfully extended Purview's capabilities to non-Azure environments, such as AWS or GCP?
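To make the Azure Policy point concrete, a compliance rule is expressed as a JSON policy definition; the simplified sketch below denies storage accounts that allow public blob access (the display name is an example, and a real definition would usually be assigned at a management-group scope):

```json
{
  "properties": {
    "displayName": "Deny storage accounts with public blob access",
    "policyRule": {
      "if": {
        "allOf": [
          { "field": "type", "equals": "Microsoft.Storage/storageAccounts" },
          { "field": "Microsoft.Storage/storageAccounts/allowBlobPublicAccess", "equals": "true" }
        ]
      },
      "then": { "effect": "deny" }
    }
  }
}
```

Once assigned, the rule is evaluated automatically on every create or update request in scope, which is what makes policy-as-code enforcement scale across many subscriptions.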
Forwarding email if the aimed address doesn't exist?

Hi, I have set up my own domain in Entra. Is it possible to forward emails which are sent to a non-existing address of my domain to an existing one? For example, an email sent to ajnsfvfxdcb@mydomain (doesn't exist) should be forwarded to email_grabber@mydomain (does exist). Is it possible to handle this? Maybe there is another way to work around it? Greetz, Tomek

Importing Terraform State in Azure
Some engineers start provisioning services manually before finding out this might not be a good thing in the long run. So, they must use Terraform import. If you are using HashiCorp's Terraform to manage your infrastructure, you can bring existing resources that were provisioned outside of Terraform under its management. This tutorial helps you import Azure resources into a Terraform state file. You can keep the state locally, or initialise the tfstate in a remote location (in a storage account).

We are going to import a resource group, a virtual network and a subnet that were created manually; see the portal screenshots for the manual resources.

First step: create a tf configuration file using the manually created resource information (see screenshot). For your information, we use a tfstate stored remotely in a storage account.

Second step: import the resource details into the Terraform state. After creating the configuration tf file, we can import these resources by using the "terraform import" command:

terraform import terraform_id azure_resource_id

1- The resource group:
terraform import "azurerm_resource_group.rg_name_auto" "/subscriptions/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx/resourceGroups/d-210-rg-ado-si-p-to-6"

(You can find the terraform_id in your configuration file and the azure_resource_id in the portal.)

2- The VNet:
terraform import "azurerm_virtual_network.vnet_auto" "/subscriptions/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxid/resourceGroups/d-210-rg-ado-si-p-to-6/providers/Microsoft.Network/virtualNetworks/d-210-vnet-ado-si-p-to-1"

3- The subnet:
terraform import azurerm_subnet.sub_auto /subscriptions/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxid/resourceGroups/d-210-rg-ado-si-p-to-6/providers/Microsoft.Network/virtualNetworks/d-210-vnet-ado-si-p-to-1/subnets/d-210-snet-ado-si-p-to-2

Before using these commands, please:

1- Access the code folder:
cd folder_code

2- Connect to the subscription (where you deployed the manual resources):
az login
Select-AzSubscription -SubscriptionId "copy-paste the id of the subscription"

3- Initialise Terraform:
terraform init -backend-config storage_account_name=xxxxxxxx -backend-config container_name=tfstate -backend-config resource_group_name=xxxxxxxx -backend-config key=xxxxxxx.tfstate

Now we can launch the import commands for the RG, VNet and subnet, and then see the result:

terraform state list

You can see the content of each imported resource via the following commands:

terraform state show azurerm_resource_group.rg_name_auto
terraform state show azurerm_virtual_network.vnet_auto
terraform state show azurerm_subnet.sub_auto

Third step: test by running terraform plan. Finally, we can verify the imported resources in the tfstate file; for example, we can see the imported subnet.

The purpose of this tutorial is to show the steps needed to import manually configured resources into the tfstate file.
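As a hedged sketch of what the configuration file from the first step might contain: the resource types and names below match the import commands above, but the argument values (location, address spaces) are illustrative assumptions and must be set to match the real resources in Azure, otherwise terraform plan will show drift after the import.

```hcl
resource "azurerm_resource_group" "rg_name_auto" {
  name     = "d-210-rg-ado-si-p-to-6"
  location = "westeurope" # example; use the actual region
}

resource "azurerm_virtual_network" "vnet_auto" {
  name                = "d-210-vnet-ado-si-p-to-1"
  resource_group_name = azurerm_resource_group.rg_name_auto.name
  location            = azurerm_resource_group.rg_name_auto.location
  address_space       = ["10.0.0.0/16"] # example; use the actual CIDR
}

resource "azurerm_subnet" "sub_auto" {
  name                 = "d-210-snet-ado-si-p-to-2"
  resource_group_name  = azurerm_resource_group.rg_name_auto.name
  virtual_network_name = azurerm_virtual_network.vnet_auto.name
  address_prefixes     = ["10.0.1.0/24"] # example; use the actual CIDR
}
```

After a successful import, terraform plan should report "No changes" if the configuration matches the live resources.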