#azure
Model Mondays S2:E2 - Understanding Model Context Protocol (MCP)
This week in Model Mondays, we focus on the Model Context Protocol (MCP) and learn how to securely connect AI models to real-world tools and services using MCP, Azure AI Foundry, and industry-standard authorization. Read on for my recap.

About Model Mondays

Model Mondays is a weekly series designed to help you build your Azure AI Foundry Model IQ step by step. Here's how it works:

- 5-Minute Highlights: quick news and updates about Azure AI models and tools on Monday
- 15-Minute Spotlight: a deep dive into a key model, protocol, or feature on Monday
- 30-Minute AMA on Friday: a live Q&A with subject matter experts from Monday's livestream

If you want to grow your skills with the latest in AI model development, Model Mondays is the place to start. Want to follow along?

- Register Here to watch upcoming Model Mondays livestreams
- Watch Playlists to replay past Model Mondays episodes
- Register Here to join the AMA on MCP on Friday, Jun 27
- Visit The Forum to view Foundry Friday AMAs and recaps

Spotlight On: Model Context Protocol (MCP)

This week, the Model Mondays spotlight was on the Model Context Protocol (MCP) with subject matter expert Den Delimarsky. Don't forget to check out the slides from the presentation for resource links! In this blog post, I'll cover my five key takeaways from this episode:

- What Is MCP and Why Does It Matter?
- What Is MCP Authorization and Why Is It Important?
- How Can I Get Started with MCP?
- Spotlight: My Aha Moment
- Highlights: What's New in Azure AI

1. What Is MCP and Why Does It Matter?

MCP is a protocol that standardizes how AI applications connect their underlying models to the knowledge sources (data) and interaction APIs (functions) they need for effective task execution. Because these models are pre-trained, they lack access to real-time or proprietary data sources (for knowledge) and to real-world environments (for interaction). MCP lets them discover and use relevant knowledge and action tools, adding the right context to the model for the task at hand.

- Explore: The MCP Specification
- Learn: MCP For Beginners

Want to learn more about MCP? Check out the AI Engineer World's Fair 2025 "MCP and Keynotes" track. It kicks off with a keynote from Asha Sharma that gives you a broader vision for Azure AI Foundry. Then look for the talk from Harald Kirschner on MCP and VS Code.
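To make the "discover and use tools" idea concrete, here is a minimal sketch of an MCP server that exposes a single action tool. It assumes the official MCP Python SDK (the `mcp` package) and its FastMCP helper; the server name and the `get_order_status` tool are made-up examples for illustration, not something shown in the episode.

```python
# A minimal MCP server sketch (assumes the official MCP Python SDK: `pip install mcp`).
# It exposes one "action tool" that an AI application can discover and call at runtime.
from mcp.server.fastmcp import FastMCP

# The server name is a made-up example.
mcp = FastMCP("order-tracker")

@mcp.tool()
def get_order_status(order_id: str) -> str:
    """Look up the shipping status for an order (hypothetical data source)."""
    # In a real server this would query a proprietary system the model cannot see on its own.
    fake_orders = {"A-1001": "shipped", "A-1002": "processing"}
    return fake_orders.get(order_id, "unknown order")

if __name__ == "__main__":
    # Runs the server over stdio so an MCP host (an agent or IDE, for example) can connect to it.
    mcp.run()
```

An MCP host can then list this server's tools and invoke `get_order_status` whenever the model decides it needs that context.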
2. What Is MCP Authorization and Why Does It Matter?

MCP (Model Context Protocol) authorization is the part of the protocol that helps developers manage who can access their apps (MCP servers), especially when they are hosted in the cloud. The goal is to simplify securing these apps by building on common standards like OAuth and existing identity providers (such as Google or GitHub), so developers don't have to be security experts.

Key takeaways:

- The new MCP authorization proposal uses familiar identity providers to simplify the authorization process.
- It lets developers secure their apps without requiring deep security expertise.
- The update brings better security controls and prepares the protocol for future authentication methods.

Related reading:

- Aaron Parecki, Let's Fix OAuth in MCP
- Den Delimarsky, Improving The MCP Authorization Spec - One RFC At A Time
- MCP Specification, Authorization protocol draft

On Monday, Den joined us live to talk about his work on the authorization protocol. Watch the session to get a sense of what the MCP authorization protocol does, how it works, and why it matters. Have questions? Submit them to the forum or join the Foundry Friday AMA on Jun 27 at 1:30pm ET.

3. How Can I Get Started?

If you want to start working with MCP, here's an easy path:

- Learn the fundamentals: explore MCP For Beginners
- Use an MCP server: explore VS Code Agent Mode support
- Use MCP with AI agents: explore the Azure MCP Server

Once a server is running, an MCP host connects to it, discovers its tools, and calls them on the model's behalf; see the client sketch below.
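As an illustration of the "use an MCP server" step, here is a minimal sketch of a host-side MCP client that launches a local server over stdio, lists its tools, and calls one. It assumes the same official MCP Python SDK; `server.py` and the `get_order_status` tool refer to the hypothetical server sketched earlier, not to anything from the episode.

```python
# A minimal MCP client sketch (assumes the official MCP Python SDK: `pip install mcp`).
# It connects to the hypothetical server from the earlier sketch, discovers its tools, and calls one.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# "server.py" is the hypothetical server file from the earlier sketch.
server_params = StdioServerParameters(command="python", args=["server.py"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server exposes.
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

            # Invoke one of them with example arguments.
            result = await session.call_tool("get_order_status", {"order_id": "A-1001"})
            print("Tool result:", result.content)

if __name__ == "__main__":
    asyncio.run(main())
```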
4. What's New in Azure AI Foundry?

- Managed compute for Cohere models: faster, more secure AI deployments with low latency.
- Prompt Shields: a new Azure safety capability that protects against prompt injection and unsafe content.
- OpenAI o3-pro model: OpenAI's latest o-series reasoning model, now available in Azure AI Foundry.
- Codex mini model: a smaller, quicker model well suited to developer command-line tasks.
- MCP security upgrades: it is now easier to secure AI apps using familiar OAuth identity providers.

5. My Aha Moment

Before this session, I thought connecting apps to AI was complicated and risky. I believed developers had to build their own security systems from scratch, which sounded tough. This week I learned that MCP makes it simple: we can use trusted logins like Google or GitHub and securely connect AI models to real-world apps without extra hassle.

How I Learned This

To be honest, I also used Copilot to help me understand and summarize this topic in simple words. I wanted to make sure I really understood it well enough to explain it to my friends and peers. I believe in learning with the tools we have, and AI is one of them. By combining Copilot with what I learned from the Model Mondays session, I was able to write this blog in a way that is easy to understand.

Takeaway for beginners: it's okay to use AI to learn; what matters is that you grow, verify, and share the knowledge in your own way.

Coming Up Next Week

Next week, we dive into SLMs and reasoning (Phi-4) with Mojan Javaheripi, PhD, Senior Researcher at Microsoft Research. This session will explore how Small Language Models (SLMs) can perform advanced reasoning tasks, and what makes reasoning models like Phi-4 efficient, scalable, and useful in practical AI applications. Register Here!

Join The Community

Great devs don't build alone! In a fast-paced developer ecosystem, there's no time to hunt for help. That's why we have the Azure AI Developer Community. Join us today and let's journey together!

- Join the Discord for real-time chats, events, and learning
- Explore the Forum for AMA recaps, Q&A, and help

About Me

I'm Sharda, a Gold Microsoft Learn Student Ambassador interested in cloud and AI. Find me on GitHub, Dev.to, Tech Community, and LinkedIn. In this blog series, I summarize my takeaways from each week's Model Mondays livestream.

Elastic Pools SKU recommendations in DMS Automation - Azure PowerShell and CLI

We are excited to announce the addition of Azure SQL Database Elastic Pools (Elastic Pools) SKU recommendations to the Azure Database Migration Service (DMS) automation tools.

What has changed

As part of our ongoing effort to simplify and optimize the migration of on-premises SQL Server databases to Azure, we have added the ability to recommend Azure SQL Elastic Pools for Azure SQL Database targets. This new feature is available through both Azure PowerShell and Azure CLI, making it easier than ever to leverage the cost benefits and scalability of Elastic Pools.

Why This Matters

Elastic Pools offer a cost-effective solution for managing and scaling multiple databases with varying and unpredictable usage demands. By sharing a fixed set of resources across multiple databases, Elastic Pools help you optimize resource utilization and reduce costs compared to provisioning standalone databases. This is particularly beneficial when database usage patterns are unpredictable, because you can absorb spikes in demand without overprovisioning resources.

Key Benefits of Elastic Pools

1. Cost savings: Elastic Pools let multiple databases share resources, eliminating the need to overprovision individual databases and leading to more efficient resource utilization and lower costs.
2. Scalability: because pool resources are shared, each database gets the performance it needs when it needs it, with all the benefits of scaling on demand.
3. Simplified management: managing resources for a pool rather than for individual databases simplifies your management tasks and gives you a predictable budget for the pool.

No Changes Needed to the Command Line

One of the best parts of this new feature is that no changes are needed to the `Get-AzDataMigrationSkuRecommendation` and `get-sku-recommendation` commands. You can continue using them as you always have, and the tool will now include recommendations for Elastic Pools where appropriate.

How to Use the Command

To get started with the new Elastic Pool recommendations, simply run the `Get-AzDataMigrationSkuRecommendation` command in Azure PowerShell or the `get-sku-recommendation` command in Azure CLI. Here's a quick guide on how to run the command and review the output.

Azure PowerShell:

```powershell
# Run SKU Recommendation
Get-AzDataMigrationSkuRecommendation -OutputFolder "C:\Output" -DisplayResult -Overwrite
```

Azure CLI:

```bash
# Run SKU Recommendation
az datamigration get-sku-recommendation --output-folder "C:\Output" --display-result --overwrite
```

Reviewing the Output JSON File

The SKU recommendation output is saved as a JSON file in the specified output folder. The file includes detailed recommendations for Azure SQL Database SKUs, including one or more Elastic Pools as needed to accommodate the databases in the SQL Server instance. Here's what to look for in the JSON file (a small parsing sketch follows the list):

- ResourceType: indicates whether the recommendation is for a SingleDatabase or an ElasticPool.
- PoolCount: the total number of Elastic Pools of the given configuration that are recommended.
- ElasticPoolMemberDatabases: lists the databases that are recommended to be placed in the pool.
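To illustrate what reviewing that file can look like, here is a small, language-agnostic-in-spirit sketch (written in Python) that loads the output and prints the Elastic Pool entries. The file name and overall JSON layout are assumptions for illustration; only the three field names above come from the article, so check the actual file produced in your output folder.

```python
# Illustrative sketch: summarize Elastic Pool entries in the SKU recommendation output.
# The file name and overall JSON layout are assumptions; only the field names
# ResourceType, PoolCount, and ElasticPoolMemberDatabases come from the article.
import json
from pathlib import Path

def find_recommendations(node):
    """Recursively yield any objects that carry a ResourceType field."""
    if isinstance(node, dict):
        if "ResourceType" in node:
            yield node
        for value in node.values():
            yield from find_recommendations(value)
    elif isinstance(node, list):
        for item in node:
            yield from find_recommendations(item)

# Hypothetical output file name under the folder passed to the command above.
report = json.loads(Path(r"C:\Output\SkuRecommendationReport.json").read_text())

for rec in find_recommendations(report):
    if rec.get("ResourceType") == "ElasticPool":
        print("Pools of this configuration:", rec.get("PoolCount"))
        print("Member databases:", rec.get("ElasticPoolMemberDatabases"))
```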
Conclusion

We believe that adding Elastic Pool recommendations to our DMS automation tools will deliver significant cost savings, improve scalability, and simplify management, making it easier than ever to migrate to Azure SQL Database.

Next Steps

We encourage you to try out this new feature and let us know your feedback. For more information:

- Review the Azure SQL Database Elastic Pools documentation.
- Review the DMS automation module Az.DataMigration in Azure PowerShell.
- Review the DMS automation az datamigration commands in Azure CLI.
- Review the samples documented in the Data Migration Sample Repository.

Thank you for your continued support and happy migrating!
TLS 1.0 and 1.1 Support on Azure Web App

I know Azure is ending support for TLS 1.0 and 1.1 by August 2025. Can anyone help me get our existing IoT devices, which use TLS 1.0 and 1.1, to connect to our Azure Web App? Our devices were connecting to the Azure server fine until mid-March 2025; at the end of March 2025 we lost access to them. Any thoughts on why this stopped before the August 2025 deadline, and what can be done to get these devices back online?

#IoTHub #WebApp #Azure #TLS #TLS1.0 #TLS1.1 #SNI