Apps on Azure Blog

Build AI faster and run with confidence

Mike_Hulme
Microsoft
Nov 19, 2024

Intelligence is the new baseline for modern apps. We’re seeing the energy and excitement from AI coming to life in apps being built and modernized today.

AI is now the critical component of nearly every application and business strategy. And most importantly, we’ve reached an inflection point where organizations are moving from widespread experimentation to full production.

In fact, we just completed a global survey of more than 1,500 developers and IT influencers: 60% of respondents have AI apps under development that will go to production over the next year. What’s even more exciting is the evolution of what organizations are looking to achieve.

Agents and intelligent assistants continue to shape customer service, make customer experiences more personal and help employees make faster, more accurate decisions. And agents that unlock organizational knowledge are increasingly vital. But more than ever, AI is reinventing core business processes. From long-established businesses to startups (where our business has increased 200%), this holds incredible promise to drive revenue and growth as well as competitive differentiation.

AI is both an inspiration for how the next generation of apps will be defined and an accelerator for how we will build and deliver on that promise. This is the tension we see with every business: intense pressure to move fast and take advantage of incredible innovation, while delivering every app securely, with the performance and scale that AI apps uniquely require. Developers are on the front lines, responsible for moving quickly from idea to code to cloud while delivering secure, private and trusted AI services that meet ROI and cost requirements.

This combination of building fast and running with confidence is core to Microsoft’s strategy. So, as we kick off Microsoft Ignite 2024 today, we’re bringing a slate of new and enhanced innovation that makes this transition to production faster and easier for all.

With new innovations across the Copilot and AI stack, new integrations that bring together services across our AI platform and developer tools, and an expanding set of partnerships across the AI toolchain, there’s a ton of great innovation at Ignite. Let’s look at just a few highlights…

GitHub Copilot

GitHub Copilot is the most widely used AI pair-programming tool in the world, with millions of developers using it daily. We’re seeing developers code up to 55% faster through real-time code suggestions and solutions, while elevating quality by generating cleaner, more resilient code that is easier to maintain. This week we’re showing how Copilot is evolving to drive efficiency across the entire application lifecycle. Developers in VS Code now have the flexibility to select from an array of industry-leading models, including OpenAI’s GPT-4o and Anthropic’s Claude 3.5 Sonnet. And they have the power to use natural language chat to implement complex code changes across multiple files.

GitHub Copilot upgrade assistant for Java 

Now the power of Copilot can help with existing apps as well. Keeping Java apps up to date can be a time-consuming task. GitHub Copilot upgrade assistant for Java uses AI to simplify Java application upgrades with autonomous agents. The process is transparent, keeping a human in the loop, while your environment actively learns from your context and adjustments, improving accuracy for future upgrades.

GitHub Copilot for Azure

GitHub Copilot for Azure streamlines the path from code to production on Azure for every developer, even those new to Azure. Through Copilot, you can use natural language to learn about your Azure services and resources, find sample applications and templates, and quickly deploy to Azure while supporting your enterprise standards and guidelines. Once in production, GitHub Copilot for Azure helps you troubleshoot and resolve application issues and stay on top of costs. Copilot knows your full context as a developer and your systems, so every recommendation is tailored to your unique needs. Available now in public preview, it does it all from the tools you already use, helping you minimize interruptions and stay focused.

Azure AI Foundry

New at Ignite, Azure AI Foundry brings together an end-to-end AI platform across models, tooling, safety, and monitoring to help you efficiently and cost-effectively design and scale your AI applications. By integrating with popular developer tools like GitHub, Visual Studio, and Copilot Studio, Azure AI Foundry opens up this full portfolio of services for developers, giving them access to the best, most advanced models in the world along with tools for building agents on Azure and a unified toolchain to access AI services through one interface. Azure AI Foundry is a key offering enabling easy integration of Azure AI capabilities into your applications.
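To make that concrete, here is a minimal sketch of how an application might call a chat model deployment created through Azure AI Foundry. It assumes an Azure OpenAI deployment reachable with the standard openai Python package; the endpoint, API version, and deployment name shown here are placeholders for your own values, not a prescribed setup.

```python
# Minimal sketch: calling a chat model deployed through Azure AI Foundry.
# Assumes an Azure OpenAI deployment and the `openai` Python package;
# the endpoint, API version, and deployment name are placeholders.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",  # assumed API version; use the one your deployment targets
)

response = client.chat.completions.create(
    model="gpt-4o",  # the *deployment* name created in Azure AI Foundry (illustrative)
    messages=[
        {"role": "system", "content": "You are a support assistant for our retail app."},
        {"role": "user", "content": "Summarize the customer's last three orders."},
    ],
)
print(response.choices[0].message.content)
```

In production you would typically swap the API key for Microsoft Entra ID token-based authentication, but the shape of the request stays the same.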

AI App Template Gallery

AI App Template Gallery is a new resource designed to help you build and deploy AI applications in a matter of minutes, with the flexibility to use the programming language, framework and architecture of your choice. The gallery offers more than 25 curated, ready-to-use application templates, creating a clear path to kickstart your AI projects with confidence and efficiency. And developers can easily discover and access each of them through GitHub Copilot for Azure, further simplifying access through your preferred developer tool.

Azure Native Integrations

Azure Native Integrations gives developers access to a curated set of ISV services available directly in the Azure portal, SDK, and CLI. This means that developers have the flexibility to work with their preferred vendors across the AI toolchain and other common solution areas, with simplified single sign-on and management, while staying in Azure. Joining our portfolio of integrated services are Pinecone, Weights & Biases, Arize, and LambdaTest, all now available in private preview. Neon, Pure Storage Cloud for Azure VMware Solution (AVS), and Dell APEX File Storage will also be available soon as part of Azure Native Integrations.

Azure Container Apps with Serverless GPUs

Azure Container Apps now supports Serverless GPUs in public preview, enabling effortless scaling and flexibility for real-time custom model inferencing and other machine learning tasks. Serverless GPUs enable you to seamlessly run your AI workloads on-demand, accessing powerful NVIDIA accelerated computing resources, with automatic scaling, optimized cold start and per-second billing without the need for dedicated infrastructure management. 
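As an illustration, here is a minimal sketch of the kind of inference service you might package into a container image and run on serverless GPUs. It assumes FastAPI, uvicorn, PyTorch, and Hugging Face Transformers are installed in the image; the model and route names are illustrative, not a prescribed pattern.

```python
# Minimal sketch of a containerized inference endpoint that could run on
# Azure Container Apps serverless GPUs. Assumes the image bundles FastAPI,
# uvicorn, torch, and transformers; the model name is illustrative.
import torch
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()

# Use the GPU when the container is scheduled on a GPU-backed profile,
# and fall back to CPU for local testing.
device = 0 if torch.cuda.is_available() else -1
generator = pipeline("text-generation", model="distilgpt2", device=device)

class Prompt(BaseModel):
    text: str
    max_new_tokens: int = 64

@app.post("/generate")
def generate(prompt: Prompt):
    result = generator(prompt.text, max_new_tokens=prompt.max_new_tokens)
    return {"output": result[0]["generated_text"]}

# Run locally with: uvicorn main:app --host 0.0.0.0 --port 8080
```

With automatic scaling and per-second billing, you pay for GPU time only while the app is actually handling requests.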

Azure Essentials for AI Adoption 

We also recognize that great technology is only part of your success. Microsoft has published design patterns, baseline reference architectures, application landing zones, and a variety of Azure service guides for Azure OpenAI workloads, along with FinOps guidance for AI. This week, we are excited to announce new AI-specific guidance in the Cloud Adoption Framework and the Azure Well-Architected Framework to help you adopt AI at scale while meeting requirements for reliability, security, operations, and cost.

 
