Ignite 2023: What’s new in Azure AI Platforms – Charting the Future with Innovative AI and ML
Published Nov 15 2023
Microsoft

AI is a powerful driver of innovation, and customers around the globe and across industries are using Azure AI to transform how their businesses operate and grow. Our latest updates mark a significant leap forward, offering a suite of tools and integrations that are not just novel but visionary. These enhancements are designed to meet the diverse and expanding needs of developers, data scientists, and machine learning experts. By equipping these professionals with advanced tools and capabilities, we’re enabling them to be at the forefront of innovation, empowering their organizations to navigate and lead in the new era of AI. From the unified development environment of Azure AI Studio to the precision of prompt flow, each platform and capability update is a testament to our commitment to providing cutting-edge, user-friendly, and useful AI solutions.

 

In this blog, I’ll highlight some of the major announcements we made for Azure AI Platforms at Microsoft Ignite 2023.

 

Public Preview of Azure AI Studio

In Satya Nadella’s opening keynote, he announced the public preview of Azure AI Studio, a state-of-the-art platform that simplifies generative AI application and copilot development. We built Azure AI Studio to be the cornerstone of our Azure AI portfolio, by seamlessly integrating our fast-growing model catalog, tools, and services to provide a unified development experience. Azure AI Studio is an exciting step forward towards our vision for a cohesive, user-friendly AI environment, where every ingredient a developer needs, from the newest models to data management systems, is integrated and available. AI developers and machine learning professionals can now start their journey with the intuitive interface, Command Line Interface (CLI), and Software Development Kit (SDK) of Azure AI Studio and smoothly transition to Azure Machine Learning for more granular controls and advanced model monitoring. Read this blog to learn more about the announcement.

 

General Availability of Prompt Flow

Prompt flow, now generally available in Azure Machine Learning and in public preview in Azure AI Studio, is a tool designed for LLMOps at enterprise scale. It supports version control and collaboration through any source code management tool. Thanks to the valuable feedback from our community, we have added many features to prompt flow since it was first launched in May. With prompt flow, you can now manage different versions of your flow assets such as prompts, code, configurations, and environments through a code repository, and easily track changes and revert to previous versions when needed. Integration with CI/CD pipelines automates tasks like connection setup, prompt tuning, flow evaluation, and experimentation, helping to simplify processes. Prompt flow also makes it easier to evaluate LLM flows for quality and safety, so organizations can get their applications into production with higher confidence. Learn more about prompt flow’s latest features in this detailed blog.
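To give a sense of how flow assets live in a code repository, a flow is typically defined by a DAG file checked in alongside its prompt templates and code. The snippet below is an illustrative sketch of a minimal flow definition; the node name, file path, connection name, and deployment name are hypothetical placeholders, not part of any shipped sample.

```yaml
# Illustrative sketch of a minimal flow definition versioned in source control.
# Node name, prompt path, connection, and deployment name are placeholders.
inputs:
  question:
    type: string
outputs:
  answer:
    type: string
    reference: ${chat.output}
nodes:
- name: chat
  type: llm
  source:
    type: code
    path: chat.jinja2        # prompt template, versioned alongside the flow
  inputs:
    deployment_name: my-gpt-deployment
    question: ${inputs.question}
  connection: my_aoai_connection
  api: chat
```

Because the prompt template, the DAG, and any Python tool code are all plain files, ordinary pull requests, diffs, and CI/CD triggers apply to them just as they would to application code.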

 

Public Preview of OneLake as a datastore in Azure Machine Learning and Azure AI Studio

OneLake from Microsoft Fabric is a transformative data lakehouse platform that integrates seamlessly with Azure AI platforms. At Ignite, we announced the public preview of OneLake integration with Azure Machine Learning and Azure AI Studio. OneLake's ability to handle vast and varied datasets in a unified, efficient way optimizes data storage and retrieval for AI, while also accelerating the development and deployment of AI models. This is especially relevant in scenarios demanding high-volume data processing and complex computational tasks, which are commonplace in advanced AI and ML projects. As a key component of Microsoft Fabric, OneLake brings a powerful data lakehouse architecture and advanced data integration features that significantly enhance Azure Machine Learning's data management capabilities, marking a pivotal advancement in handling and processing complex AI and ML data sets. Read this blog to learn more about integration with OneLake.
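As a rough illustration of what registering a OneLake lakehouse as a datastore could look like, the YAML below sketches a datastore definition in the style of the Azure ML CLI v2. Treat it as an assumption-laden sketch: the workspace identifier, lakehouse name, and even the exact field names are placeholders and may differ from the published schema.

```yaml
# Illustrative OneLake datastore definition (all names are placeholders;
# exact field names may differ from the published schema).
name: onelake_example_datastore
type: one_lake
description: Example OneLake lakehouse registered as an Azure ML datastore.
one_lake_workspace_name: <fabric-workspace-name-or-guid>
endpoint: onelake.dfs.fabric.microsoft.com
artifact:
  type: lake_house
  name: <lakehouse-name>
```

A definition like this would typically be registered with `az ml datastore create --file <file>.yml`, after which the lakehouse data can be referenced like any other datastore URI in jobs and pipelines.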

 

Upcoming General Availability of Model Catalog in Azure AI

Azure AI model catalog, soon to be generally available, is experiencing an exciting expansion with the inclusion of new, diverse, state-of-the-art AI models from leading industry providers. This expansion features significant additions like Mistral 7B, Phi from Microsoft Research, Stable Diffusion, Meta's Code Llama, and NVIDIA's latest models. Each model offers unique capabilities, giving developers a broader spectrum of choices for their projects. We're also thrilled to announce the upcoming preview of Model-as-a-Service (MaaS) with Llama 2, Cohere's Command, G42’s Jais model, and Mistral’s premium models, through inference APIs and hosted fine-tuning. MaaS simplifies the process for developers, especially during the dev-test phase, by eliminating the need for dedicated VMs for hosting models. With cost-effective, token-based billing for inference APIs, MaaS presents an attractive, easy-start option for generative AI projects. Explore new models in the model catalog and learn more about MaaS in this blog.
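Token-based billing makes cost estimation simple arithmetic over prompt and completion tokens. The helper below is a hypothetical estimator to illustrate the idea; the per-1,000-token prices used in the example are made-up placeholders, not published rates for any model.

```python
def estimate_maas_cost(prompt_tokens: int, completion_tokens: int,
                       price_in_per_1k: float, price_out_per_1k: float) -> float:
    """Estimate the cost of a single pay-as-you-go inference call.

    Prices are per 1,000 tokens. The rates passed in are placeholders,
    not actual published pricing for any model.
    """
    return (prompt_tokens / 1000) * price_in_per_1k \
         + (completion_tokens / 1000) * price_out_per_1k

# Example: 1,200 prompt tokens and 300 completion tokens at hypothetical rates.
cost = estimate_maas_cost(1200, 300, price_in_per_1k=0.001, price_out_per_1k=0.002)
print(f"${cost:.4f}")  # -> $0.0018
```

Because there is no hosting VM to keep warm, the cost of a dev-test loop is just the sum of such per-call estimates.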

 

Public Preview of Model Benchmarks

The model benchmarks feature, now in public preview in Azure Machine Learning and Azure AI Studio, provides a vital tool for evaluating and comparing the performance of various foundation models. This feature simplifies the model selection process by helping users make informed decisions based on accuracy metrics, so they can optimize the performance of their AI solutions in alignment with specific project needs. The benchmarking process itself is robust and transparent. It utilizes a base model from Azure AI model catalog, a publicly available dataset, and a metric score for evaluation. The datasets used are sourced from reputable repositories, ensuring the integrity and relevance of the benchmarks. Each model is evaluated using prompts designed according to industry best practices, ensuring that the comparisons are fair and meaningful. To learn more about model benchmarks, read this announcement blog.
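At its core, an accuracy-style benchmark scores a model's outputs against a labeled public dataset with a fixed metric. The function below is a deliberately simplified, generic illustration of one such metric (exact-match accuracy), not the actual implementation behind the model benchmarks feature.

```python
def exact_match_accuracy(predictions: list[str], references: list[str]) -> float:
    """Fraction of predictions that exactly match the reference answer
    after trivial normalization (case and surrounding whitespace)."""
    if len(predictions) != len(references):
        raise ValueError("predictions and references must be the same length")
    matches = sum(
        p.strip().lower() == r.strip().lower()
        for p, r in zip(predictions, references)
    )
    return matches / len(references)

# Toy example with three benchmark prompts (hypothetical model outputs).
preds = ["Paris", " rome ", "Berlin"]
refs = ["Paris", "Rome", "Madrid"]
print(exact_match_accuracy(preds, refs))  # 2 of 3 correct
```

Running the same dataset and metric against every candidate model is what makes side-by-side benchmark comparisons fair and reproducible.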

 

General Availability of Managed Feature Store

Managed feature store empowers machine learning professionals to develop and productionize features independently. We’re excited to announce the general availability of managed feature store in Azure Machine Learning. Machine learning professionals can simply provide a feature set specification and let the system handle serving, securing, and monitoring of their features, freeing them from the overhead of setting up and managing the underlying feature engineering pipelines. This integration across the machine learning lifecycle accelerates model experimentation, enhances model reliability, and reduces operational costs, significantly simplifying the MLOps experience. Dive deeper into the new capabilities of managed feature store in this doc.
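To make the "feature set specification" concrete, the sketch below shows the kind of declarative spec a practitioner might provide: source data, transformation code, features, and index columns, with the system handling materialization and serving. The paths, column names, and transformer class are illustrative placeholders, and the exact field names may differ from the published spec schema.

```yaml
# Illustrative feature set specification (placeholder paths and names;
# field names may differ from the published schema).
source:
  type: parquet
  path: abfss://container@account.dfs.core.windows.net/transactions/*.parquet
  timestamp_column:
    name: timestamp
feature_transformation:
  transformation_code:
    path: ./transformation_code
    transformer_class: transaction_transform.TransactionFeatureTransformer
features:
- name: transaction_amount_7d_sum
  type: double
- name: transaction_count_7d
  type: long
index_columns:
- name: account_id
  type: string
source_lookback: 7 days
temporal_join_lookback: 1 day
```

Once a spec like this is registered, training jobs and online endpoints retrieve the same feature definitions by name, which is what keeps offline and online features consistent without hand-built pipelines.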

 

General Availability of Pipeline Component Deployments for Batch Endpoints  

The introduction of pipeline component deployments for batch endpoints marks an evolution in our MLOps capabilities. This feature facilitates the movement and control of machine learning pipelines as single units across different environments. It represents a significant advancement in managing ML pipelines, enabling more effective and streamlined operations in organizations. Read this blog to learn more about pipeline component deployments.
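As a sketch of what deploying a pipeline as a single unit might look like, the YAML below outlines a batch deployment that points an endpoint at a registered pipeline component. The endpoint, deployment, component, and compute names are placeholders, and the exact schema should be checked against the current Azure ML CLI v2 reference.

```yaml
# Illustrative pipeline component deployment for a batch endpoint
# (placeholder names; verify fields against the current CLI v2 schema).
name: example-pipeline-deployment
endpoint_name: example-batch-endpoint
type: pipeline
component: azureml:example_scoring_pipeline@latest
settings:
  default_compute: cpu-cluster
```

Because the deployment references a versioned component from the registry, the same pipeline unit can be promoted across dev, test, and production environments without editing its internals.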

 

General Availability of Serverless Compute

Azure Machine Learning serverless compute, now generally available, allows machine learning experts to concentrate on crafting ML models without delving into the complexities of compute infrastructure. It simplifies job submission, dynamically allocates resources, and supports all Azure Machine Learning job types, including generative AI tasks. Serverless compute not only minimizes the administrative burden with its managed network isolation and rigorous security protocols but also offers cost optimization and reduced wait times, elevating the efficiency and focus of machine learning professionals in their projects. Read this announcement to learn more about this feature.
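In practice, relying on serverless compute can be as simple as omitting the compute target from a job specification and letting Azure Machine Learning provision resources for the run. The command-job sketch below is illustrative: the script, environment name, and instance type are placeholders, and defaults depend on the workspace configuration.

```yaml
# Illustrative command job relying on serverless compute: no `compute:`
# field is set, so Azure ML provisions resources for the run.
# Script, environment, and instance type below are placeholders.
command: python train.py --epochs 10
code: ./src
environment: azureml:my-training-env@latest
resources:
  instance_type: Standard_DS3_v2   # optional; a default is used if omitted
  instance_count: 1
```

A spec like this would typically be submitted with `az ml job create --file <file>.yml`; the key point is that no compute cluster has to be created or managed beforehand.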

 

Looking Ahead

Azure AI is not just keeping pace with AI advancements; it's setting the pace. Each new feature and integration is a step towards a more connected, efficient, and powerful AI ecosystem to help our customers achieve more. The AI Platform and Tooling team believes that the future of AI is collaborative, combining the best of technology and creativity with a focus on disciplined AI governance. As we move forward and continue to innovate, breaking new ground and opening doors to uncharted territory, we remain committed to the principles and practices of safe, responsible and accessible AI. Join us on this transformative journey and be a part of the next wave of AI-driven transformation.

 

Learn more

To learn more, watch the Microsoft Ignite 2023 sessions to get familiar with other Azure AI announcements and start experimenting with Azure AI Studio and Azure Machine Learning.

 

Version history
Last update: Nov 15 2023 10:51 AM