DevOps in the era of Generative AI: Foundations of LLMOps
Published Jun 02 2024

Spotlight on AI in your DevOps Lifecycle

Explore the transformative power of artificial intelligence in DevOps with our comprehensive series, "Spotlight on AI in Your DevOps Lifecycle." This series delves into the integration of AI into every stage of the DevOps process, providing invaluable insights and practical guidance. Whether you're a seasoned professional or new to the field, these episodes will equip you with the knowledge to leverage AI effectively in your development and operations lifecycle.

Sessions: Register Now.


DevOps in the era of Generative AI: Foundations of LLMOps

With the advent of generative AI, the development life cycle of intelligent applications has undergone a significant change. The shift from classical ML to LLM-based solutions has implications not only for how we build applications but also for how we test, evaluate, deploy, and monitor them. This makes LLMOps an important development, and it requires understanding the foundations of this new approach to DevOps.

The session "DevOps in the era of Generative AI: Foundations of LLMOps" will explore the basics of LLMOps, providing examples of tools and practices available in the Azure ecosystem. This talk will be held on June 12th, 2024.


Watch On Demand

Continuous Integration and Continuous Delivery (CI/CD) for AI
The session "Continuous Integration and Continuous Delivery (CI/CD) for AI" will focus on MLOps for machine learning and AI projects. This talk will cover how to set up CI/CD and collaborate with others using GitHub. It will also discuss version control, automated testing, and deployment strategies.
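To make the automated-testing idea concrete, here is a minimal sketch of the kind of test a CI pipeline could run against model output. The `summarize` function is a hypothetical placeholder, not part of any session material; in a real pipeline it would call your deployed LLM endpoint.

```python
# Hypothetical stand-in for a call to a deployed model endpoint.
def summarize(text: str) -> str:
    """Placeholder model call: return the first sentence."""
    return text.split(".")[0].strip() + "."

def test_summary_is_shorter_than_input():
    text = ("LLMOps extends DevOps practices to LLM-based applications. "
            "It covers testing, deployment, and monitoring.")
    summary = summarize(text)
    # Property-based checks like this tolerate non-deterministic model
    # output better than exact string comparisons do.
    assert len(summary) < len(text)

def test_summary_is_nonempty():
    assert summarize("Hello world. More text.").strip()

test_summary_is_shorter_than_input()
test_summary_is_nonempty()
print("all checks passed")
```

Tests like these can run on every pull request via GitHub Actions, failing the build before a regression reaches production.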

The session will take place on June 20th, 2024, from 6:00 PM to 7:00 PM (UTC).
Register Now.

Monitoring, Logging, and AI Model Performance
Building an AI application does not stop at deployment. The core of any AI application is the AI model that performs certain tasks and provides predictions to users. However, AI models and their responses change over time, and our applications need to adapt to these changes in a scalable and automated way.
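As a rough illustration of that adaptation loop, the sketch below tracks rolling latency and quality metrics for a model and raises alerts when they degrade. It uses only the Python standard library; the class name, window size, and thresholds are illustrative assumptions, not recommendations from the session.

```python
import statistics
from collections import deque

class ModelMonitor:
    """Track rolling latency and quality scores for a deployed model."""

    def __init__(self, window: int = 100, latency_slo_s: float = 2.0):
        self.latencies = deque(maxlen=window)
        self.scores = deque(maxlen=window)  # e.g. eval scores or user feedback
        self.latency_slo_s = latency_slo_s

    def record(self, latency_s: float, score: float) -> None:
        self.latencies.append(latency_s)
        self.scores.append(score)

    def alerts(self) -> list[str]:
        out = []
        if self.latencies and statistics.mean(self.latencies) > self.latency_slo_s:
            out.append("latency SLO breached")
        # Require a minimum sample before judging quality drift.
        if len(self.scores) >= 10 and statistics.mean(self.scores) < 0.7:
            out.append("quality below threshold")
        return out

monitor = ModelMonitor(window=50)
for _ in range(20):
    monitor.record(latency_s=0.8, score=0.9)  # healthy traffic
print(monitor.alerts())  # → []
```

In production, the alert list would feed an alerting system (for example, Azure Monitor) so that degraded model behavior triggers automated responses rather than manual inspection.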

The session "Monitoring, Logging, and AI Model Performance" will explore how to use tools to monitor the performance of AI models and adapt to changes in a scalable way. This talk will be held on June 26th, 2024, from 4:00 PM to 5:00 PM (UTC).
Register Now.

Scaling and Maintaining Your Applications on Azure
Azure is a popular cloud platform that provides many benefits for running AI applications. This session will focus on the practical aspects of running your applications on Azure, with a special emphasis on leveraging Azure OpenAI and Python FastAPI. The talk will cover best practices for scaling your applications to meet demand and maintaining their health and performance.
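One common scaling practice the session's topic suggests is retrying throttled requests with exponential backoff. Below is a minimal stdlib-only sketch; `call_model` and `RateLimitError` are hypothetical placeholders, and a real application would issue the HTTP request to Azure OpenAI here instead.

```python
import random
import time

class RateLimitError(Exception):
    """Raised when the service throttles a request (hypothetical)."""

def call_model(prompt: str) -> str:
    """Placeholder for a real model call that may be throttled."""
    return f"response to: {prompt}"

def call_with_backoff(prompt: str, max_retries: int = 5) -> str:
    delay = 1.0
    for _ in range(max_retries):
        try:
            return call_model(prompt)
        except RateLimitError:
            # Exponential backoff with jitter spreads retries out,
            # giving the service room to recover under load.
            time.sleep(delay + random.uniform(0, 0.5))
            delay *= 2
    raise RuntimeError("model call failed after retries")

print(call_with_backoff("health check"))  # → response to: health check
```

A FastAPI handler would wrap its model calls in `call_with_backoff` so that transient throttling does not surface as user-facing errors.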

The session will be held on July 3rd, 2024, from 4:00 PM to 5:00 PM (UTC).
Register Now.

Security, Ethics, and Governance in AI
AI brings many exciting new features into the tech landscape, but it also introduces new security risks and challenges. In this session, we will learn about the best practices and tools for securing AI-enabled applications and addressing ethical and governance issues related to AI.

The session will take place on July 10th, 2024, from 4:00 PM to 5:00 PM (UTC).

Register Now.

Last update: Jun 19 2024