Challenge: generating PowerPoint slides for an AI Agents series. How do you use AI to automate it?
Introduction
A Learn Live is a series of events where, over 45 to 60 minutes, a presenter walks attendees through a learning module or pathway. Each show takes you through a Microsoft Learn module, challenge, or a particular sample. Between April 15 and May 13, we will be hosting a Learn Live series on "Master the Skills to Create AI Agents." This context matters for this blog because I was tasked with generating the slides for the different presenters.
Challenge: generation of the slides
The series is based on the learning path Develop AI agents on Azure, and each session tackles one of the learn modules in the path. In addition, Learn Live series usually provide each speaker with a presentation template to help run their sessions. Each session has the same format as the learn modules: an introduction, lesson content, an exercise (demo), a knowledge check, and a summary of the module. As the content already exists and the presentation template is provided, creating the slides one by one felt repetitive. And that's where AI comes in: automating slide generation for Learn Live modules.
Step 1 - Gathering the module data
The first step was gathering the data for the learn modules: collecting all the necessary information from the learning path and organizing it in a way that AI can easily process. The learn modules repo is private, and although I have access to it, I wanted to build a solution that could be used externally as well. So instead of pulling the data from the repository, I decided to scrape the learn modules into a Word document using BeautifulSoup. I created a Python script to extract the data, and it works as follows:
- Retrieving the HTML – It sends HTTP requests to the start page and each unit page.
- Parsing Content – Using BeautifulSoup, it extracts elements (headings, paragraphs, lists, etc.) from the page’s main content.
- Populating a Document – With python-docx, it creates and formats a Word document, adding the scraped content.
- Handling Duplicates – It ensures unique unit page links by removing duplicates.
- Polite Scraping – A short delay (using time.sleep) is added between requests to avoid overloading the server.
First, I installed the necessary libraries using: pip install requests beautifulsoup4 python-docx. Next, I ran the script below, which converts the units of the learn module into a Word document:
import requests
from bs4 import BeautifulSoup
from docx import Document
from urllib.parse import urljoin
import time

headers = {"User-Agent": "Mozilla/5.0"}
base_url = "https://learn.microsoft.com/en-us/training/modules/orchestrate-semantic-kernel-multi-agent-solution/"

def get_soup(url):
    # Fetch a page and parse it into a BeautifulSoup tree
    response = requests.get(url, headers=headers)
    return BeautifulSoup(response.content, "html.parser")

def extract_module_unit_links(start_url):
    # Collect the links to each unit page from the module's navigation list
    soup = get_soup(start_url)
    nav_section = soup.find("ul", {"id": "unit-list"})
    if not nav_section:
        print("❌ Could not find unit navigation.")
        return []
    links = []
    for a in nav_section.find_all("a", href=True):
        href = a["href"]
        full_url = urljoin(base_url, href)
        links.append(full_url)
    return list(dict.fromkeys(links))  # remove duplicates while preserving order

def extract_content(soup, doc):
    # Walk the main content area and mirror each element into the Word document
    main_content = soup.find("main")
    if not main_content:
        return
    for tag in main_content.find_all(["h1", "h2", "h3", "p", "li", "pre", "code"]):
        text = tag.get_text().strip()
        if not text:
            continue
        if tag.name == "h1":
            doc.add_heading(text, level=1)
        elif tag.name == "h2":
            doc.add_heading(text, level=2)
        elif tag.name == "h3":
            doc.add_heading(text, level=3)
        elif tag.name == "p":
            doc.add_paragraph(text)
        elif tag.name == "li":
            doc.add_paragraph(f"• {text}", style="List Bullet")
        elif tag.name in ["pre", "code"]:
            doc.add_paragraph(text, style="Intense Quote")

def scrape_full_module(start_url, output_filename="Learn_Module.docx"):
    doc = Document()
    # Scrape and add the content from the start page
    print(f"📄 Scraping start page: {start_url}")
    start_soup = get_soup(start_url)
    extract_content(start_soup, doc)
    all_unit_links = extract_module_unit_links(start_url)
    if not all_unit_links:
        print("❌ No unit links found. Exiting.")
        return
    print(f"🔗 Found {len(all_unit_links)} unit pages.")
    for i, url in enumerate(all_unit_links, start=1):
        print(f"📄 Scraping page {i}: {url}")
        soup = get_soup(url)
        extract_content(soup, doc)
        time.sleep(1)  # polite delay
    doc.save(output_filename)
    print(f"\n✅ Saved module to: {output_filename}")

# 🟡 Replace this with any Learn module start page
start_page = "https://learn.microsoft.com/en-us/training/modules/orchestrate-semantic-kernel-multi-agent-solution/"
scrape_full_module(start_page, "Orchestrate with SK.docx")
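Since each session in the series maps to a module in the same learning path, the same script can be pointed at each module in turn. Here is a minimal sketch of a batch run; the module slugs and output file names in the dictionary are placeholders to swap for the real ones from the learning path:

# Minimal sketch: batch-scrape several modules from the learning path.
# The slugs and file names below are placeholders, not the real module list.
modules = {
    "orchestrate-semantic-kernel-multi-agent-solution": "Orchestrate with SK.docx",
    "some-other-module-slug": "Another Module.docx",
}

for slug, filename in modules.items():
    module_url = f"https://learn.microsoft.com/en-us/training/modules/{slug}/"
    scrape_full_module(module_url, filename)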
Step 2 - Utilizing Microsoft Copilot in PowerPoint
To automate the slide generation, I used Microsoft Copilot in PowerPoint. This tool leverages AI to create slides based on the provided data, which simplifies the process and keeps all the presentations consistent. As I already had the slide template, I created a new presentation based on it. Next, I used Copilot in PowerPoint to generate the slides from the Word document. How did I achieve this?
- I uploaded the Word document generated from the learn module to OneDrive
- In PowerPoint, I went over to Copilot, selected "View prompts", and chose the "Create presentations" prompt
- Next, I added the prompt below, attached the Word document, and generated the slides from the file.
Create a set of slides based on the content of the document titled "Orchestrate with SK". The slides should cover the following sections:
• Introduction
• Understand the Semantic Kernel Agent Framework
• Design an agent selection strategy
• Define a chat termination strategy
• Exercise - Develop a multi-agent solution
• Knowledge check
• Summary
Slide Layout:
Use the custom color scheme and layout provided in the template.
Use Segoe UI family fonts for text and Consolas for code.
Include visual elements such as images, charts, and abstract shapes where appropriate.
Highlight key points and takeaways.
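Copilot handles this step interactively, but if you ever wanted the pipeline scripted end to end, the same idea can be roughed out with python-pptx (installed via pip install python-pptx), the presentation-side sibling of python-docx. The following is only a sketch under stated assumptions, not how Copilot works: it assumes a local template file named template.pptx whose layout 1 is a title-and-content layout, and it turns each Heading 1 in the scraped Word document into a slide, pushing the paragraphs under it into the body placeholder.

from docx import Document
from pptx import Presentation

doc = Document("Orchestrate with SK.docx")    # the scraped module content
prs = Presentation("template.pptx")           # hypothetical template file name
layout = prs.slide_layouts[1]                 # assumes a title-and-content layout

slide = None
for para in doc.paragraphs:
    if para.style.name == "Heading 1":        # each top-level heading starts a new slide
        slide = prs.slides.add_slide(layout)
        slide.shapes.title.text = para.text
    elif slide is not None and para.text.strip():
        body = slide.placeholders[1].text_frame
        body.add_paragraph().text = para.text  # remaining content becomes body text

prs.save("Orchestrate with SK.pptx")

It is a crude draft next to what Copilot produces, but it shows how far the document structure alone can take you.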
Step 3 - Evaluating and Finalizing Slides
Once the slides are generated, if you are happy with how they look, select Keep it. In my case, the slides were generated based on the sections I specified and had all the information needed. The next step was to evaluate the generated slides and add the Learn Live introduction, knowledge check, and conclusion. The goal is to create high-quality presentations that effectively convey the learning content. What more can you do with Copilot in PowerPoint?
- Add speaker notes to the slides (a scripted sketch follows this list)
- Use agents within PowerPoint to streamline your workflow.
- Create your own custom prompts for future use cases
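For instance, speaker notes can also be added in bulk with a short script. A minimal sketch, again with python-pptx, assuming the generated deck was saved as "Orchestrate with SK.pptx"; the notes text is placeholder content to replace with real talking points:

from pptx import Presentation

prs = Presentation("Orchestrate with SK.pptx")   # the deck generated earlier

for i, slide in enumerate(prs.slides, start=1):
    notes = slide.notes_slide.notes_text_frame   # created on first access if missing
    if not notes.text.strip():                   # keep any notes already written
        notes.text = f"Talking points for slide {i} go here."

prs.save("Orchestrate with SK - with notes.pptx")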
Summary - AI for automation
In summary, using AI for slide generation can significantly streamline the process and save time. I was able to automate my work and step in only as a reviewer. The script and PowerPoint generation together took about 10 minutes, something that would previously have taken me an hour, and I only needed to review the output against the learn modules. It produced consistent, high-quality presentations, making it easier for presenters to focus on delivering the content. Now, my question to you is: how can you use AI in your day-to-day work to automate repetitive tasks?