Welcome to Episode 8! This week, we explored how AI is moving from the cloud to your own device, making it faster, more private, and more accessible. We also saw a real-world customer story from Xander Glasses, showing how AI can help people with hearing loss by providing sight for sound.
1. Weekly Highlights
The episode kicks off with highlights covering the top 5 news items from the past week.
- RFT Observability is in public preview, with auto-evals for Reinforcement Fine-Tuning.
- GitHub Copilot Pro with Spark is in public preview. Go from idea to deployed app in minutes!
- DAViD: Data-efficient and Accurate Vision Models from Synthetic Data: Paper & Datasets!
- The Future Of AI: Optimize Your Site For Agents - best practices including Microsoft NLWeb.
- MCP For Beginners: The Video Series - 10+ part video series accompanying the curriculum!
2. Spotlight On: Foundry Local with Maanav Dalal
This episode put the spotlight on local and on-device AI by looking at the Foundry Local product from Core AI. Watch the video to see a live demo of its capabilities from Core AI Product Manager, Maanav Dalal.
- What is Foundry Local? - A toolkit to run open-source AI models directly on your device (CPU, GPU, NPU) for fast, private, and efficient inference.
- Why does it matter? - Developers can work with data locally, switch seamlessly between local and cloud, and leverage rich tooling (CLI, SDK, REST API).
- Demo Highlights:
  - Install Foundry Local (GitHub Repo)
  - List and run models on device
  - Use RAG (Retrieval-Augmented Generation) with local documents
  - Monitor deployments and cache models
  - Integrate with VS Code and GitHub Copilot (AI Toolkit for VS Code)
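To make the demo steps above concrete, here is a minimal sketch of talking to a locally running model. This assumes Foundry Local is installed and serving an OpenAI-compatible chat-completions endpoint; the port (`5273`) and model alias (`phi-3.5-mini`) are hypothetical examples, so check `foundry service status` and `foundry model list` on your own machine for the real values.

```python
import json
import urllib.request

# Hypothetical endpoint -- Foundry Local assigns the actual port at install
# time; run `foundry service status` to find yours.
ENDPOINT = "http://localhost:5273/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload for the local endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }

def ask_local_model(model: str, prompt: str) -> str:
    """POST the payload to the local server and return the reply text."""
    body = json.dumps(build_chat_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        data = json.loads(resp.read())
    return data["choices"][0]["message"]["content"]

# Example usage (requires a running Foundry Local service):
# print(ask_local_model("phi-3.5-mini", "Summarize RAG in one sentence."))
```

Because the request shape matches the OpenAI API, the same code can point at Azure AI Foundry in the cloud later with only the endpoint (and auth) changed, which is the local-to-cloud flexibility the demo highlights.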
3. Customer Story: Xander Glasses
Customer Stories puts the focus on Azure AI usage in the real world. In this episode we talk to the team behind Xander Glasses, a wearable assistive device that provides sight for sound with real-time captioning.
- Problem: Real-time captioning for people with hearing loss, especially in noisy environments or where Wi-Fi and cloud connectivity are unreliable.
- Solution: Smart glasses with the embedded Microsoft Speech SDK, running speech-to-text locally for instant captions.
- Innovation: Using generative AI to simplify captions for users with cognitive disabilities, and supporting multiple languages and translation features.
- Learn more about the device experience: XanderGlasses
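The captioning pipeline described above can be sketched in a few lines with the Microsoft Speech SDK. This is a simplified illustration, not Xander's actual implementation: it assumes the `azure-cognitiveservices-speech` package and a Speech resource key and region (the `YOUR_KEY`/`YOUR_REGION` placeholders are hypothetical), and the display width of 32 characters is an invented example for a small glasses screen.

```python
import textwrap

def format_caption(text: str, width: int = 32, max_lines: int = 2) -> str:
    """Wrap a recognized phrase into short lines that fit a small display,
    keeping only the most recent lines."""
    lines = textwrap.wrap(text, width=width) or [""]
    return "\n".join(lines[-max_lines:])

def run_captions():
    """Continuous speech-to-text from the default microphone.
    Requires `pip install azure-cognitiveservices-speech` and real
    credentials in place of the placeholders below."""
    import azure.cognitiveservices.speech as speechsdk

    config = speechsdk.SpeechConfig(subscription="YOUR_KEY", region="YOUR_REGION")
    recognizer = speechsdk.SpeechRecognizer(speech_config=config)

    # Emit a display-ready caption each time a phrase is finalized.
    recognizer.recognized.connect(
        lambda evt: print(format_caption(evt.result.text))
    )
    recognizer.start_continuous_recognition()
    input("Listening... press Enter to stop.\n")
    recognizer.stop_continuous_recognition()
```

Running recognition on-device, as Xander does, removes the round trip to the cloud that makes captions lag in places with unreliable connectivity.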
4. Key Takeaways
Here are the key learnings I had from this episode.
- On-Device AI Inference is Here - Foundry Local lets you run powerful models on your own hardware, unlocking speed and privacy benefits while keeping the flexibility to move to the cloud when needed.
- Seamless Integration With Existing Workflows - Foundry Local provides an intuitive CLI, SDK, and REST API, with easy scaling from local development to Azure AI Foundry (cloud) as needed.
- Real-World Azure AI Impact - Xander Glasses show how AI can make a difference for accessibility, delivering instant, reliable captions with the Microsoft Speech SDK.
- Site Optimization For Agents: Building APIs and semantic markup into your site makes it accessible to AI agents. Explore NLWeb and other technologies that can help.
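One common form of the semantic markup mentioned in that last takeaway is schema.org structured data embedded as JSON-LD, which is also the foundation NLWeb builds on. Here is a small sketch that generates such a snippet; the product name, description, and URL are hypothetical example values.

```python
import json

def product_jsonld(name: str, description: str, url: str) -> str:
    """Return a schema.org Product record as a JSON-LD string.
    Agents can parse this structured data instead of scraping raw HTML."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "description": description,
        "url": url,
    }
    return json.dumps(data, indent=2)

# Hypothetical example values -- substitute your own site's details.
snippet = product_jsonld(
    "XanderGlasses",
    "Wearable captioning glasses for people with hearing loss.",
    "https://example.com/xanderglasses",
)
# Embed the result in your page inside:
# <script type="application/ld+json"> ... </script>
```

Pairing markup like this with a documented API gives agents two reliable entry points to your site: structured data to read and endpoints to act on.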
Sharda's Tips: How I Wrote This Blog
Writing this blog felt like sharing my own learning journey with friends. I started by thinking about why this topic matters and how it could help someone new to Azure. I tried to explain things in a way that’s easy to understand, using simple language and real examples from the episode. To organize my thoughts and make sure I didn’t miss anything important, I used GitHub Copilot.
Here’s the prompt I gave Copilot to help me draft this blog:
Generate a technical blog post for Model Mondays S2E8 based on the transcript and episode details. Focus on Foundry Local, Xander Glasses, and real-world demos. Explain the concept for students, add a section on practical applications, and share tips for writing technical blogs. Make it clear, engaging, and useful for developers and students.
After watching the video, I felt inspired to try out these tools myself. The way the speakers explained and demonstrated everything made me believe that anyone can get started, no matter their background. My goal with this blog is to help you feel the same way—curious, confident, and ready to explore what AI and Azure can do for you. If you have questions or want to share your own experience, I’d love to hear from you.
Coming Up Next Week:
Want to build agentic AI applications but looking for resources to learn configuration and design patterns? Join us as we talk to Mona Whalin about the Azure AI Foundry Agent Catalog – open-source samples you can explore and integrate into your projects to accelerate your agent development!
1️⃣ | Register For The Livestream - Aug 11, 2025
2️⃣ | Register For The AMA - Aug 15, 2025
3️⃣ | Ask Questions & View Recaps - Discussion Forum
About Model Mondays
Model Mondays is a weekly series designed to help you build your Azure AI Foundry Model IQ with three elements:
- 5-Minute Highlights – Quick news and updates about Azure AI models and tools on Monday
- 15-Minute Spotlight – Deep dive into a key model, protocol, or feature on Monday
- 30-Minute AMA on Friday – Live Q&A with subject matter experts from the Monday livestream
Want to get started?
- Register For Livestreams - every Monday at 1:30pm ET
- Watch Past Replays to revisit other spotlight topics
- Register For AMA - to join the next AMA on the schedule
- Recap Past AMAs - check the AMA schedule for episode-specific links
Join The Community
Great devs don't build alone! In a fast-paced developer ecosystem, there's no time to hunt for help. That's why we have the Azure AI Developer Community. Join us today and let's journey together!
- Join the Discord - for real-time chats, events & learning
- Explore the Forum - for AMA recaps, Q&A, and help!
About Me:
I'm Sharda, a Gold Microsoft Learn Student Ambassador interested in cloud and AI. Find me on GitHub, Dev.to, Tech Community, and LinkedIn. In this blog series I summarize my takeaways from each week's Model Mondays livestream.