Edge AI for Beginners: Getting Started with Foundry Local
In Module 08 of the EdgeAI for Beginners course, Microsoft introduces Foundry Local, a toolkit that helps you deploy and test Small Language Models (SLMs) completely offline. In this blog, I'll share how I installed Foundry Local, ran the Phi-3.5-mini model on my Windows laptop, and what I learned through the process.

What Is Foundry Local?

Foundry Local allows developers to run AI models locally on their own hardware. It supports text generation, summarization, and code completion, all without sending data to the cloud. Unlike cloud-based systems, everything happens on your computer, so your data never leaves your device.

Prerequisites

Before starting, make sure you have:
- Windows 10 or 11
- Python 3.10 or newer
- Git
- An internet connection (for the first-time model download)
- Foundry Local installed

Step 1: Verify Installation

After installing Foundry Local, open Command Prompt and type:

    foundry --version

If you see a version number, Foundry Local is installed correctly.

Step 2: Start the Service

Start the Foundry Local service using:

    foundry service start

You should see a confirmation message that the service is running.

Step 3: List Available Models

To view the models supported by your system, run:

    foundry model list

You'll get a list of locally available SLMs. Note: model availability depends on your device's hardware. For most laptops, phi-3.5-mini works smoothly on CPU.

Step 4: Run the Phi-3.5 Model

Now let's start chatting with the model:

    foundry model run phi-3.5-mini-instruct-generic-cpu:1

Once it loads, you'll enter an interactive chat mode. Try a simple prompt:

    Hello! What can you do?

The model replies instantly, right from your laptop, no cloud needed. To exit, type:

    /exit

How It Works

Foundry Local loads the model weights from your device and performs inference locally. This means text generation happens using your CPU (or GPU, if available). The result: complete privacy, no internet dependency, and instant responses.

Benefits for Students

For students beginning their journey in AI, Foundry Local offers several key advantages:
- No need for high-end GPUs or expensive cloud subscriptions.
- Easy setup for experimenting with multiple models.
- Perfect for class assignments, AI workshops, and offline learning sessions.
- Promotes a deeper understanding of model behavior by allowing step-by-step local interaction.

These factors make Foundry Local a practical choice for learning environments, especially in universities and research institutions where accessibility and affordability are important.

Why Use Foundry Local

Running models locally offers several practical benefits compared to using AI Foundry in the cloud. With Foundry Local, you do not need an internet connection, and all computation happens on your personal machine. This makes it faster for small models and more private, since your data never leaves your device. In contrast, AI Foundry runs entirely in the cloud, requiring internet access and charging based on usage. For students and developers, Foundry Local is ideal for quick experiments, offline testing, and understanding how models behave in real time. AI Foundry, on the other hand, is better suited for large-scale or production-level scenarios where models need to be deployed at scale.

In summary, Foundry Local provides a flexible and affordable environment for hands-on learning, especially when working with smaller models such as Phi-3, Qwen2.5, or TinyLlama. It allows you to experiment freely, learn efficiently, and better understand the fundamentals of Edge AI development.
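Once the service is running, you can also call the model from Python instead of the interactive CLI. The sketch below is a minimal example rather than the official workflow: it assumes the Foundry Local service exposes an OpenAI-compatible endpoint on localhost, and the port and model ID shown are placeholders you would replace with the values your own service and foundry model list report.

    # Minimal sketch, assuming the local service exposes an OpenAI-compatible
    # endpoint. The port and model ID below are placeholders; use the values
    # reported by your own Foundry Local service and model list.
    from openai import OpenAI  # pip install openai

    client = OpenAI(
        base_url="http://localhost:5273/v1",   # placeholder port for the local service
        api_key="not-needed-for-local",        # the local endpoint does not use a real key
    )

    response = client.chat.completions.create(
        model="phi-3.5-mini-instruct-generic-cpu",  # placeholder; match your listed model ID
        messages=[{"role": "user", "content": "Hello! What can you do?"}],
    )

    print(response.choices[0].message.content)

Because the request never leaves your machine, this is a convenient way to script experiments or build a small chat app on top of the same local model you used in the CLI.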
Optional: Restart Later

Next time you open your laptop, you don't have to reinstall anything. Just run these two commands again:

    foundry service start
    foundry model run phi-3.5-mini-instruct-generic-cpu:1

What I Learned

Following the EdgeAI for Beginners Study Guide helped me understand:
- How edge AI applications work
- How small models like Phi-3.5 can run on a local machine
- How to test prompts and build chat apps with zero cloud usage

Conclusion

Running the Phi-3.5-mini model locally with Foundry Local gave me hands-on insight into edge AI. It's an easy, private, and cost-free way to explore generative AI development. If you're new to Edge AI, start with the EdgeAI for Beginners course and follow its Study Guide to get comfortable with local inference and small language models.

Resources:
- EdgeAI for Beginners GitHub Repo
- Foundry Local Official Site
- Phi Model Link
Step-by-Step: Setting Up GitHub Student and GitHub Copilot as an Authenticated Student Developer

To become an authenticated GitHub Student Developer, follow these steps:
1. Create a GitHub account.
2. Verify your student status through a school email, or contact GitHub support.
3. Sign up for the GitHub Student Developer Pack.
4. Connect to Copilot and activate the GitHub Student Developer Pack benefits.

The GitHub Student Developer Pack offers hundreds of free software offers and other benefits, such as Azure credit, Codespaces, a student gallery, the Campus Experts program, and a learning lab. Copilot provides autocomplete-style suggestions from AI as you code. The Visual Studio Marketplace also offers GitHub Copilot Labs, a companion extension with experimental features, alongside GitHub Copilot itself for autocomplete-style suggestions.
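To make the autocomplete-style suggestions concrete, here is the kind of completion Copilot might propose when you write a descriptive comment and a function signature in a Python file. The function name and suggested body are purely illustrative; Copilot's actual suggestions vary with context.

    # Illustrative example only: after typing the comment and signature below,
    # Copilot typically suggests a body similar to the one shown here.

    # Compute the average grade from a list of (student, grade) tuples.
    def average_grade(records):
        # Hypothetical suggested completion:
        if not records:
            return 0.0
        total = sum(grade for _, grade in records)
        return total / len(records)

    print(average_grade([("Ada", 92), ("Grace", 88)]))  # prints 90.0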
Model Mondays S2E13: Open Source Models (Hugging Face)

1. Weekly Highlights

Here are the key updates we covered in the Season 2 finale:
- O1 Mini Reinforcement Fine-Tuning (GA): Fine-tune models with as few as ~100 samples using built-in Python code graders.
- Azure Live Interpreter API (Preview): Real-time speech-to-speech translation supporting 76 input languages and 143 locales with near human-level latency.
- Agent Factory – Part 5: Connecting agents using open standards like MCP (Model Context Protocol) and A2A (Agent-to-Agent protocol).
- Ask Ralph by Ralph Lauren: A retail example of agentic AI for conversational styling assistance, built on Azure OpenAI and Foundry's agentic toolset.
- VS Code August Release: Brings auto-model selection, stronger safety guards for sensitive edits, and improved agent workflows through new agents.md support.

2. Spotlight – Open Source Models in Azure AI Foundry

Guest: Jeff Boudier, VP of Product at Hugging Face

Jeff showcased the deep integration between the Hugging Face community and Azure AI Foundry, where developers can access over 10,000 open-source models across multiple modalities: LLMs, speech recognition, computer vision, and even specialized domains like protein modeling and robotics.

Demo Highlights
- Discover models through Azure AI Foundry's task-based catalog filters.
- Deploy directly from the Hugging Face Hub to Azure with one-click deployment.
- Explore use cases such as multilingual speech recognition and vision-language-action models for robotics.

Jeff also highlighted notable models, including:
- SmolLM3 – a 3B-parameter model with hybrid reasoning capabilities
- Qwen 3 Coder – a mixture-of-experts model optimized for coding tasks
- Parakeet ASR – multilingual speech recognition
- The Microsoft Research protein-modeling collection
- MAGMA – a vision-language-action model for robotics

Integration extends beyond deployment to programmatic access through the Azure CLI and Python SDKs, plus local development via new VS Code extensions.
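The sketch below illustrates one way that programmatic access can look from Python, using the azure-ai-inference package against a model deployed from the Foundry catalog. The endpoint, key, environment variable names, and example model name are placeholders and assumptions, not values from the episode.

    # Minimal sketch with placeholder endpoint, key, and model name: call a deployed
    # open model through the azure-ai-inference chat completions client.
    # pip install azure-ai-inference
    import os

    from azure.ai.inference import ChatCompletionsClient
    from azure.ai.inference.models import SystemMessage, UserMessage
    from azure.core.credentials import AzureKeyCredential

    client = ChatCompletionsClient(
        endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],  # your deployment's endpoint URL
        credential=AzureKeyCredential(os.environ["AZURE_INFERENCE_KEY"]),
    )

    response = client.complete(
        messages=[
            SystemMessage(content="You are a helpful assistant."),
            UserMessage(content="Summarize what the Hugging Face catalog in Azure AI Foundry offers."),
        ],
        model="Qwen2.5-Coder-32B-Instruct",  # example only; use the name of your deployed model
    )

    print(response.choices[0].message.content)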
3. Customer Story – DraftWise (BUILD 2025 Segment)

The finale featured a customer spotlight on DraftWise, where CEO James Ding shared how the company accelerates contract drafting with Azure AI Foundry.

Problem: Legal contract drafting is time-consuming and error-prone.

Solution: DraftWise uses Azure AI Foundry to fine-tune Hugging Face language models on legal data, generating contract drafts and redline suggestions.

Impact:
- Faster drafting cycles and higher consistency
- Easy model management and deployment with Foundry's secure workflows
- Transparent evaluation for legal compliance

4. Community Story – Hugging Face & Microsoft

The episode also celebrated the ongoing collaboration between Hugging Face and Microsoft and the impact of open-source AI on the global developer ecosystem.

Community Benefits
- Access to State-of-the-Art Models without licensing barriers
- Transparent Performance through public leaderboards and benchmarks
- Rapid Innovation as improvements and bug fixes spread quickly
- Education & Empowerment via tutorials, docs, and active forums
- Responsible AI Practices encouraged through community oversight

5. Key Takeaways

- Open Source AI Is Here to Stay: Azure AI Foundry and Hugging Face make deploying, fine-tuning, and benchmarking open models easier than ever.
- Community Drives Innovation: Collaboration accelerates progress, improves transparency, and makes AI accessible to everyone.
- Responsible AI and Transparency: Open-source models come with clear documentation, licensing, and community-driven best practices.
- Easy Deployment & Customization: Azure AI Foundry lets you deploy, automate, and customize open models from a single, unified platform.
- Learn, Build, Share: The open-model ecosystem is a great place for students, developers, and researchers to learn, build, and share their work.

Sharda's Tips: How I Wrote This Blog

For this final recap, I focused on capturing the energy of the open source AI movement and the practical impact of the Hugging Face and Azure AI Foundry collaboration. I watched the livestream, took notes on the demos and interviews, and linked directly to official resources for models, docs, and community sites. Here's my Copilot prompt for this episode: "Generate a technical blog post for Model Mondays S2E13 based on the transcript and episode details. Focus on open source models, Hugging Face, Azure AI Foundry, and community workflows. Include practical links and actionable insights for developers and students!"

Learn & Connect
- Explore Open Models in Azure AI Foundry
- Hugging Face Leaderboard
- Responsible AI in Azure Machine Learning
- Llama-3 by Meta
- Hugging Face Community
- Azure AI Documentation

About Model Mondays

Model Mondays is your weekly Azure AI learning series:
- 5-Minute Highlights: Latest AI news and product updates
- 15-Minute Spotlight: Demos and deep dives with product teams
- 30-Minute AMA Fridays: Ask anything in Discord or the forum

Start building:
- Watch Past Replays
- Register For AMA
- Recap Past AMAs

Join The Community

Don't build alone! The Azure AI Developer Community is here for real-time chats, events, and support:
- Join the Discord
- Explore the Forum

About Me

I'm Sharda, a Gold Microsoft Learn Student Ambassador focused on cloud and AI. Find me on GitHub, Dev.to, Tech Community, and LinkedIn. In this blog series, I share takeaways from each week's Model Mondays livestream.
📣 Getting Started with AI and MS Copilot (Portuguese)

👋 Hello, educators! I hope you are all doing well! Shall we explore the world of artificial intelligence with Microsoft Copilot together? Join the session "Introduction to AI with MS Copilot" (Introdução à IA com o MS Copilot), designed especially for educators who are just getting started with Copilot. In this hands-on, interactive conversation, we will design a dream destination together using AI, while learning the fundamentals of generative AI, how to write good prompts, and how to apply these tools in the classroom.

📌 The session will be held in Portuguese, with real-world examples, materials provided, and time to ask questions and practice.

📅 Meeting via Microsoft Teams: Click here to join at the scheduled time.
Simulation game designed to train students to handle a bankruptcy case using Power Platform
Technology is transforming the education sector, increasingly helping teachers around the world provide better, more interactive learning experiences. Dutch institution Avans University of Applied Sciences is a prime example. There, a law teacher with no technical background recently created a simulation game that trains students to handle a bankruptcy case as it would unfold in real life. Built with Microsoft Power Platform, the game takes students through the various steps of the case with the help of a chatbot created in Power Virtual Agents, which interacts with students and asks them legal questions.
Azure Digital Twins Microsoft Learn - Learning Pathway

The goal of the new Learn path is to take the learner on a journey of creating an end-to-end, industry-based solution using Azure Digital Twins (ADT). The hands-on exercises are built around a manufacturing scenario: learners use the Chocolate Manufacturing Factory example to complete the practical units and build out the end-to-end solution. Microsoft Azure learning path for developers: Develop with Azure Digital Twins.
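To give a flavor of what developing with ADT looks like in code, here is a minimal Python sketch that lists the twins in an Azure Digital Twins instance using the azure-digitaltwins-core SDK. The instance URL is a placeholder, and the sketch assumes you have already created the instance and uploaded models and twins for the factory scenario.

    # Minimal sketch (placeholder instance URL): list the twins in an Azure Digital
    # Twins instance, assuming models and twins for the factory scenario already exist.
    # pip install azure-digitaltwins-core azure-identity
    from azure.identity import DefaultAzureCredential
    from azure.digitaltwins.core import DigitalTwinsClient

    # Replace with your own instance's host name.
    adt_url = "https://<your-adt-instance>.api.weu.digitaltwins.azure.net"
    client = DigitalTwinsClient(adt_url, DefaultAzureCredential())

    # Query every twin and print its ID and the model it was created from.
    for twin in client.query_twins("SELECT * FROM digitaltwins"):
        print(twin["$dtId"], twin["$metadata"]["$model"])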
Free Microsoft Fundamentals certifications for worldwide students

Microsoft offers Fundamentals exam vouchers and practice resources to eligible students free of charge through June 2023. Students will need to verify their enrollment at an accredited academic institution to claim the benefits.
Announcing GitHub Universe Cloud Skills Challenge!

Join the GitHub Universe Cloud Skills Challenge and start your exciting journey in AI! Whether you're just beginning or looking to change your career, this learning experience is designed to introduce you to some of the most requested GitHub tools for AI beginners and to explore new opportunities.