github
GitHub Copilot Blueprinting Opportunity
Greetings! GitHub is updating a credential for GitHub Copilot, and we need your input through our exam blueprinting survey. This is the third of three blueprinting opportunities we'll be asking you to participate in; if you're qualified, feel free to complete them all. The blueprint determines how many questions each skill in the exam will be assigned. Please complete the online survey by November 20, 2025. Please also feel free to forward the survey to any colleagues you consider subject matter experts for this certification. If you have any questions, contact Alicia Gorriaran at aliciagorriaran@github.com. GitHub Copilot blueprint survey link: https://microsoftlearning.co1.qualtrics.com/jfe/form/SV_0Nh4GDZTXjMcOua Thank you!

GitHub Actions Blueprint Survey Opportunity
Greetings! This is the second of three blueprinting opportunities we'll be asking you to participate in; if you're qualified, feel free to complete them all. GitHub is updating a credential for GitHub Actions, and we need your input through our exam blueprinting survey. The blueprint determines how many questions each skill in the exam will be assigned. Please complete the online survey by November 20, 2025. Please also feel free to forward the survey to any colleagues you consider subject matter experts for this certification. If you have any questions, contact Alicia Gorriaran at aliciagorriaran@github.com. GitHub Actions blueprint survey link: https://microsoftlearning.co1.qualtrics.com/jfe/form/SV_3JfLywDLgRfD3oy Thank you!

GitHub Foundations Blueprint Survey Opportunity
Greetings! GitHub is updating a credential for GitHub Foundations, and we need your input through our exam blueprinting survey. The blueprint determines how many questions each skill in the exam will be assigned. Please complete the online survey by November 20, 2025. Please also feel free to forward the survey to any colleagues you consider subject matter experts for this certification. You may send this to people external to Microsoft and GitHub as well. If you have any questions, contact Alicia Gorriaran at aliciagorriaran@github.com. GitHub Foundations blueprint survey link: https://microsoftlearning.co1.qualtrics.com/jfe/form/SV_d0waGGB2LawUlam Thank you!

Edge AI for Beginners: Getting Started with Foundry Local
In Module 08 of the EdgeAI for Beginners course, Microsoft introduces Foundry Local, a toolkit that helps you deploy and test Small Language Models (SLMs) completely offline. In this blog, I'll share how I installed Foundry Local, ran the Phi-3.5-mini model on my Windows laptop, and what I learned through the process.

What Is Foundry Local?

Foundry Local allows developers to run AI models locally on their own hardware. It supports text generation, summarization, and code completion, all without sending data to the cloud. Unlike cloud-based systems, everything happens on your computer, so your data never leaves your device.

Prerequisites

Before starting, make sure you have:

- Windows 10 or 11
- Python 3.10 or newer
- Git
- An internet connection (for the first-time model download)
- Foundry Local installed

Step 1 — Verify Installation

After installing Foundry Local, open Command Prompt and type:

    foundry --version

If you see a version number, Foundry Local is installed correctly.

Step 2 — Start the Service

Start the Foundry Local service using:

    foundry service start

You should see a confirmation message that the service is running.

Step 3 — List Available Models

To view the models supported by your system, run:

    foundry model list

You'll get a list of locally available SLMs. Note: model availability depends on your device's hardware. For most laptops, phi-3.5-mini runs smoothly on CPU.

Step 4 — Run the Phi-3.5 Model

Now let's start chatting with the model:

    foundry model run phi-3.5-mini-instruct-generic-cpu:1

Once it loads, you'll enter an interactive chat mode. Try a simple prompt:

    Hello! What can you do?

The model replies right from your laptop, no cloud needed. To exit, type:

    /exit

How It Works

Foundry Local loads the model weights from your device and performs inference locally. This means text generation happens using your CPU (or GPU, if available). The result: complete privacy, no internet dependency, and fast responses.
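Beyond the interactive chat mode, the running service can also be called from code. The sketch below is a minimal example, assuming the Foundry Local service exposes an OpenAI-compatible chat-completions endpoint on localhost; the port in `BASE_URL` is a placeholder you would replace with the one your own service reports, and the model name matches the one used in the steps above.

```python
import json
import urllib.request

# Assumption: the Foundry Local service exposes an OpenAI-compatible
# endpoint on localhost. The port below is a placeholder -- substitute
# the address your local service actually reports.
BASE_URL = "http://localhost:5273/v1"


def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completions payload for a single prompt."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def chat(model: str, prompt: str, base_url: str = BASE_URL) -> str:
    """Send one prompt to the local service and return the reply text."""
    payload = build_chat_request(model, prompt)
    req = urllib.request.Request(
        base_url + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-style responses carry the reply in choices[0].message.content
    return body["choices"][0]["message"]["content"]


# Example (with the service running):
#   reply = chat("phi-3.5-mini-instruct-generic-cpu:1", "Hello! What can you do?")
```

Because the request never leaves localhost, this keeps the same privacy property as the CLI chat: prompts and replies stay on your device.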
Benefits for Students

For students beginning their journey in AI, Foundry Local offers several key advantages:

- No need for high-end GPUs or expensive cloud subscriptions.
- Easy setup for experimenting with multiple models.
- Perfect for class assignments, AI workshops, and offline learning sessions.
- Promotes a deeper understanding of model behavior by allowing step-by-step local interaction.

These factors make Foundry Local a practical choice for learning environments, especially in universities and research institutions where accessibility and affordability are important.

Why Use Foundry Local

Running models locally offers several practical benefits compared to using AI Foundry in the cloud. With Foundry Local, you do not need an internet connection, and all computation happens on your personal machine. This makes it faster for small models and more private, since your data never leaves your device. In contrast, AI Foundry runs entirely in the cloud, requiring internet access and charging based on usage. For students and developers, Foundry Local is ideal for quick experiments, offline testing, and understanding how models behave in real time. AI Foundry, on the other hand, is better suited for large-scale or production scenarios where models need to be deployed at scale.

In summary, Foundry Local provides a flexible and affordable environment for hands-on learning, especially when working with smaller models such as Phi-3, Qwen2.5, or TinyLlama. It allows you to experiment freely, learn efficiently, and better understand the fundamentals of Edge AI development.

Optional: Restart Later

Next time you open your laptop, you don't have to reinstall anything.
Just run these two commands again:

    foundry service start
    foundry model run phi-3.5-mini-instruct-generic-cpu:1

What I Learned

Following the EdgeAI for Beginners Study Guide helped me understand:

- How edge AI applications work
- How small models like Phi-3.5 can run on a local machine
- How to test prompts and build chat apps with zero cloud usage

Conclusion

Running the Phi-3.5-mini model locally with Foundry Local gave me hands-on insight into edge AI. It's an easy, private, and cost-free way to explore generative AI development. If you're new to Edge AI, start with the EdgeAI for Beginners course and follow its Study Guide to get comfortable with local inference and small language models.

Resources:

- EdgeAI for Beginners GitHub Repo
- Foundry Local Official Site
- Phi Model Link

Step-by-Step: Setting Up GitHub Student and GitHub Copilot as an Authenticated Student Developer
To become an authenticated GitHub Student Developer, follow these steps: create a GitHub account, verify your student status through a school email (or by contacting GitHub support), sign up for the Student Developer Pack, connect to Copilot, and activate the GitHub Student Developer Pack benefits. The GitHub Student Developer Pack offers hundreds of free software offers and other benefits such as Azure credit, Codespaces, a student gallery, the Campus Experts program, and a learning lab. Copilot provides autocomplete-style suggestions from AI as you code. The Visual Studio Marketplace also offers GitHub Copilot Labs, a companion extension with experimental features, alongside GitHub Copilot for autocomplete-style suggestions.

Setting up your GitHub Student and GitHub Copilot as an authenticated GitHub Student Developer

Redeeming Azure for Student from your GitHub Student Pack when you do not have an Academic Email
GitHub Student Developer Pack: Learn to ship software like a pro. There's no substitute for hands-on experience, but for most students, real-world tools can be cost-prohibitive. That's why we created the GitHub Student Developer Pack with some of our partners and friends. Sign up for the Student Developer Pack.

CI/CD GitHub Deployment from Dev to UAT Synapse Workspace not Picking Up UAT Resources
Hello, I am setting up CI/CD for Azure Synapse Analytics using GitHub Actions with multiple environments (Dev, UAT, Prod). My Synapse resources are:

- Dev: ************-dev, azcalsbdatalakedev, calsbvaultdev, SQL DB azcalsbazuresqldev / MetaData
- UAT: ************-uat, azcalsbdatalakeuat, calsbvaultuat, SQL DB azcalsbazuresqluat / MetaData
- Prod: ***********-prod, azcalsbdatalakeprod, azcalsbvaultprod, SQL DB azcalsbazuresqlprod / MetaData

I have environment-specific parameter override files like uat.json and prod.json. My GitHub workflows (synapse-dev.yml, synapse-uat.yml, etc.) deploy the Synapse publish artifacts (TemplateForWorkspace.json and TemplateParametersForWorkspace.json) with those overrides.

Issue: When I run the UAT workflow, the deployment completes successfully but the UAT Synapse workspace still shows Dev resources. For example, linked services like LS_ADLS still point to azcalsbdatalakedev instead of azcalsbdatalakeuat.

What I have tried:

- Created overrides for UAT (uat.json) with the correct workspace name and connection strings
- Checked the GitHub workflow YAML to confirm the override file is being passed in the az deployment group create step
- Verified that the Dev deployment works fine
- Tried changing default values in the linked services JSON, but behavior is inconsistent

Questions:

- Is there a specific way to structure override files (uat.json) for Synapse CI/CD deployments so environment values are correctly replaced?
- Do I need separate branches in GitHub for Dev, UAT, and Prod, or can I deploy to all environments from main with overrides?
- Has anyone else seen linked services or parameters still pointing to Dev even after a UAT deployment?

Any guidance, best practices, or sample YAML and override examples would be very helpful. Thanks in advance.

Unlock the Power of AI with GitHub Models: A Hands-On Guide
Ready to elevate your coding game? Imagine having the power of advanced AI at your fingertips, ready to integrate into your projects with just a few clicks. Whether you're building a smart assistant, automating workflows, or creating the next big thing, GitHub Models are here to make it happen. Dive into our guide and discover how to get started, customize responses, and even build your own AI-powered applications—all from within the familiar GitHub interface. Your journey into the world of AI starts now. Click to explore and let your creativity take flight!

Create an Active Student badge on Microsoft Learn
Description: I suggest adding an official "Active Student" badge to the Microsoft Community and Microsoft Learn platforms. This badge would:

- Highlight students' commitment to learning.
- Encourage continuous participation through visible recognition.
- Connect learning achievements (Learn) with community contributions (Community Hub).
- Provide a public credential that can be showcased on a CV or professional profile.

Such a symbolic addition would strengthen motivation, visibility, and the bridge between Microsoft Learn and the Community.

GitHub Administrator Blueprinting Opportunity
Greetings! GitHub is updating a credential for GitHub Administrator, and we need your input through our exam blueprinting survey. The blueprint determines how many questions each skill in the exam will be assigned. Please complete the online survey by October 17, 2025. Please also feel free to forward the survey to any colleagues you consider subject matter experts for this certification. You may send this to people external to Microsoft and GitHub as well. If you have any questions, contact John Sowles at josowles@microsoft.com or Alicia Gorriaran at aliciagorriaran@github.com. GitHub Administrator blueprint survey link: https://microsoftlearning.co1.qualtrics.com/jfe/form/SV_9st4ir1V8sngkdw Thank you!