Azure IoT Hub with ADR (preview): Extending Azure capabilities and certificate management to IoT
Operational excellence in every industry begins by linking the physical world to the digital, enabling organizations to turn raw data from connected assets into actionable insights and real-world improvements. Azure IoT Hub and Azure IoT Operations make this possible by seamlessly integrating data from machines, whether on a single factory floor or spread across the globe, into a unified platform. Together, they serve as the backbone of connected operations, ensuring that assets, sensors, and systems contribute to a continuous loop of intelligence. This data is then moved to Microsoft Fabric for real-time analytics and further leveraged by AI agents to drive informed decisions. This approach lets organizations scale efficiently, unifying teams, sites, and systems under the adaptive cloud strategy. It enables use of cloud-native and AI technologies across hybrid, multi-cloud, edge, and IoT environments in a single operational model.

Azure IoT Hub empowers organizations to securely and reliably manage connected assets across the globe, providing real-time visibility and control over diverse operations. With proven scalability, broad device support, and robust management tools, IoT Hub delivers a unified platform for developing and operating IoT solutions. Organizations in various industries are using Azure IoT Hub to enhance their operations. In mining, sensors provide real-time safety data and support compliance. Fleet managers track equipment health to boost efficiency and prevent failures, while rail operators use GPS and vibration sensors for precise monitoring and issue detection. Ports utilize conveyor and loading system metrics to optimize scheduling and reduce delays. These examples show how Azure IoT Hub delivers actionable insights, greater safety, and operational efficiency through connected devices. As customers evolve, Azure IoT Hub continues to advance, deepening its integration with the Azure ecosystem and enabling AI-driven, connected operations for the next generation of applications.

Today, we're announcing the public preview of Azure IoT Hub integration with Azure Device Registry, bringing IoT devices under the purview of the Azure management plane via ARM resource representation and securing them with best-in-class, Microsoft-backed X.509 certificate management capabilities.

From Connected Devices to Connected Operations

Ready-to-use AI platforms are enabling organizations to unlock untapped operational data and gain deeper insights. Organizations are leveraging AI to unify machine and enterprise data, extract actionable insights, and translate them into measurable business gains. They are broadly transitioning from connected devices that simply gather and transmit telemetry to connected operations, which empower supervisors and AI agents to interpret events and respond to scenarios in real time.

The integration of Azure IoT Hub with ADR extends the comprehensive capabilities of Azure to IoT devices. With this integration, Azure Device Registry (ADR) acts as the unified control plane for managing both physical assets from Azure IoT Operations and devices from Azure IoT Hub. It provides a centralized registry, ensuring that every entity, whether an industrial asset or a connected device, is uniquely represented and managed throughout its lifecycle. By integrating with Azure IoT Hub, ADR enables consistent device onboarding, certificate management, and operational visibility at scale. This integration simplifies large-scale IoT fleet management and supports compliance and auditability across diverse deployments.
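Because devices in this model surface as standard ARM resources, existing Azure tooling can enumerate them alongside the rest of an Azure estate. The sketch below illustrates the idea with the Azure SDK for Python; the ADR device resource type used in the filter is a hypothetical placeholder for illustration only, since the exact resource type and schema of the preview may differ.

```python
# A minimal sketch of listing IoT devices as ARM resources.
# Assumes the azure-identity and azure-mgmt-resource packages and an
# existing subscription; the resource type below is a hypothetical
# placeholder for the preview ADR device representation.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"  # placeholder

credential = DefaultAzureCredential()
client = ResourceManagementClient(credential, SUBSCRIPTION_ID)

# Hypothetical resource type for devices registered in an ADR namespace.
device_type = "Microsoft.DeviceRegistry/namespaces/devices"

for resource in client.resources.list(filter=f"resourceType eq '{device_type}'"):
    # Each device shows up like any other ARM resource: id, name, location, tags.
    print(resource.name, resource.id)
```

The same enumeration could just as easily feed a fleet dashboard, a policy audit, or an Azure Resource Graph query; the value is that devices participate in the management plane like any other Azure resource.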
What's New in this Preview

We're excited to announce the public preview of new capabilities that bring IoT devices into the broader Azure ecosystem. This integration allows IoT to be managed at scale through the Azure management plane. It also strengthens security and enables consistent governance across large deployments:

Deep integration with Azure: The Azure Device Registry (ADR) now offers a unified control plane, simplifying identity, security, and policy management for millions of devices. New ADR features make it easier to register, classify, and monitor devices, supporting consistent governance and better operational insights. Combined with Device Provisioning Service (DPS), these enhancements help reduce deployment challenges, speed up time-to-value, and lower operational risks. With IoT Hub integration, IoT Hub devices are represented as Azure resources, providing:
One unified registry across multiple IoT hubs and Azure IoT Operations (AIO) instances.
ARM-based management for all Azure resources from cloud to edge.
A consolidated view of the entire IoT fleet, simplifying large-scale deployment, monitoring, and management.

Certificate lifecycle management: Now in public preview, this capability enables secure onboarding and automated certificate rotation for IoT devices, directly integrated with ADR and IoT Hub. X.509 certificates are widely recognized for providing a robust security posture by establishing trusted, cryptographically verifiable device identities. Starting today, customers can use a Microsoft-backed PKI to issue X.509 certificates across their IoT fleets. Devices receive operational certificates, chained to Certificate Authorities (CAs), that they use to authenticate with IoT Hub (a minimal device-side sketch appears at the end of this post). Policy-driven lifecycle management makes certificate renewal simpler and keeps state in sync with your hubs.

This integration sets the stage for Physical AI by connecting digital and physical systems, unlocking new possibilities for data and artificial intelligence.

Customer feedback from Private Preview

This release has received positive feedback from private preview customers, particularly on the Microsoft-supported PKI and certificate management capabilities; customers highlighted that their previous manual processes were inefficient and fragmented. Customers further noted the advantages of grouping devices from multiple IoT Hubs under a unified namespace, which streamlined management. Moreover, the integration of certificate management within ADR has reduced the reliance on custom solutions.

"We were genuinely impressed by how seamless it was to implement. With just a few clicks, clear policy definitions, and two calls in firmware, the entire process became automated, frictionless, and reliable with no external dependencies." – Uriel Kluk, CTO, Mesh Systems

Why It Matters

These investments make Azure IoT Hub the cornerstone for connected operations at scale, empowering customers to:
Reduce manual certificate operations with policy-driven rotation (fewer outages due to expired certs).
Consolidate the device registry in ADR for cross-hub fleet governance.
Accelerate compliance audits with centralized certificate lineage.
Apply advanced AI tooling for predictive insights and automation.

Call to Action

Explore the new capabilities in public preview today and start building the next generation of connected operations with Azure IoT Hub and ADR. Learn more in the Azure IoT Hub documentation.
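To make the certificate-based authentication described above concrete, here is a minimal device-side sketch using the Azure IoT device SDK for Python. It assumes an operational certificate and private key have already been issued to the device (for example, through the certificate management flow described in this post); the hub hostname, device ID, and file paths are placeholders.

```python
# A minimal device-side sketch of X.509 authentication against IoT Hub,
# assuming the azure-iot-device Python SDK (pip install azure-iot-device)
# and an operational certificate already issued to the device.
# Hostname, device ID, and file paths are placeholders.
from azure.iot.device import IoTHubDeviceClient, Message, X509

x509 = X509(
    cert_file="/path/to/device-cert.pem",  # operational certificate
    key_file="/path/to/device-key.pem",    # private key
)

client = IoTHubDeviceClient.create_from_x509_certificate(
    x509=x509,
    hostname="<your-hub>.azure-devices.net",  # IoT Hub hostname
    device_id="<your-device-id>",
)

client.connect()
client.send_message(Message('{"temperature": 21.5}'))
client.shutdown()
```

The key point of the pattern is that the device presents its X.509 certificate instead of a shared access key; the policy-driven rotation described above is what keeps that certificate current over the device's lifetime.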
Bridging the Digital and Physical Worlds with Azure IoT Hub and Azure IoT Operations

Operational excellence starts with people. Empowering those people with the most up-to-date insights and recommendations requires bridging the gap between the physical and digital worlds to generate the best possible outcomes for real-time decision making. Creating this bridge transforms data into insights, insights into intelligent actions, and actions into real-world results. Digital operations, integrated with AI insights, help make this possible by combining data from connected assets across a variety of physical locations and deployment topologies, and transforming that data into insights and decisions that scale using AI and analytics. At Microsoft Ignite, we're extending this vision with new Azure IoT Hub and Azure IoT Operations capabilities to manage connected assets at scale, unify digital operations, and realize AI-enabled outcomes across your enterprise.

Connected Operations in Action

Azure IoT Hub and Azure IoT Operations form the backbone of connected operations, where every asset, sensor, and system contributes to a continuous loop of intelligence by moving data to Microsoft Fabric for real-time analytics, and for use with AI agents. This pattern applies to nearly every sector of the economy. In manufacturing, these capabilities allow production engineers to predict and avoid equipment failures by analyzing vibration and temperature data at the edge before costly downtime occurs. In energy and utilities, distributed sensors can provide data to control points that help balance load, optimize grid efficiency, and ensure safe operations even in remote areas. In transportation and logistics, connected fleets use edge AI models to detect safety risks in real time, while cloud-based analytics optimize routing and fuel efficiency across entire regions. Across industries, this edge-to-cloud collaboration enables intelligent systems to sense, reason, and act in the physical world with speed, safety, and precision.

From Data to Intelligent Action

Organizations today must capture and act on data from both geographically dispersed and tightly collocated assets. That data needs to be processed close to where it's generated, at the edge, to enable real-time decision-making, reduce latency, and enhance security. At the same time, the cloud remains vital for contextualizing operational data with enterprise systems, training AI models, and managing a consistent identity and security framework across all assets. AI models trained in the cloud can then be deployed back to the edge, where they act on events in real time. Operators can work with AI agents to reason over this data, whether it's structured or unstructured, organized in silos, or contained in free-text fields, to provide results to a mixed team of human and AI operational assets. We have a portfolio of products uniquely designed to make this continuum, from edge to cloud, more intelligent, secure, and repeatable. Together with our partners, we help bridge Operational Technology (OT) with Information Technology (IT) to deliver better business outcomes.

New at Ignite: Accelerating Digital Operations

We're excited to share our latest set of investments at Ignite across our portfolio of services. A few key announcements:

Azure IoT Hub New Features (Preview): Simplifying Secure Connectivity at Scale

Azure IoT Hub empowers organizations to securely and reliably manage connected assets across the globe, providing real-time visibility and control over diverse operations.
With proven scalability, broad device support, and robust management tools, IoT Hub delivers a unified platform for developing and operating IoT solutions. As customers evolve, Azure IoT Hub continues to advance, deepening its integration with the Azure ecosystem and enabling AI-driven, connected operations for the next generation of applications. The next generation of Azure IoT Hub investments makes it easier and more secure than ever to connect and manage distributed assets. At Ignite, we're previewing:

New certificate management capabilities that simplify device onboarding and lifecycle management.
Integration with Azure Device Registry (ADR) that brings all devices into a common control plane, enabling unified identity, security, and policy management.
ADR enhancements that make it easier to register, classify, and monitor assets, paving the way for consistent governance and operational insight across millions of devices.

This deeper Azure integration with ADR standardizes operations, simplifies oversight of edge portfolios including IoT devices, and brings the full power of Azure's management ecosystem to IoT and Digital Operations workloads.

Azure IoT Operations New Features (GA): The Foundation for AI in the Physical World

Azure IoT Operations is more than an edge-to-cloud data plane; it's the foundation for achieving AI in the physical world, enabling intelligent operational systems that can perceive, reason, and act to drive new operational efficiencies. Built on Arc-enabled Kubernetes, Azure IoT Operations unifies operational and business data across distributed environments, eliminating silos and providing a repeatable, scalable foundation for autonomous, adaptive operations. By extending familiar Azure management concepts to physical sites, Azure IoT Operations creates an AI-ready infrastructure that supports autonomous, adaptive operations at scale. Our latest GA release of Azure IoT Operations introduced major enhancements:

Wasm-powered data graphs deliver fast, modular analytics, helping businesses make near-real-time decisions at the edge.
Expanded connectors now include OPC UA, ONVIF, REST/HTTP, Server-Sent Events (SSE), and direct MQTT for richer industrial and IT integrations.
OpenTelemetry (OTel) endpoint support enables seamless telemetry pipelines and observability.
Asset health monitoring provides unprecedented visibility and control.

These capabilities help bridge Information Technology, Operational Technology, and data domains, empowering customers to discover, collect, process, and send data using open standards while laying the groundwork for self-optimizing environments where AI agents and human supervisors collaborate seamlessly.

Integration with Fabric IQ and Digital Twin Builder

To fully unlock the value of connected data, organizations need to contextualize it, linking operational signals to business meaning. Fabric IQ, a new offering announced at Ignite, and Digital Twin Builder in Fabric make this possible, transforming raw telemetry into AI-ready context. This integration allows companies to model complex systems, run simulations, and create intelligent feedback loops across manufacturing, logistics, and energy environments.

Edge AI: Real-Time Intelligence in the Physical World

Azure's AI capabilities for edge environments bring intelligence closer to where it matters most. And, because these services are Arc-enabled, organizations can develop, manage, and scale AI workloads across diverse environments using consistent tooling.
Today, we are announcing updates to two of our key services that enable AI at the edge:

Live Video Analysis features (Public Preview) in Azure AI Video Indexer enabled by Arc deliver real-time agentic video intelligence to improve safety, quality, and operations.
Edge RAG (Retrieval Augmented Generation) Public Preview Refresh enables local generative AI reasoning with contextual awareness, empowering AI agents to act within industrial constraints securely and efficiently.

These innovations accelerate time to insight and help organizations deploy AI where milliseconds matter.

Partner Innovation: Scaling Real Business Value

Last year, we showcased the breadth of Azure IoT Operations' industrial ecosystem. This year, we're celebrating how partners are integrating, co-innovating, and scaling real customer outcomes. Our partners are packaging repeatable, scalable solutions that connect operational data to enterprise systems, enabling AI-driven insights and automation across sites, regions, and industries. At this year's Ignite, we're highlighting some great new partner innovations:

NVIDIA is working with Microsoft to enable factory digital twins using the OpenUSD standard.
Siemens is enabling adaptive production through AI- and digital-twin-powered solutions supported by the integration of Siemens Industrial Edge with Azure IoT Operations.
Litmus Edge integrates with Azure IoT Operations via the Akri framework to automatically discover industrial devices, enable secure data flows, and support Arc-enabled deployment.
Rockwell Automation is streamlining edge-to-cloud integration with its FactoryTalk Optix platform by delivering contextualized, AI-ready data seamlessly within Microsoft Azure IoT Operations architectures.
Sight Machine is driving advanced analytics for quality and efficiency across multi-site operations.

Through initiatives like Akri, Co-Innovate, and Co-Sell Readiness, our ecosystem is developing managed applications, packaged solutions, and marketplace offerings that accelerate deployment and unlock new revenue streams. These collaborations show how Azure IoT Operations is not just a platform, but a growth engine for industrial transformation.

The Path Forward

With these advancements, we're helping organizations bring AI to the physical world by turning data into intelligence and intelligence into action. Customers like Chevron and Husqvarna are scaling beyond initial pilots, expanding their deployments from single-site to multi-site rollouts, unlocking new use cases from predictive maintenance to worker safety, and proving how adaptive cloud architectures deliver measurable impact across global operations. By connecting assets, empowering partners, and delivering open, scalable platform solutions, Microsoft is helping industries achieve resilient, adaptive operations that drive measurable business value. The digital and physical worlds are coming together with solutions that are secure, observable, AI-ready, and built to scale from a single site to global operations. Together, we're creating a smarter, more connected future.

Learn More

Learn more about Azure IoT Hub and Azure IoT Operations here: Azure IoT – Internet of Things Platform | Microsoft Azure
Learn more about the new IoT Hub public preview features here: Azure IoT Hub documentation
Discover partner solutions: Learn how Litmus and Sight Machine are advancing industrial analytics and integration with Azure IoT Operations.
Explore Rockwell Automation and Siemens for more on adaptive cloud architectures and shop floor intelligence.

Going to Ignite?

If you're at Ignite this week, you can learn more about how Microsoft enables industrial transformation at the following sessions:
The New Industrial Frontier
Reshaping Digital Operations with AI from Cloud and Edge
Or come visit us on the show floor at the Azure Arc Expert Meet Up Focus Area in the Cloud and AI Platforms neighborhood.

Azure IoT Operations 2510 Now Generally Available
Introduction

We're thrilled to announce the general availability of Azure IoT Operations 2510, the latest evolution of the adaptive cloud approach for AI in industrial and large-scale commercial IoT. With this release, organizations can unlock new levels of scalability, security, and interoperability, empowering teams to seamlessly connect, manage, and analyze data from edge to cloud.

What is Azure IoT Operations?

Azure IoT Operations is more than an edge-to-cloud data plane; it's the foundation for AI in physical environments, enabling intelligent systems to perceive, reason, and act in the real world. Built on Arc-enabled Kubernetes clusters, Azure IoT Operations unifies operational and business data across distributed environments, eliminating silos and delivering repeatability and scalability. By extending familiar Azure management concepts to physical sites, AIO creates an AI-ready infrastructure that supports autonomous, adaptive operations at scale. This approach bridges information technology (IT), operational technology (OT), and data domains, empowering customers to discover, collect, process, and send data using open standards while laying the groundwork for self-optimizing environments where AI agents and human supervisors collaborate seamlessly. We've put together a quick demo video showcasing the key features of this 2510 release. Watch below to discover how Azure IoT Operations' modular and scalable data services empower IT, OT, and developers.

What's New in Azure IoT Operations 2510?

Management actions: Powerful management actions put you in control of processes and asset configurations, making operations simpler and smarter.
WebAssembly (Wasm) data graphs: Wasm-powered data graphs for advanced edge processing, delivering fast, modular analytics and business logic right where your data lives.
New connectors: Expanded connector options now include OPC UA, ONVIF, Media, REST/HTTP, and Server-Sent Events (SSE), opening the door to richer integrations across diverse industrial and IT systems.
OpenTelemetry (OTel) endpoints: Data flows now support sending data directly to OpenTelemetry collectors, integrating device and system telemetry into your existing observability infrastructure.
Improved observability: Real-time health status for assets gives you unmatched visibility and confidence in your IoT ecosystem.
Reusable connector templates: Streamline connector configuration and deployment across clusters.
Device support in Azure Device Registry: Azure Device Registry (ADR) now treats devices as first-class resources within ADR namespaces, enabling logical isolation and role-based access control at scale.
Automatic device and asset discovery and onboarding: Akri-powered discovery continuously detects devices and industrial assets on the network, then automatically provisions and onboards them (including creating the right connector instances) so telemetry starts flowing with minimal manual setup.
MQTT data persistence: Data can now be persisted to disk, ensuring durability across broker restarts.
X.509 auth in MQTT broker: The broker now supports X.509 authentication backed by Azure Device Registry (see the sketch after this list).
Flexible RBAC: Built-in roles and custom role definitions simplify and secure access management for AIO resources.
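To make the MQTT broker items in this list more concrete, here is a minimal, hedged sketch of a client publishing telemetry over TLS with an X.509 client certificate, written against the paho-mqtt library. The broker host, port, topic, and certificate paths are placeholders that depend on how your broker listener and authentication are configured; this illustrates the general pattern rather than a prescribed Azure IoT Operations setup.

```python
# A minimal sketch of MQTT publishing over TLS with X.509 client
# authentication, as a client of an MQTT broker such as the one in
# Azure IoT Operations might do. Written against the paho-mqtt 1.x API
# (for paho-mqtt 2.x, pass mqtt.CallbackAPIVersion.VERSION1 as the first
# argument to Client). Host, port, topic, and cert paths are placeholders.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client(client_id="line1-sensor-01")
client.tls_set(
    ca_certs="/path/to/broker-ca.pem",    # CA that signed the broker certificate
    certfile="/path/to/client-cert.pem",  # client/device certificate
    keyfile="/path/to/client-key.pem",
)

client.connect("mqtt-broker.example.com", 8883)
client.loop_start()

payload = json.dumps({"temperature": 72.4, "vibration": 0.02})
client.publish("factory/line1/telemetry", payload, qos=1)

client.loop_stop()
client.disconnect()
```

With data persistence enabled on the broker side, messages published this way survive broker restarts before being forwarded onward by data flows.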
Customers and partners

Chevron, through its Facilities and Operations of the Future initiative, deployed Azure IoT Operations with Azure Arc to manage edge-to-cloud workloads across remote oil and gas sites. With a single management plane, the strategy unifies control over thousands of distributed sensors, cameras, robots, and drones. Real-time monitoring and AI-enabled anomaly detection not only enhance operational efficiency but also significantly improve worker safety by reducing routine inspections and enabling remote issue mitigation. This reuse of a global, AI-ready architecture positions Chevron to deliver more reliable, cleaner energy. [microsoft.com]

Husqvarna implemented Azure IoT Operations across its global manufacturing network as part of a comprehensive strategy. This adaptive cloud approach integrates cloud, on-premises, and edge systems, preserves legacy investments, and enables real-time edge analytics. The result: data operationalization is 98% faster, imaging costs were slashed by half, productivity improved, and downtime was reduced. Additionally, AI-driven capabilities like the Factory Companion powered by Azure AI empower technicians with instant, data-informed troubleshooting, shifting maintenance from reactive to predictive across sites. [microsoft.com]

Together, these success stories show how Azure IoT Operations, combined with capabilities like Azure Arc, can empower industrial leaders to advance from siloed operations to unified, intelligent systems that boost efficiency, safety, and innovation. Additionally, this year we are celebrating how our partners are integrating, co-innovating, and scaling real customer outcomes. You can learn more about our partner successes at https://aka.ms/Ignite25/DigitalOperationsBlog.

Learn more at our launch event

Join us at Microsoft Ignite to dive deeper into the latest innovations in Azure IoT Operations 2510. Our sessions will showcase real-world demos plus expert insights on how new capabilities accelerate industrial transformation. Don't miss the chance to connect with product engineers, explore solution blueprints, and see how Azure IoT Operations lays the foundation for building and scaling physical AI.

Get Started

Ready to experience the new capabilities in Azure IoT Operations 2510?
Explore the latest documentation and quickstart guides at https://aka.ms/AzureIoTOperations
Connect with the Azure IoT Tech Community to share feedback and learn from peers.

Solving the Data Challenge for Manufacturers with Sight Machine & Azure IoT Operations
Delivering Industrial AI: From Data to Results

As manufacturers accelerate their digital transformation, the ability to unify and leverage operational data is the difference between incremental improvement and competitive advantage. Today, we're launching a joint solution with Sight Machine, purpose-built to solve the OT data challenge and deliver the full Industrial AI stack in weeks, not months: the Sight Machine and Microsoft Integrated Industrial AI Stack on Azure. This offering is proven in the field, already driving measurable productivity gains for customers in automotive, food, and other sectors with rapid POC cycles and commercial-scale deployments. By integrating Sight Machine's industrial AI platform with Azure IoT Operations and Microsoft Fabric, we standardize and contextualize machine data at scale, enabling analytics, automation, and actionable insights across the enterprise.

What Sets This Solution Apart

Fast Deployment: Get the full Industrial AI stack up and running in weeks, not months.
End-to-End Integration: Sight Machine's industrial AI platform works seamlessly with Azure IoT Operations and Microsoft Fabric, standardizing OT data for enterprise-wide use.
Real Results: Customers in automotive, food, and other industries are already seeing measurable productivity gains and faster decision cycles.
Scalable & Secure: Built on Azure's adaptive cloud and zero-trust security, with SI partners ready to support commercial scale.

Delivering a unified Industrial AI stack

Today marks a pivotal moment for manufacturers: the launch of a fully integrated Industrial AI solution, jointly delivered by Microsoft and Sight Machine. This offering brings together the entire Industrial AI stack spanning cloud, edge, and on-premises, enabling organizations to unlock transformative business value. The integrated solution enables customers to transform data into business value by seamlessly contextualizing and moving data from the edge, using Sight Machine and Azure IoT Operations, to Microsoft Fabric. Within Microsoft Fabric, the data can be further contextualized and enriched to support AI agents and can be extended to visualize 3D digital twins using NVIDIA Omniverse. The integrated solution has the following key components:

Azure IoT Operations: Streams secure, real-time telemetry from industrial assets to the cloud, enabling visibility and control across edge and enterprise environments.
Microsoft Fabric: Provides a single analytics and governance platform, merging IT and OT data for enterprise-wide insights.
Sight Machine Industrial AI Platform: Refines data into "gold-level" quality, fully contextualized and structured for AI, predictive maintenance, and process optimization.
M365 Copilot & Agentic Intelligence: Surfaces actionable insights directly in familiar tools like Teams and Excel, empowering operators and managers to make informed decisions instantly.
NVIDIA Omniverse Integration: Extends capabilities into immersive 3D digital twins and physics-based simulations, enabling manufacturers to visualize live operations and test changes virtually before implementing them.

Customer Impact

Manufacturing is the world's largest sector, generating twice as much data as any other industry. Yet the complexity and fragmentation of OT (Operational Technology) data have long limited the adoption of AI at scale. Sight Machine solves this challenge by integrating with every level of the Azure stack, structuring raw OT data into high-quality, contextualized "gold" data, ready for advanced analytics and AI.
This integrated offering removes barriers to AI adoption. Manufacturers can connect assets, contextualize data, and deliver actionable insights directly to teams, whether in Teams, Excel, or immersive 3D digital twins. The result: higher productivity, smarter operations, and continuous improvement.

Take the Next Step

Ready to accelerate your digital transformation? Explore the Sight Machine + Azure IoT Operations solution in the Marketplace. Start your journey to smarter manufacturing today: Sight Machine on Azure

Microsoft and Rockwell Automation: Transforming Industrial AI Together
Unlocking the Future of Connected Operations

In today's rapidly evolving industrial landscape, manufacturers face mounting pressure to increase agility, optimize operations, and harness data-driven insights across every level of production. The collaboration between Microsoft and Rockwell Automation represents a pivotal step toward achieving these goals. By combining Rockwell's deep expertise in operational technology (OT) with Microsoft's adaptive cloud approach, this partnership bridges the gap between OT and IT, creating a unified, intelligent ecosystem that empowers manufacturers to innovate at scale. Together, we enable seamless connectivity, advanced analytics, and AI-driven optimization across the factory floor from edge and cloud environments.

Connected Operations powered by Microsoft and Rockwell

Rockwell Automation's FactoryTalk Optix and Microsoft's Azure IoT Operations together deliver a powerful foundation for industrial transformation. FactoryTalk Optix provides a modern, flexible visualization platform for real-time monitoring and control of OT systems. FactoryTalk Optix supports numerous industrial protocols for secure interoperability and "smart-object" data modeling to provide analytics-ready data. Paired with Azure IoT Operations, a unified, adaptive cloud solution built on open standards and powered by Azure Arc, manufacturers gain seamless connectivity across the factory floor, enabling edge-to-cloud orchestration. With support for protocols like OPC UA and MQTT, camera and third-party integration through Akri and WASM connectors, and Copilot-driven automation for observability and deployment, this partnership bridges OT and IT to unlock advanced analytics, AI-driven optimization, and predictive maintenance at scale.

A Partnership That Delivers Scalable Innovation

Customers can start utilizing FactoryTalk Optix with Azure IoT Operations as a scalable physical-to-digital foundation for transforming how they manufacture, design, and operate going forward. In partnership with Rockwell, there is a published GitHub sample that demonstrates how FactoryTalk Optix native IIoT connectivity protocols unlock contextualized data from industrial assets into Azure IoT Operations. With the 2510 Azure IoT Operations release, OPC Write capability is now available as well, creating a true read/write path for richer interoperability. The synergy between these technologies is a game-changer for manufacturers, unlocking advanced analytics and AI-driven use cases. This collaboration delivers:

Improved efficiency and reduced downtime through real-time connectivity and predictive maintenance
Scalable edge-to-cloud architecture leveraging OPC UA and MQTT standards for unified OT/IT data
Highly replicable, scalable deployments across hybrid and multicloud environments
Proactive optimization with AI-driven design and analytics
Democratized automation via Copilot capabilities for observability and deployment
Unified IT management and centralized monitoring for streamlined operations
Robust security and reduced integration complexity for faster time-to-value

From the Shop Floor to the Boardroom

By combining Rockwell's industrial expertise with Microsoft's cloud innovation, manufacturers can break down data silos, unify operations, and drive continuous optimization. AI-powered insights become accessible at every level, helping organizations anticipate change, improve safety and efficiency, and maintain a competitive edge in the digital era.
Join Us at Rockwell Automation Fair

Visit the Microsoft booth at Automation Fair to experience end-to-end demonstrations, explore customer stories, and see firsthand how the Rockwell–Microsoft ecosystem accelerates your digital transformation journey.

Join live sessions at the Discovery Theatre:
Tuesday, Nov 18th, 11:15am – 11:45am → The new industrial frontier - Using AI to scale faster, work smarter, and unlock new value
Tuesday, Nov 18th, 2pm – 3pm, and Thursday, Nov 20th, 10:00am – 11:00am → Bringing AI to the Factory Floor
Wednesday, Nov 19th, 1:45pm – 2:15pm → Start with Secure Solutions From Edge to Cloud

Visit us at the Expo at Booth #1931 for demos and conversations to see what we have to offer.

Explore the products
Learn more about Azure IoT Operations → https://azure.microsoft.com/en-us/products/iot-operations
Explore FactoryTalk Optix → https://www.rockwellautomation.com/en-us/products/software/factorytalk/optix.html
Hear more about our integration story at Microsoft Ignite → The new industrial frontier

Bug: Empty bar at bottom of screen on Android 14 Honor 70 / MagicOS 8.0
After updating Microsoft Edge to version 142.0.3595.66 on Android 14 with MagicOS 8.0 on an Honor 70 smartphone, a permanent empty strip appears at the bottom of the screen when using gesture navigation. This issue was not present in previous versions, such as 141.0.3537.99. Disabling the "Edge-to-edge Everywhere" or "Edge-to-edge Tablet" flags does not resolve the problem. It seems to be related to the new edge-to-edge design introduced in the latest update. Please fix this issue or add an option to disable the bottom padding. Thank you.

Running Phi-4 Locally with Microsoft Foundry Local: A Step-by-Step Guide
In our previous post, we explored how Phi-4 represents a new frontier in AI efficiency, delivering performance comparable to models 5x its size while being small enough to run on your laptop. Today, we're taking the next step: getting Phi-4 up and running locally on your machine using Microsoft Foundry Local. Whether you're a developer building AI-powered applications, an educator exploring AI capabilities, or simply curious about running state-of-the-art models without relying on cloud APIs, this guide will walk you through the entire process. Microsoft Foundry Local brings the power of Azure AI Foundry to your local device without requiring an Azure subscription, making local AI development more accessible than ever.

So why would you want to run Phi-4 locally?

Before we dive into the setup, let's quickly recap why running models locally matters:

Privacy and Control: Your data never leaves your machine. This is crucial for sensitive applications in healthcare, finance, or education where data privacy is paramount.
Cost Efficiency: No API costs, no rate limits. Once you have the model downloaded, inference is completely free.
Speed and Reliability: No network latency or dependency on external services. Your AI applications work even when you're offline.
Learning and Experimentation: Full control over model parameters, prompts, and fine-tuning opportunities without restrictions.

With Phi-4's compact size, these benefits are now accessible to anyone with a modern laptop; no expensive GPU required.

What You'll Need

Before we begin, make sure you have:

Operating System: Windows 10/11, macOS (Intel or Apple Silicon), or Linux
RAM: Minimum 16GB (32GB recommended for optimal performance)
Storage: At least 5-10GB of free disk space
Processor: Any modern CPU (GPU optional but provides faster inference)

Note: Phi-4 works remarkably well even on consumer hardware 😀.

Step 1: Installing Microsoft Foundry Local

Microsoft Foundry Local is designed to make running AI models locally as simple as possible. It handles model downloads, manages memory efficiently, provides OpenAI-compatible APIs, and automatically optimizes for your hardware.

For Windows users, open PowerShell or Command Prompt and run:

```
winget install Microsoft.FoundryLocal
```

For macOS users (Apple Silicon), open Terminal and run:

```
brew install microsoft/foundrylocal/foundrylocal
```

Verify the installation by opening your terminal and typing the command below; it should return the Microsoft Foundry Local version, confirming the installation:

```
foundry --version
```

Step 2: Downloading Phi-4-Mini

For this tutorial, we'll use Phi-4-mini, the lightweight 3.8 billion parameter version that's perfect for learning and experimentation.
Open your terminal and run:

```
foundry model run phi-4-mini
```

You should see your download begin, with output similar to the image below.

Available Phi Models on Foundry Local

While we're using phi-4-mini for this guide, Foundry Local offers several Phi model variants and other open-source models optimized for different hardware and use cases:

| Model | Hardware | Type | Size | Best For |
| --- | --- | --- | --- | --- |
| phi-4-mini | GPU | chat-completion | 3.72 GB | Learning, fast responses, resource-constrained environments with GPU |
| phi-4-mini | CPU | chat-completion | 4.80 GB | Learning, fast responses, CPU-only systems |
| phi-4-mini-reasoning | GPU | chat-completion | 3.15 GB | Reasoning tasks with GPU acceleration |
| phi-4-mini-reasoning | CPU | chat-completion | 4.52 GB | Mathematical proofs, logic puzzles with lower resource requirements |
| phi-4 | GPU | chat-completion | 8.37 GB | Maximum reasoning performance, complex tasks with GPU |
| phi-4 | CPU | chat-completion | 10.16 GB | Maximum reasoning performance, CPU-only systems |
| phi-3.5-mini | GPU | chat-completion | 2.16 GB | Most lightweight option with GPU support |
| phi-3.5-mini | CPU | chat-completion | 2.53 GB | Most lightweight option, CPU-optimized |
| phi-3-mini-128k | GPU | chat-completion | 2.13 GB | Extended context (128k tokens), GPU-optimized |
| phi-3-mini-128k | CPU | chat-completion | 2.54 GB | Extended context (128k tokens), CPU-optimized |
| phi-3-mini-4k | GPU | chat-completion | 2.13 GB | Standard context (4k tokens), GPU-optimized |
| phi-3-mini-4k | CPU | chat-completion | 2.53 GB | Standard context (4k tokens), CPU-optimized |

Note: Foundry Local automatically selects the best variant for your hardware. If you have an NVIDIA GPU, it will use the GPU-optimized version. Otherwise, it will use the CPU-optimized version. Run the command below to see the full list of models:

```
foundry model list
```

Step 3: Test It Out

Once the download completes, an interactive session will begin. Let's test Phi-4-mini's capabilities with a few different prompts:

Example 1: Explanation

Phi-4-mini provides a thorough, well-structured explanation! It starts with the basic definition, explains the process in biological systems, and gives real-world examples (plant cells, human blood cells). The response is detailed yet accessible.

Example 2: Mathematical Problem Solving

Excellent step-by-step solution! Phi-4-mini breaks down the problem methodically:
1. Distributes on the left side
2. Isolates the variable terms
3. Simplifies progressively
4. Arrives at the final answer: x = 11
The model shows its work clearly, making it easy to follow the logic and ideal for educational purposes.

Example 3: Code Generation

The model provides a concise Python function using string slicing ([::-1]) - the most Pythonic approach to reversing a string. It includes clear documentation with a docstring explaining the function's purpose, provides example usage demonstrating the output, and even explains how the slicing notation works under the hood. The response shows that the model understands not just how to write the code, but why this approach is preferred, noting that the [::-1] slice notation means "start at the end of the string and end at position 0, move with the step -1, negative one, which means one step backwards." This showcases the model's ability to generate production-ready code with proper documentation while being educational about Python idioms.

To exit the interactive session, type `/bye`.
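You can also reach the same model outside the interactive session: Foundry Local exposes a local, OpenAI-compatible REST endpoint. The following is a minimal, non-streaming sketch that uses the foundry-local-sdk package (installed as part of the next step) to resolve the endpoint and model ID; the response shape assumed here follows the OpenAI chat-completions format used throughout this guide.

```python
# A minimal, non-streaming call to the local OpenAI-compatible endpoint.
# Assumes the foundry-local-sdk and requests packages are installed
# (pip install foundry-local-sdk requests) and Foundry Local is set up.
import requests
from foundry_local import FoundryLocalManager

alias = "phi-4-mini"
manager = FoundryLocalManager(alias)       # starts the service and loads the model
model_info = manager.get_model_info(alias)

response = requests.post(
    f"{manager.endpoint}/chat/completions",
    json={
        "model": model_info.id,
        "messages": [{"role": "user", "content": "In one sentence, what is osmosis?"}],
    },
    timeout=180,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

The larger example in the next step builds on this same pattern, adding streaming output and a web-search tool in front of the prompt.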
Step 4: Extending Phi-4 with Real-Time Tools

Understanding Phi-4's Knowledge Cutoff

Like all language models, Phi-4 has a knowledge cutoff date from its training data (typically several months old). This means it won't know about very recent events, current prices, or breaking news. For example, if you ask "Who won the 2024 NBA championship?" it might not have the answer. The good thing is, there's a powerful workaround. While Phi-4 is incredibly capable, connecting it to external tools like web search, databases, or APIs transforms it from a static knowledge base into a dynamic reasoning engine. This is where Microsoft Foundry's REST API comes in. Microsoft Foundry provides a simple API that lets you integrate Phi-4 into Python applications and connect it to real-time data sources. Here's a practical example: building a web-enhanced AI assistant.

Web-Enhanced AI Assistant

This simple application combines Phi-4's reasoning with real-time web search, allowing it to answer current questions accurately.

Prerequisites:

```
pip install foundry-local-sdk requests ddgs
```

Create phi4_web_assistant.py:

```python
import requests
from foundry_local import FoundryLocalManager
from ddgs import DDGS
import json


def search_web(query):
    """Search the web and return top results"""
    try:
        results = list(DDGS().text(query, max_results=3))
        if not results:
            return "No search results found."
        search_summary = "\n\n".join([
            f"[Source {i+1}] {r['title']}\n{r['body'][:500]}"
            for i, r in enumerate(results)
        ])
        return search_summary
    except Exception as e:
        return f"Search failed: {e}"


def ask_phi4(endpoint, model_id, prompt):
    """Send a prompt to Phi-4 and stream response"""
    response = requests.post(
        f"{endpoint}/chat/completions",
        json={
            "model": model_id,
            "messages": [{"role": "user", "content": prompt}],
            "stream": True
        },
        stream=True,
        timeout=180
    )
    full_response = ""
    for line in response.iter_lines():
        if line:
            line_text = line.decode('utf-8')
            if line_text.startswith('data: '):
                line_text = line_text[6:]  # Remove 'data: ' prefix
            if line_text.strip() == '[DONE]':
                break
            try:
                data = json.loads(line_text)
                if 'choices' in data and len(data['choices']) > 0:
                    delta = data['choices'][0].get('delta', {})
                    if 'content' in delta:
                        chunk = delta['content']
                        print(chunk, end="", flush=True)
                        full_response += chunk
            except json.JSONDecodeError:
                continue
    print()
    return full_response


def web_enhanced_query(question):
    """Combine web search with Phi-4 reasoning"""
    # By using an alias, the most suitable model will be downloaded
    # to your device automatically
    alias = "phi-4-mini"

    # Create a FoundryLocalManager instance. This will start the Foundry
    # Local service if it is not already running and load the specified model.
    manager = FoundryLocalManager(alias)
    model_info = manager.get_model_info(alias)

    print("🔍 Searching the web...\n")
    search_results = search_web(question)

    prompt = f"""Here are recent search results:

{search_results}

Question: {question}

Using only the information above, give a clear answer with specific details."""

    print("🤖 Phi-4 Answer:\n")
    return ask_phi4(manager.endpoint, model_info.id, prompt)


if __name__ == "__main__":
    # Try different questions
    question = "Who won the 2024 NBA championship?"
    # question = "What is the latest iPhone model released in 2024?"
    # question = "What is the current price of Bitcoin?"

    print(f"Question: {question}\n")
    print("=" * 60 + "\n")
    web_enhanced_query(question)
    print("\n" + "=" * 60)
```

Run it:

```
python phi4_web_assistant.py
```

What Makes This Powerful

By connecting Phi-4 to external tools, you create an intelligent system that:

Accesses Real-Time Information: Get news, weather, sports scores, and breaking developments.
Verifies Facts: Cross-reference information with multiple sources.
Extends Capabilities: Connect to databases, APIs, file systems, or any other tool.
Enables Complex Applications: Build research assistants, customer support bots, educational tutors, and personal assistants.

This same pattern can be applied to connect Phi-4 to:

Databases: Query your company's internal data
APIs: Weather services, stock prices, translation services
File Systems: Analyze documents and spreadsheets
IoT Devices: Control smart home systems

The possibilities are endless when you combine local AI reasoning with real-world data access.

Troubleshooting Common Issues

Service not running: Make sure Foundry Local is properly installed and the service is running; foundry --version will confirm the installation.
Model downloads slowly: Check your internet connection and ensure you have enough disk space (5-10GB per model).
Out of memory: Close other applications or try using a smaller model variant like phi-3.5-mini instead of the full phi-4.
Connection issues: Verify that no other services are using the same ports. Foundry Local typically runs on http://localhost:5272.
Model not found: Run foundry model list to see available models, then use foundry model run <model-name> to download and run a specific model.

Your Next Steps with Foundry Local

Congratulations! You now have Phi-4 running locally through Microsoft Foundry Local and understand how to extend it with external tools like web search. This combination of local AI reasoning with real-time data access opens up countless possibilities for building intelligent applications.

Coming in Future Posts

In the coming weeks, we'll explore advanced topics using Hugging Face:
Fine-tuning Phi models on your own data for domain-specific applications
Phi-4-multimodal: Analyze images, process audio, and combine multiple data types
Advanced deployment patterns: RAG systems and multi-agent orchestration

Resources to Explore

EdgeAI for Beginners Course: Comprehensive 36-45 hour course covering Edge AI fundamentals, optimization, and production deployment
Phi-4 Technical Report: Deep dive into architecture and benchmarks
Phi Cookbook on GitHub: Practical examples and recipes
Foundry Local Documentation: Complete technical documentation and API reference
Module 08: Foundry Local Toolkit: 10 comprehensive samples including RAG applications and multi-agent systems

Keep experimenting with Foundry Local, and stay tuned as we unlock the full potential of Edge AI! What will you build with Phi-4? Share your ideas and projects in the comments below!

Ability to Manage or Delete single autocomplete form data in Edge browser (new for 2025)
Someone has asked exactly this question before, in 2023, asking if the feature could be added, when maybe the feature already existed and the OP didn't know. Three months after the post, the OP wrote that they'd found the method and marked it as solved. It involved a few simple keystrokes. See here: https://techcommunity.microsoft.com/discussions/edgeinsiderdiscussions/ability-to-manager-or-delete-single-autocomplete-form-data-in-edge-browser-/3984284 That old post has attracted a few extra comments in the time since it was marked as solved, because the method described has been removed in more recent builds of Edge. I find the removal of this method frustrating, and I believe it invites security issues. For example, more than once I've accidentally put a password in a username box. Now when I visit the site and try to log in, it shows my password in plain text on screen. Users have to search through autocomplete data to correct it, instead of being able to resolve the issue with one click (as I believe it is in Chrome) or with the old Edge method of arrowing down to the suggestion and pressing Delete while holding Shift. Could this feature please be brought back in a future Edge build? I think it is a valuable component of old Edge, and I miss its ease of use very much; the sooner the better.

Stolen session token from Edge
We can steal the session token from Edge using tools like Burp Suite or Fiddler to intercept proxy traffic on the mobile phone, even when Edge is MAM-protected by Intune. This makes the Edge browser unsafe to use for enterprise applications on a personal mobile device. Recently I discovered the token protection feature for Conditional Access policies (https://learn.microsoft.com/en-us/entra/identity/conditional-access/concept-token-protection). However, it is only available for Windows. I am wondering if anyone knows when it will become available for mobile on the Entra roadmap. Also, if you know of any Edge configuration I could use to stop token theft, please let me know! Thank you everyone.

PWA shortcuts don't keep associated with window - Linux
I recently upgraded to version 140.0.3485.14 (Official build) beta (64-bit) and noticed that my PWAs, even though they open correctly in an independent window, don't show the "running window counter" on the taskbar. So, for example, if I open Outlook, the window opens and loads the app, but if I click the icon again, it opens another window, which is a little annoying. I tried removing the PWA from Edge, rebooting, and installing it again, but it didn't work. Because the only change I made was updating Edge, I think it is related to this build. (Too lazy to go back or install the stable release.)