Artificial Intelligence: Cloud and Edge Implementations is a full-stack AI course I teach at the University of Oxford. Because the course covers MLOps, Edge, and AI, its scope is vast. In addition, the course takes an engineering-led approach. This blog explains the rationale and implications of taking an engineering-led approach to teaching AI for edge devices, and how we use Microsoft LEARN to achieve our course objectives.
Mark Weiser first proposed the ubiquitous computing vision in the 1990s. Fast forward twenty years, and we have still not quite achieved the vision of any device connecting to any other device to deliver ubiquitous, seamless computing. But we have achieved a lot within the context of the cloud (and for enterprises). By that, I mean we can connect IoT devices and drive processes with AI algorithms by using a cloud platform framework. Furthermore, we can train AI models in the cloud and run the trained models on edge devices by deploying them in containers. Thus, we have the foundations for building complex engineering systems spanning the cloud and the edge, powered by intelligence (machine learning / deep learning models).
Enticing as this AI Edge Engineering vision sounds, teaching this subject in the context of a cloud platform like Azure is not easy.
As a technology, the cloud is evolving rapidly.
We encounter too many cognitive dependencies and risk cognitive overload for the students. In other words, concepts depend on other concepts, and learners risk being overloaded by many new ideas in a short period. Hence, we encapsulated the learning as an AI Edge Engineer learning path.
This learning path includes everything that students need to learn to master the domain.
So, we could subdivide these modules as follows:
- Modules covering fundamentals
- Modules covering deployment using Azure Functions or containers
- Modules covering image recognition and object detection
- Modules covering Azure Sphere (secure IoT devices)
Implications of teaching AI Edge Engineering
So, how does the AI Edge Engineer learning path help in overcoming learner challenges?
The interplay between AI, cloud, and edge is a rapidly evolving domain. Currently, many IoT solutions are based on basic telemetry. The telemetry function captures data from edge devices and stores it in a data store. But to gain true engineering capabilities, you have to extend beyond basic telemetry.
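To make the distinction concrete, a basic telemetry function does little more than read a sensor and forward the values to the cloud. The sketch below is not part of the learning path; it uses the azure-iot-device Python SDK, and the connection string, sensor readings, and send interval are illustrative assumptions.

```python
# A minimal telemetry sketch using the azure-iot-device SDK.
# The connection string and sensor values are placeholders.
import json
import random
import time

from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "<your IoT Hub device connection string>"


def read_sensor() -> dict:
    """Stand-in for a real sensor read on the edge device."""
    return {
        "temperature": 20 + random.random() * 5,
        "humidity": 55 + random.random() * 10,
    }


def main() -> None:
    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    client.connect()
    try:
        # Send a small batch of readings; IoT Hub can then route them
        # to a data store such as Azure Storage or a database.
        for _ in range(10):
            payload = json.dumps(read_sensor())
            client.send_message(Message(payload))
            time.sleep(5)
    finally:
        client.disconnect()


if __name__ == "__main__":
    main()
```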
We aim to model real-world problems through machine learning and deep learning algorithms and implement those models, through AI and the cloud, on edge devices. Containers are central to this approach. When deployed to edge devices, containers can encapsulate deployment environments for a range of diverse hardware. CI/CD (continuous integration / continuous deployment) is a logical extension of deploying containers on edge devices.
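To illustrate why containers help, the inference code packaged into a container image can stay the same across diverse edge hardware; only the base image or runtime changes. The hypothetical scoring module below uses onnxruntime, which is an assumption on my part rather than something the learning path prescribes, and the model path, input shape, and random test frame are placeholders.

```python
# Hypothetical scoring module that could run inside a container on an
# edge device. Model path and input shape are illustrative assumptions.
import numpy as np
import onnxruntime as ort

MODEL_PATH = "model.onnx"  # model trained in the cloud, packaged into the image


def main() -> None:
    # Load the trained model shipped with the container image.
    session = ort.InferenceSession(MODEL_PATH)
    input_name = session.get_inputs()[0].name

    # Stand-in for a frame captured from a camera attached to the device.
    frame = np.random.rand(1, 3, 224, 224).astype(np.float32)

    # Run inference locally on the edge device.
    outputs = session.run(None, {input_name: frame})
    print("Predicted class:", int(np.argmax(outputs[0])))


if __name__ == "__main__":
    main()
```

Rebuilding and pushing such an image from a pipeline, and then updating the deployment on the device, is where CI/CD naturally comes in.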
So, as you can see from the modules listed above, learners are offered a logical path of progressively increasing complexity, which provides a smooth learning curve.
We should also note that some related concepts are not directly covered by the idea of “AI Edge Engineering." If we take an interdisciplinary approach to learning, a learner may need to draw upon a much wider repertoire of concepts, especially as cloud platforms themselves evolve rapidly. Here are some examples:
In all these cases, the LEARN platform provides a scaffolding/framework for learning complex technologies and the interconnections between them, both at a peer level and a hierarchical level, thereby making it easier to learn complex subjects.
I welcome your comments and feedback, especially from educators.
Image source: Introduction to Azure IoT (part of the AI Edge Engineer learning path), showing how a process is depicted in LEARN modules.