Semantic Kernel planners are an easy way to build agents that do useful things. This blog describes how to integrate a BBC Micro:bit (an educational single-board computing device used in many UK schools to teach programming) with Azure OpenAI using Semantic Kernel. This makes for a simple and engaging set of labs on how to control a device using Azure OpenAI.
Background
The BBC Micro:bit has proved very popular in UK schools as a cheap and simple device that can be used to demonstrate basic coding, either using a building-block approach or by writing Python code in an editor. There are browser-based tools that allow editing of the code in either mode.
The BBC Micro:bit is connected via USB and the browser is then able to upload the code to it, making for a very interactive experience.
What if the power of Azure OpenAI prompts could be combined with the simplicity of the BBC Micro:bit to allow prompts to directly program the BBC Micro:bit?
This is what this blog explains.
Semantic Kernel
This blog assumes the reader has a basic understanding of Azure OpenAI and that there is an API that allows you to send requests to, and get responses from, a model provisioned in Azure OpenAI. If not, please look here.
A little background on Semantic Kernel may be useful. It is an open-source framework for building AI agents. Semantic Kernel uses plugins as a mechanism to perform specific tasks.
A plugin is a specific piece of code that is tagged with metadata so that Semantic Kernel knows whether to use the plugin and, if so, how to call it. For those conversant with OpenAI function calling, Semantic Kernel automates the process of registering a function with Azure OpenAI and calling that function when the model decides that the user's query is best answered by calling it. This mechanism is extremely powerful as it opens up huge possibilities for an Azure OpenAI model to integrate with any number of things.
In this example, the BBC Micro:bit will be a plugin to Semantic Kernel.
The Demo Semantic Kernel app
In order to keep things really simple, this demonstration is a console application that runs a Semantic Kernel planner with a plugin for the BBC Micro:bit.
The code for this is really simple.
In the first step below, create the Semantic Kernel with your model, endpoint and key. You will need an instance of Azure OpenAI with a model like gpt-4o provisioned.
#pragma warning disable SKEXP0050
#pragma warning disable SKEXP0060
using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.DependencyInjection;
using System;
// Create the kernel
var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(
    "YOUR_MODEL",
    "YOUR_ENDPOINT",
    "YOUR_KEY");
Next, register the plugins:
builder.Plugins.AddFromType<LightsPlugin>();
builder.Plugins.AddFromType<MicrobitPlugin>();
In the sample there are two plugins: one for the BBC Micro:bit and another that emulates the grid of lights on the BBC Micro:bit, so you can get Semantic Kernel to control that if you don't have a BBC Micro:bit connected.
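For illustration, a minimal emulated light-matrix plugin could look something like the sketch below (the sample's actual LightsPlugin may well differ); it simply keeps the 5x5 brightness values in memory:

public class LightsPlugin
{
    // In-memory 5x5 grid of brightness values (0-9) standing in for the real LEDs
    private readonly int[,] grid = new int[5, 5];

    [KernelFunction("set_emulated_light_brightness")]
    [Description("sets the brightness of an emulated pixel by its row and column ID.")]
    public int SetLightBrightness(int rowid, int columnid, int brightness)
    {
        grid[rowid, columnid] = brightness;
        Console.WriteLine($"Pixel ({rowid},{columnid}) set to brightness {brightness}");
        return brightness;
    }
}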
We will, however, concentrate on the BBC Micro:bit in this blog.
Next, we enable automatic function calling and create a chat history object that we initialise with some instructions to explain to the model what it can do:
Kernel kernel = builder.Build();

// Retrieve the chat completion service from the kernel
IChatCompletionService chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();

// Enable automatic function calling
OpenAIPromptExecutionSettings openAIPromptExecutionSettings = new()
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

// Create the chat history with instructions describing the light matrix
ChatHistory history = new ChatHistory("""
    You have a matrix of 5 rows and columns each of which can have a brightness of 0 to 9.
    Rows make up the horizontal axis. The top row has coordinates 0,0 to 4,0.
    Columns represent the vertical axis. The left most column is 0,0 to 0,4 and the right column is 4,0 to 4,4.
    """);
This is all the initialisation needed and so the final step is to run the prompt loop where you can prompt the application and it will respond.
string? userInput;
do {
    // Collect user input
    Console.Write("User > ");
    userInput = Console.ReadLine();

    if (userInput is not null)
    {
        // Add user input
        history.AddUserMessage(userInput);

        // Get the response from the AI with automatic function calling
        var result = await chatCompletionService.GetChatMessageContentAsync(
            history,
            executionSettings: openAIPromptExecutionSettings,
            kernel: kernel);

        // Print the results
        Console.WriteLine("Assistant > " + result);

        // Add the message from the agent to the chat history
        history.AddMessage(result.Role, result.Content ?? string.Empty);
    }
} while (userInput is not null);
As can be seen above, this is just a loop that requests input, passes it to the chatCompletionService, gets the response, displays it and then adds it to the history.
In this manner the agent is conversational and so will take into account previous prompts when answering the current one.
In this case, the conversation history is not persisted.
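If you did want to keep the conversation between runs, one simple approach (not part of the sample, and assuming using System.Linq and using System.IO are added) is to serialise the messages to a file when the loop exits, with the mirror image reload at start-up:

// Minimal sketch: write each message's role and text to a JSON file (the file name is arbitrary)
var messages = history.Select(m => new { Role = m.Role.ToString(), Text = m.Content });
File.WriteAllText("history.json", System.Text.Json.JsonSerializer.Serialize(messages));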
BBC Micro:bit plugin
The BBC Micro:bit has a USB port and, when plugged into a PC, presents itself as a serial port device. The port can be found in Windows Device Manager, but it is often COM3.
The plugin uses a feature of the BBC Micro:bit where commands can be sent interactively. This is referred to as a Read-Evaluate-Print Loop (REPL).
The plugin opens a connection to a COM port on initialisation, so this will error if the BBC Micro:bit is not plugged in or presents itself on a different COM port. The code that may need amending is in MicrobitPlugin.cs:
SerialPort serialPort;

public MicrobitPlugin()
{
    serialPort = new SerialPort("COM3", 115200);
    serialPort.Open();
    // send Ctrl-C to stop any running program
    serialPort.Write(new byte[] { 0x03 }, 0, 1);
}
The above opens the serial port for later use so that commands may be sent.
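If your device enumerates on a different port, one small tweak (not in the sample; the environment variable name here is just an example) is to make the port name configurable instead of editing the code:

// Fall back to COM3 if no MICROBIT_PORT environment variable is set
string portName = Environment.GetEnvironmentVariable("MICROBIT_PORT") ?? "COM3";
serialPort = new SerialPort(portName, 115200);
serialPort.Open();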
As explained in the Semantic Kernel description, metadata is attached to a plugin function in order to indicate to Semantic Kernel, and in turn to Azure OpenAI, what that function does.
The simplest function sets a specific pixel on the BBC Micro:bit display:
[KernelFunction("set_microbit_light_brightness")]
[Description("sets the brightness of a microbit pixel by its row and column ID.")]
public async Task<int> SetLightBrightness(
Kernel kernel,
int rowid,
int columnid,
int brightness
)
{
SendCommand(serialPort, $"display.set_pixel({rowid},{columnid},{brightness})");
return brightness;
}
This then sends the command to the BBC Micro:bit using the serialPort object that was previously opened:
private void SendCommand(SerialPort serialPort, string command)
{
    // send Ctrl-C to stop any running program
    serialPort.Write(new byte[] { 0x03 }, 0, 1);

    // Wait for the micro:bit to stop any running program
    System.Threading.Thread.Sleep(50); // Adjust the delay as needed

    // Send the command to the micro:bit
    serialPort.Write(command);
    serialPort.WriteLine(" ");

    // Send a carriage return to execute the command
    serialPort.Write(new byte[] { 0x0d }, 0, 1);

    // Wait for the micro:bit to finish executing the command
    System.Threading.Thread.Sleep(50); // Adjust the delay as needed

    // Read the response from the micro:bit and print it to the console
    var response = serialPort.ReadExisting();
    Console.WriteLine(response);
}
As can be seen above, there is metadata which describes the purpose of the function, and the parameters are inspected too. For the light, these are the row, column and brightness. This then sends a specific display.set_pixel command to the BBC Micro:bit.
The number of rows and columns is not fixed here, nor are the brightness levels. These are explained to the model in the original prompt (chat history) before the main Semantic Kernel planner loop is run.
Some sample prompts
Once the program is built and run, you can try any number of prompts:
- set the top left light to brightness 4
- set the last light to a lower brightness
- turn off all lights
- draw a circle
- make the circle bigger
- Draw the letter "W"
Using the application, you can see that even with this basic plugin function, the planner can do more complicated things than expected. It has history, so it can refer to a previous prompt, but it can also do much more interesting things like draw shapes or letters that require multiple calls to the plugin function!
Extending the plugin
The REPL interface to the BBC Micro:bit can accept all sorts of other commands besides display.set_pixel, so it makes sense to expand the plugin to give it broader capability:
[KernelFunction("set_microbit_command")]
[Description("send a command to the microbit using REPL.")]
public void SendGenericCommand(
Kernel kernel,
string command
)
{
SendCommand(serialPort, command);
}
The above plugin function allows any arbitrary command to be sent to the BBC Micro:bit, allowing a wider set of use cases or more compact commands.
Some more prompts
As this extra plugin function is not limited to the lights, you can ask it to:
- make a sound
- ask the current temperature
- ask it if one of the buttons is pressed
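For query-style prompts like the temperature or button state, it can also help to return the REPL output to the model (so the assistant can use the value in its answer) rather than only printing it to the console. A possible variant of the generic function (not part of the sample; the function name here is made up) could look like this:

[KernelFunction("query_microbit")]
[Description("sends a REPL command to the microbit and returns the text the microbit prints back.")]
public string QueryMicrobit(
    Kernel kernel,
    string command
)
{
    // Stop any running program, send the command, then hand the raw REPL output back to the model
    serialPort.Write(new byte[] { 0x03 }, 0, 1);
    System.Threading.Thread.Sleep(50);
    serialPort.Write(command);
    serialPort.Write(new byte[] { 0x0d }, 0, 1);
    System.Threading.Thread.Sleep(50);
    return serialPort.ReadExisting();
}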
For the lights too, there are extra possibilities:
- display a word with each letter in sequence
- display a scrolling word or sentence
Summary
What can be seen in this demonstration is how simple it is for Semantic Kernel to control an external device using a plugin and a simple planning loop.
Many of the models on Azure OpenAI have enough training data to natively understand what REPL commands can be sent to the BBC Micro:bit to accomplish a whole task. The main limiting factor seen is the ability of the REPL interface to accept complex commands from the planner. This is largely to do with Python as a language and how control flow is expressed as indentation in the code (rather than using some form of brackets as in C#).
Further reading
There are some labs that may be used to explore this further. In addition, for those who do not have access to a BBC Micro:bit, there are earlier labs that are essentially the same but with an array of virtual lights which represent the light matrix on the BBC Micro:bit.
Have a play 🙂