This post is co-authored by Abe Omorogbe, Program Manager, Azure Machine Learning.
The Azure Machine Learning (Azure ML) team is excited to announce the release of an enhanced developer experience for ‘compute instance’ and ‘notebooks’ users, through a VS Code integration in the Azure ML Studio! It is now easier than ever to work directly on your Azure ML compute instances from within Visual Studio Code, with full access to a remote terminal, your favorite VS Code extensions, the Git source control UI, and a debugger.
Bringing VS Code to Azure Machine Learning
The Azure Machine Learning and VS Code teams have been collaborating over the past few months to better understand user workflows for authoring, editing, and managing code files. The demand for VS Code became clear after speaking with a wide variety of users tasked with managing larger projects and operationalizing their models. Users were eager to continue working on their Azure ML compute resources and retain the development context initially defined through the Studio UI.
The first step toward a better editing experience was to evaluate the tools users were already relying on in VS Code. Users were familiar with extensions such as Remote-SSH and Jupyter: the former used to connect to their remote compute, the latter to author notebook files. The advantage of using Jupyter, JupyterLab, or Azure ML notebooks was that they could be used with all compute instance types without requiring any additional configuration or networking changes.
To enable users to work against their compute instances without requiring SSH or additional networking changes, the Azure ML and VS Code teams built a Notebook-specific compute instance connect experience. The Azure ML extension was responsible for facilitating the connection between VS Code's Jupyter support and the compute instance, taking care of authenticating on the user's behalf. Within a month or so of releasing this capability, it was clear that users were excited about connectivity without SSH and about being able to work directly within VS Code. However, working in the editor implied expectations around being able to use other VS Code features such as the remote terminal, debugger, and language server. Users expressed their frustration with being limited to a single Notebook file, being unable to view files on the remote server, and not being able to use their preferred extensions.
VS Code Integration: Features
Learning from prior releases and talking to users led the Azure ML and VS Code teams to build a complete VS Code experience for compute instances without using SSH. Getting started with this experience is straightforward – entry points have been integrated within the Azure ML Studio in both the Compute Instance and Notebooks tabs.
Through this VS Code integration customers will now have access to the following features and benefits:
- Full integration with Azure ML file share and notebooks: All file operations in VS Code are fully synced with the Azure ML Studio. For example, if a user drags and drops files from their local machine into VS Code connected to Azure ML, all files will be synced and appear in the Azure ML Studio.
- Git UI Experiences: Fully manage Git repos in Azure ML with the rich VS Code source control UI.
- Notebook Editor: Seamlessly click out from Azure ML notebooks and continue working on notebooks in the native VS Code notebook editor.
- Debugging: Use the native debugging in VS Code to debug any training script before submitting it to an Azure ML cluster for batch training.
- VS Code Terminal: Work in the VS Code terminal that is fully connected to the compute instance.
- VS Code Extension Support: All VS Code extensions are fully supported in VS Code connected to the compute instance.
- Enterprise Support: Work with VS Code securely in private endpoints without additional, complicated SSH and networking configuration. AAD credentials and RBAC are used to establish a secure connection to VNET/private link enabled Azure ML workspaces.
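The debugging scenario above boils down to one habit: keep the training script runnable (and breakpoint-able) on its own before handing it to a cluster. Below is a minimal, hypothetical `train.py` along those lines – it has no Azure ML dependency, so you can set breakpoints and step through it in the VS Code debugger on the compute instance:

```python
# train.py -- a hypothetical, self-contained training script used to
# illustrate the debug-then-submit workflow.  It depends only on the
# standard library, so it can be run and debugged anywhere.
import argparse
import random


def make_data(n=200, seed=0):
    """Synthetic 1-D regression data: y = 3x + 1 + noise."""
    rng = random.Random(seed)
    xs = [rng.uniform(-1, 1) for _ in range(n)]
    ys = [3.0 * x + 1.0 + rng.gauss(0, 0.1) for x in xs]
    return xs, ys


def train(xs, ys, lr=0.1, epochs=100):
    """Fit y = w*x + b by full-batch gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b


def mse(w, b, xs, ys):
    return sum((w * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--epochs", type=int, default=100)
    args = parser.parse_args()
    xs, ys = make_data()
    w, b = train(xs, ys, epochs=args.epochs)
    print(f"w={w:.2f} b={b:.2f} mse={mse(w, b, xs, ys):.4f}")
```

Once the script behaves under the debugger, the same file can be submitted unchanged to an Azure ML cluster for batch training, for example through the SDK's `ScriptRunConfig`.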
VS Code Integration: How it Works
Clicking out to VS Code will launch a desktop VS Code session which initiates a secondary remote connection to the target compute. Within the remote connection window, the Azure ML extension creates a WebSocket connection between your local VS Code client and the remote compute instance.
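The tunnelling idea can be illustrated with a deliberately simplified sketch – plain TCP rather than WebSockets, and none of the extension's actual authentication or protocol handling – in which a local listener relays bytes between the "editor" side and the "remote" side:

```python
# Illustrative only: a minimal local TCP relay, NOT the Azure ML
# extension's implementation.  A local listener accepts the editor's
# connection and shuttles bytes to and from the remote endpoint.
import socket
import threading


def pipe(src, dst):
    """Copy bytes from src to dst until src closes, then close dst."""
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    except OSError:
        pass
    finally:
        dst.close()


def relay(listen_port, target_host, target_port):
    """Accept one local connection and relay it to the target endpoint."""
    server = socket.socket()
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", listen_port))
    server.listen(1)
    client, _ = server.accept()                                   # "editor" side
    remote = socket.create_connection((target_host, target_port))  # "remote" side
    # One thread per direction; each closes its peer on EOF.
    threading.Thread(target=pipe, args=(client, remote), daemon=True).start()
    pipe(remote, client)
```

The real connection additionally handles AAD authentication and multiplexes traffic over a single WebSocket, but the shape is the same: the local editor talks to a local endpoint, and the extension forwards everything to the compute instance.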
The connected window now provides you with:
- Access to the mounted file share, with consistent syncing between what is seen in Jupyter and the Azure ML Notebooks experience.
- Access to the machine’s local SSD in case you would like to clone and manage repos outside of the shared file share.
- The ability to manage repositories through the source control UI.
- The ability to create, interact with, and debug running applications.
- A remote terminal for executing commands directly against the remote compute.
Below is a high-level overview of the remote connection.
This new connect capability and direct integration in the Azure ML Studio create a better-together experience between Azure ML and VS Code! When working on a machine learning project, you can start with a notebook in the Azure ML Studio for early data preparation and exploratory work. When you're ready to flesh out the rest of your project, work with multiple file types, and use more advanced editing capabilities and VS Code extensions, you can seamlessly transition to working in VS Code. The retained context and shared file share let you move bi-directionally (from notebooks to VS Code and back) without any additional work.
Getting Started
You can initiate the connection to VS Code directly from the Studio UI through either the Compute Instance or Notebook pages. Alternatively, you can start from within VS Code itself: with the Azure Machine Learning extension installed, find the compute instance in the tree view and right-click it to connect, or invoke the command “Azure ML: Connect to compute instance” and follow the prompts.
For more details on how you can get started with this experience, please take a look at our public documentation.
Both the Azure ML and VS Code extension teams are always looking for feedback on our current experiences and on what we should work on next. If there is anything you would like us to prioritize, please suggest it via our GitHub repo; if you would like to provide more general feedback, please fill out our survey.