Azure Machine Learning's new integration with H2O.ai
Published Jun 21 2022 09:10 AM




In today’s complex cloud environments, it is common for companies to use technologies from many sources. Cloud computing platforms are often the foundation of an enterprise IT landscape, supplemented by independent software vendors and products that meet business and industry vertical needs. With the integration between H2O.ai and Azure Machine Learning, we enable customers to democratize model creation with SaaS-like tools and pick the deployment technologies that align with their corporate requirements. 


Using H2O.ai's AzureML integration, models built in H2O.ai products now appear as deployed models within an AzureML workspace. This means that any user or product can invoke an inferencing endpoint for the models hosted in AzureML with a simple API call. This makes organizational model adoption easier, allowing users to access models across their enterprise IT landscape without changing existing deployment strategies. 
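As a minimal sketch of what that API call can look like (the endpoint URI, key, and input fields below are placeholders, not values from this article, and the exact payload schema depends on the deployed model's scoring script), an AzureML online endpoint is typically scored with a POST of JSON over HTTPS:

```python
import json
import urllib.request


def build_scoring_request(uri, api_key, records):
    """Build an HTTP request that scores `records` against an AzureML
    online endpoint. The payload shape here is a generic JSON body;
    the real schema depends on the model's scoring script."""
    body = json.dumps({"data": records}).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",  # endpoint key or AAD token
    }
    return urllib.request.Request(uri, data=body, headers=headers)


def score(uri, api_key, records):
    """Send the scoring request and return the decoded JSON predictions."""
    req = build_scoring_request(uri, api_key, records)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Example with placeholder values (requires a live endpoint):
# preds = score("https://<endpoint>.<region>.inference.ml.azure.com/score",
#               "<endpoint-key>",
#               [{"age": 42, "income": 55000}])
```

Because the endpoint is plain HTTPS, the same call works from any language or product in the IT landscape, not just Python.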


“H2O's integration with Azure provides an innovative approach to deliver models as an Azure service, resulting in higher model adoption across Momentum Financial Services Group. As an example, 80% of credit checks are now reviewed using H2O models and more than 6K requests a day”.  

- Nima Norouzi, Director - Advanced Analytics 


At a high level, many organizations use databases, services, and repositories on Azure. Those assets can be used to build a machine learning model with H2O.ai products such as Driverless AI or H2O-3. As you may expect, Driverless AI has some unique capabilities to create new features as part of the Automated Machine Learning workflow and generate a model that is highly accurate and explainable. 


The resulting artifact from the automated training process is the model, which is used in the scoring (inference) stage of the workflow. This is where our integration takes root: organizations can now host that model for inference within Azure Machine Learning. 


A high-level diagram of the integration between H2O.ai and Azure Machine Learning




Azure Machine Learning can now invoke the models directly as services. This means the models generated within H2O.ai can be quickly and easily used within current pipelines as well as in new ways. For example, the new AzureML Managed Online Endpoint feature makes an H2O-3 or Driverless AI model accessible directly as an HTTP call. 
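For pipeline code that already uses the Azure ML SDK, the same managed online endpoint can also be invoked through the SDK rather than raw HTTP. This is a sketch under stated assumptions: it assumes the azure-ai-ml (v2) SDK is installed and an Azure credential is available, and all endpoint, workspace, and file names are hypothetical placeholders:

```python
import json
from pathlib import Path


def write_sample_request(path, records):
    """Write a JSON request file in a generic scoring shape; the exact
    schema expected is specific to the deployed model's scoring script."""
    Path(path).write_text(json.dumps({"data": records}))
    return str(path)


def invoke_endpoint(subscription_id, resource_group, workspace,
                    endpoint_name, request_file):
    """Invoke an AzureML managed online endpoint via the azure-ai-ml SDK.
    Imports are deferred so the helper above works without Azure installed."""
    from azure.ai.ml import MLClient
    from azure.identity import DefaultAzureCredential

    ml_client = MLClient(DefaultAzureCredential(),
                         subscription_id, resource_group, workspace)
    return ml_client.online_endpoints.invoke(
        endpoint_name=endpoint_name,
        request_file=request_file,
    )


# Example with placeholder values (requires Azure access):
# f = write_sample_request("sample.json", [{"age": 42}])
# print(invoke_endpoint("<sub-id>", "<rg>", "<workspace>", "h2o-endpoint", f))
```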


The integration is available as part of H2O.ai's Rest Server. The Rest Server also generates an Azure Machine Learning deployment template, along with other artifacts, to enable a seamless user experience when consuming models within Azure Machine Learning. 
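The article does not reproduce the generated template itself, but an AzureML managed online deployment definition generally follows the shape sketched below; all names and values here are hypothetical placeholders, not the actual output of the Rest Server:

```yaml
# Hypothetical sketch of an AzureML managed online deployment definition.
$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineDeployment.schema.json
name: h2o-model-blue          # deployment name (placeholder)
endpoint_name: h2o-endpoint   # managed online endpoint to attach to
model: azureml:h2o-model:1    # registered model, e.g. an exported H2O scoring artifact
instance_type: Standard_DS3_v2
instance_count: 1
```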


Additionally, H2O’s Wave Application enables users to deploy the models into Azure using the Wave UI. 


A screenshot of the Wave UI connecting to Azure Machine Learning



The benefit to customers is simple: deploying and reusing models generated within H2O.ai, with hosted inferencing on Azure Machine Learning, is a natural extension of our combined capabilities. Customers can build models where they like, while enterprise data science teams keep model governance and management on Azure Machine Learning. 
