We are excited to release the preview of ONNX Runtime, a high-performance inference engine for machine learning models in the Open Neural Network Exchange (ONNX) format. ONNX Runtime is compatible with ONNX version 1.2 and comes in Python packages that support both CPU and GPU to enable inferencing using Azure Machine Learning service and on any Linux machine running Ubuntu 16.
ONNX is an open source model format for deep learning and traditional machine learning. Since we launched ONNX in December 2017, it has gained support from more than 20 leading companies in the industry. ONNX gives data scientists and developers the freedom to choose the right framework for their task, as well as the confidence to run their models efficiently on a variety of platforms with the hardware of their choice.