Benchmarks and scalability of Hexagon simulation portfolio on Azure

Published Mar 25 2022 08:09 AM

Technical contribution: Łukasz Mirosław, Tomasz Józefiak (Microsoft), Prafulla Kulkarni and Abhishek Gaur (Hexagon)


Hexagon Design & Engineering (previously known as MSC Software) develops simulation software technology that enables engineers to validate and optimize their designs using virtual prototypes. Customers in almost every part of manufacturing use solutions developed by Hexagon D&E to complement, and in some cases even replace the physical “build and test” process that has been traditionally used in product design.


Hexagon D&E’s technology is used by leading manufacturers for linear and nonlinear finite element analysis (FEA), acoustics, fluid-structure interaction (FSI), multi-physics, optimization, fatigue and durability, multi-body dynamics, and control systems simulation.

We are happy to announce that the official Hexagon MI deployment guide for MSC simulation software on Azure has been published. In a joint effort between the Microsoft and Hexagon D&E teams, a comprehensive guide has been created that covers the deployment steps needed to run Hexagon solutions on Azure High Performance Computing (HPC) infrastructure as well as the benchmarking results.


The following Hexagon D&E solutions have been benchmarked in a range of scenarios: MSC Nastran, Marc, Actran, Cradle CFD, Digimat, Adams, and Simufact. This allowed us to demonstrate the scale and performance of running parallelized simulations with Hexagon D&E solvers on the robust, scalable infrastructure available in Azure.



Fig. Performance of MSC Nastran on Azure HC44rs instances (example from the report).

Azure CycleCloud was used to orchestrate the definition, deployment, and control of the HPC clusters used for testing. Each cluster was equipped with a scheduler for job orchestration and leveraged the autoscaling feature, i.e. clusters scaled out when jobs were submitted and scaled in when all jobs had finished. For testing we used the Slurm scheduler, but other schedulers such as OpenPBS, IBM Spectrum LSF, Univa Grid Engine, HPC Pack, or Sun Grid Engine are also available.
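To illustrate how a job reaches such an autoscaling cluster, a minimal Slurm batch script for an MSC Nastran run might look like the sketch below. The partition name, node counts, module name, installation path, and input file are placeholders, not values from the guide; once jobs like this sit in the queue, the CycleCloud autoscaler starts compute nodes to run them.

```shell
#!/bin/bash
#SBATCH --job-name=nastran-bench      # hypothetical job name
#SBATCH --partition=hpc               # partition name depends on your cluster template
#SBATCH --nodes=2
#SBATCH --ntasks-per-node=44          # e.g. all cores of an HC44rs node

# Load the MPI stack shipped with the Azure HPC image (module name may differ).
module load mpi/hpcx

# Placeholder installation path and input deck; dmp= enables MSC Nastran's
# distributed memory parallel mode, scr=yes uses scratch space for work files.
/opt/msc/bin/nastran input.dat dmp=88 scr=yes
```

Submitting with `sbatch run_nastran.sh` queues the job; no nodes need to be running in advance, since the autoscaler provisions them on demand.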

The whole infrastructure was deployed in an Azure Virtual Network that could be isolated from the internet and integrated with on-premises networking infrastructure to allow seamless access from on-premises clients with the protocols of choice, e.g. SSH to access the master node with the Slurm scheduler and HTTPS to access the Azure CycleCloud user interface.
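As a sketch of how such access could be restricted to just those two protocols, the network security group rules below allow only SSH and HTTPS from a private on-premises address range. The resource group, NSG name, and address prefix are illustrative assumptions, not values from the deployment guide.

```shell
# Hypothetical resource names; allow SSH (22) only from an on-premises range
# that is reachable via VPN or ExpressRoute.
az network nsg rule create --resource-group hpc-rg --nsg-name hpc-nsg \
  --name allow-ssh --priority 100 --direction Inbound --access Allow \
  --protocol Tcp --destination-port-ranges 22 \
  --source-address-prefixes 10.0.0.0/8

# Allow HTTPS (443) for the Azure CycleCloud user interface from the same range.
az network nsg rule create --resource-group hpc-rg --nsg-name hpc-nsg \
  --name allow-https --priority 110 --direction Inbound --access Allow \
  --protocol Tcp --destination-port-ranges 443 \
  --source-address-prefixes 10.0.0.0/8
```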



Fig. Architecture of the infrastructure used for benchmarking.

The use of the HPC images provided by Microsoft reduced the complexity of installing and configuring the InfiniBand drivers, inter-node connectivity, and MPI libraries. These images were used as a baseline on which the simulation software was installed, and the resulting custom images were then automatically attached to the VMs in the clusters to automate the deployment.
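A workflow along these lines can be sketched with the Azure CLI: start a VM from a Microsoft HPC marketplace image, install the solvers on it, then capture it as a custom image for the cluster nodes. The resource and image names below are placeholders, and the exact marketplace image URN should be checked against what is currently published.

```shell
# Sketch: build a custom image on top of Microsoft's CentOS-HPC image
# (resource names and the image URN are illustrative placeholders).
az vm create --resource-group hpc-rg --name image-builder \
  --image OpenLogic:CentOS-HPC:7_9:latest \
  --size Standard_HC44rs --generate-ssh-keys

# ... install the Hexagon D&E solvers on the VM, then generalize and capture it:
az vm deallocate --resource-group hpc-rg --name image-builder
az vm generalize  --resource-group hpc-rg --name image-builder
az image create   --resource-group hpc-rg --name hexagon-hpc-image \
  --source image-builder
```

The captured image can then be referenced in the CycleCloud cluster template so every compute node boots with the drivers, MPI stack, and solvers preinstalled.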

A local disk of the master node was shared with the compute nodes over NFS to persist the simulation results. While this proved sufficient for benchmarking, more performant storage options are recommended to sustain higher storage bandwidth and IOPS, such as Azure NetApp Files, Azure Premium Files, or parallel file systems such as BeeGFS or Lustre.
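The NFS arrangement amounts to a few standard steps, sketched below with illustrative paths and host names (the export path, options, and `master` host name are assumptions, not taken from the guide):

```shell
# On the master node: export a local directory over NFS.
echo "/data *(rw,sync,no_root_squash)" | sudo tee -a /etc/exports
sudo exportfs -ra

# On each compute node: mount the share so all jobs write results
# to the same persistent directory.
sudo mkdir -p /data
sudo mount -t nfs master:/data /data
```

In a CycleCloud cluster this mount is typically declared in the cluster template rather than run by hand, so new nodes attach the share automatically as they scale out.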

For the benchmarking, HBv2 and HC44rs VMs were used. HBv2 features 120 AMD EPYC 7002-series CPU cores with 4 GB of RAM per core, while HC44rs is equipped with 44 Intel Xeon Platinum 8168 cores and 8 GB of RAM per core.


“This document/guide demonstrates that Hexagon D&E solvers are cloud ready and easy to install on the Azure cloud. In addition, it shows how customers can benefit from solver parallelization options to gain productivity in terms of cost and time when scalable cloud infrastructure is available.”

Prafulla Kulkarni, Senior Manager at Hexagon D&E



  • Azure Deployment Guide at Hexagon D&E [link]
  • Azure Cyclecloud [link]
  • Azure HBv2 [link] and HC44 [link]



