Hi Project Bonsai Team,
I'm using Bonsai and AnyLogic for my MSc graduation project, training a brain for a job-scheduling task. The agent-based AnyLogic model takes around 5 seconds per simulation iteration. I'm currently maxing out the quota of a standard Azure subscription, and the most efficient training setup I've found is 10 parallel instances at 0.4 cores/instance and 16 GB RAM, which yields ~0.5 iterations/sec. I'm using the APEX algorithm (discrete action space) in its standard configuration, with training acceleration enabled.
In a previous experiment, I trained a brain for a less complex model in around 8 hours (~150K iterations at ~6 iterations/sec, running 6 parallel instances with 1 core each). Assuming my current model also needs >150K iterations, training would take at least ~83 hours at the current throughput. Is there a way to request a temporary quota increase to boost the number of parallel instances I can run, and to support my academic mission?
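For reference, the ~83 h figure is a simple throughput extrapolation (a sketch, assuming the iteration rate stays constant over the whole run):

```python
def training_hours(iterations: int, iters_per_sec: float) -> float:
    """Estimated wall-clock training time, assuming constant throughput."""
    return iterations / iters_per_sec / 3600

# Current setup: ~0.5 iterations/sec across 10 parallel instances.
print(training_hours(150_000, 0.5))  # ~83.3 hours
```

With more parallel instances (and the ~6 iterations/sec seen previously), the same 150K iterations would finish in roughly 7 hours.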