My organization is just beginning a long-term project to migrate to the cloud. My department runs a set of applications that use ETL as the primary technology for a nightly batch, plus Java to power a user interface to the data during work hours. Our primary database is > 10 TB, and that size is maintained only by aggressive archive/purge cycles. During an average nightly batch, we shuttle > 1 TB of data between the DB and ETL servers.
My question is, would anyone even consider moving just the database/UI for an application like ours to the Azure cloud? The ETL application we use is not Azure friendly, and our batch code is complex enough that converting it to another platform could take multiple years.
Can cloud-to-premises data transfer rates support the kind of large data extracts/writes that a nightly batch like ours requires?
Azure can absolutely support this. In my opinion the best way to accomplish it is to leverage ExpressRoute, which is a private connection between your on-premises environment and Azure. If your workload currently runs on premises rather than in a datacenter, you will have to work with an ISP to establish the circuit through a data exchange provider. If your servers are already sitting in a datacenter, check whether that facility is an exchange provider; if so, the circuit setup is much more streamlined. ExpressRoute comes in a few pricing options: metered, unlimited, and direct. The speed is there in all three; it just depends on which pricing model you want to use. Be aware that the metered plan may get really expensive if you are sending 1 TB of data over the wire nightly. Hopefully that was helpful. I would be happy to answer any follow-up questions you may have.
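To give you a feel for whether the nightly window is realistic, here is a rough back-of-envelope sketch of transfer time for 1 TB at a few common ExpressRoute circuit bandwidths. The `efficiency` factor is my own assumption (protocol overhead, contention); the actual throughput you achieve will depend on your ETL tooling and network path.

```python
def transfer_hours(data_tb, link_gbps, efficiency=0.8):
    """Estimate wall-clock hours to move `data_tb` terabytes (decimal TB)
    over a `link_gbps` gigabit-per-second circuit, assuming only
    `efficiency` of the raw line rate is usable (assumed overhead factor)."""
    bits = data_tb * 1e12 * 8                      # TB -> bits
    seconds = bits / (link_gbps * 1e9 * efficiency)
    return seconds / 3600

# 1 TB nightly over a few circuit sizes:
for gbps in (1, 2, 5, 10):
    print(f"{gbps:>2} Gbps: {transfer_hours(1, gbps):.1f} h")
```

Even at 1 Gbps with generous overhead, 1 TB moves in a few hours, so a nightly batch of that size is well within what a properly sized circuit can handle; the bigger question is usually cost and which side of the wire the heavy joins run on.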