In my work with ISVs and start-ups, a common pattern is that even though the current database is relatively small (below 1 TB), the future is unpredictable.
The company's business plan calls for massive, double-digit growth every year. If (hopefully) those forecasts come true, the deployed architecture may no longer fit.
On the other hand, designing and provisioning big data solutions such as a data lake, Spark clusters, or a Synapse workspace brings complexity, development effort, and a potentially unwelcome price tag.
Azure SQL Database Hyperscale can be a smart choice here. This service tier supports databases of up to 100 TB, along with a lot of additional goodies.
The good part, and the reasoning behind this suggestion, is that you can stay with your existing Azure SQL Database tier and move to Hyperscale only when you approach the current 4 TB limit, by following Migrate an existing database to Hyperscale. This lets you delay the decision until your business needs are clearer and new technologies or features might offer you more options.
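When that day comes, the migration itself is a single statement. A minimal sketch, assuming a database named `mydb` and the `HS_Gen5_2` service objective (both placeholders; substitute your own database name and sizing):

```sql
-- Run while connected to the master database of the logical server.
-- Converts an existing Azure SQL database to the Hyperscale tier;
-- the database stays online during the change.
ALTER DATABASE [mydb]
    MODIFY (EDITION = 'Hyperscale', SERVICE_OBJECTIVE = 'HS_Gen5_2');
```

The same change can be made from the Azure portal or the Azure CLI, e.g. `az sql db update -g <resource-group> -s <server> -n mydb --edition Hyperscale --service-objective HS_Gen5_2`. Review the current Microsoft documentation before migrating, as constraints on moving back out of Hyperscale have changed over time.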