ADF Data Flow connectors for Common Data Model (CDM) and Delta Lake are both now generally available (GA).
See the CDM documentation to learn more about reading both model.json and manifest-style CDM models into ADF. With data flows, you can build powerful ETL processes using CDM formats, and as a sink you can also generate updated manifest files that point to your new, transformed data. ADF can use your CDM entity definitions to build ETL projections for transformation and mapping.
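To make the manifest style concrete, here is a minimal sketch of what a CDM manifest file can look like. The entity and path names are purely illustrative, and the exact set of properties varies by CDM schema version, so treat this as an orientation aid rather than a complete reference:

```json
{
  "jsonSchemaSemanticVersion": "1.1.0",
  "manifestName": "SalesData",
  "entities": [
    {
      "type": "LocalEntity",
      "entityName": "Customer",
      "entityPath": "Customer.cdm.json/Customer",
      "dataPartitions": [
        { "location": "Customer/part-00000.csv" }
      ]
    }
  ]
}
```

When ADF writes CDM as a sink, it emits an updated manifest of this shape pointing at the transformed output partitions, so downstream consumers can discover the new data through the same metadata conventions.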
See the documentation for Delta Lake, a Spark-based data lake format that makes working with the data in your lake for analytics solutions much easier. With ADF data flows, you can read from Delta Lake folders, transform the data, and even update, upsert, insert, delete, and generate new Delta Lake folders using the Delta Lake sink format. You don't need to bring your own Spark cluster, either: ADF provides the Spark compute used to create and maintain Delta Lake tables.
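As a rough illustration of what a Delta sink configuration expresses, the sketch below is written in the spirit of ADF's data flow script. The property names and values here are assumptions for illustration only (they mirror the sink options you set in the UI, such as the update methods and key columns), not verified script syntax:

```
/* Illustrative sketch only -- property names are assumptions,
   not verified ADF data flow script syntax. */
sink(format: 'delta',
     folderPath: 'delta/sales',
     insertable: true,
     updateable: true,
     upsertable: true,
     deletable: false,
     keys: ['customerId']) ~> DeltaSink
```

The key idea is that the sink declares which row-modification policies are allowed and which key columns identify matching rows, and ADF's managed Spark compute then performs the corresponding Delta Lake merge operations for you.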
Both of these formats are generally available under the data flow "Inline Dataset" feature.