Thank you, Linda_Wang, for working with us on this. The connector is great and has the potential to simplify a lot of our ELT work with Snowflake.
My scenario is this: the source data has three columns (C1, C2, C3) stored in a Parquet file in Blob. The target table in Snowflake has four columns (C1, C2, C3, C4), where the last column, C4, has a default value of CURRENT_TIMESTAMP(). When I manually run COPY INTO from the external location, I can specify the column list explicitly like this:
COPY INTO snowflake_table (C1, C2, C3)
FROM 'azure://...'
CREDENTIALS = (AZURE_SAS_TOKEN = '...');
If I don't specify the column names, I get the same error I see from the ADF connector: Snowflake can't copy 3 columns of data into a table with 4 columns. But specifying the column names makes this work, because Snowflake fills the omitted fourth column with its default value. If there were a way to configure the column list in the connector, we could use that default column to stamp every row we load with its load time (which is the goal).
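For reference, a target table shaped like mine would look roughly like this (the data types here are illustrative, not my actual schema):

CREATE TABLE snowflake_table (
    C1 VARCHAR,
    C2 VARCHAR,
    C3 NUMBER,
    -- C4 is omitted from the COPY column list, so Snowflake
    -- populates it from the default at load time
    C4 TIMESTAMP_LTZ DEFAULT CURRENT_TIMESTAMP()
);

With this definition, any COPY INTO that lists only (C1, C2, C3) leaves C4 to the default, which is exactly the load-timestamp behavior we're after.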