Feb 01 2024 11:05 AM
I've set up a pipeline to dynamically copy multiple CSV files from a directory in Azure Blob Storage into Azure SQL DB. One of the files is failing because it exceeds the maximum allowable row size of 8,060 bytes. That file has over 400 columns, but none of the fields exceeds varchar(50), and I'm able to manually import it through SSMS with no issues when I default every datatype to varchar(50). I noticed that the copy activity is defaulting all datatypes to nvarchar(max).
Is this why the copy activity is failing for this file, and how do I fix it? Ideally I'd like to default the datatypes for all files to varchar(250) on the sink side, since my source data never exceeds that length.
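For example, if I were pre-creating one of the sink tables myself in SSMS, it would look roughly like the sketch below (the table and column names are just placeholders, not my actual schema):

```sql
-- Rough sketch of the kind of sink table I'd like the pipeline to target;
-- table and column names are placeholders, not my real schema.
CREATE TABLE dbo.StagingFileImport
(
    Column001 varchar(250) NULL,
    Column002 varchar(250) NULL,
    Column003 varchar(250) NULL
    -- ...and so on for the remaining 400+ columns
);
```

I'd like the copy activity to end up with something like that for every file, rather than nvarchar(max) for every column.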
Any help would be greatly appreciated. Thanks!