Azure Data Factory Copy Data is dropping records


I have set up a Copy Data pipeline in Azure Data Factory. The table I'm copying from contains 1,500 records, but the Copy Data step only reads 872 of them. I've tried to find records in the input table with bad data, but nothing appears to have excessively long text or the wrong data type. I also looked for duplicate keys, but I don't see any fields in the input record layout that indicate they are a key. Does anyone have suggestions for what I should look for, or what I should try changing, to resolve the issue?

 

[Screenshot attachment: mke40_0-1655472061457.png]

 

2 Replies
Hi, did you get any resolution for this?

Have you tried writing it to a different place, such as a CSV file? The monitoring output says that 872 rows were read from the source, which suggests the issue is on the source side itself.

Also, did you try a data migration method instead of the Copy Data activity?

@mke40, to ensure that all data is copied to the sink SQL database, try setting the enableSkipIncompatibleRow property to True (it is False by default, and documented here). By default, rows are skipped when the first instance of a duplicate is detected according to the sink table's key definition.
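For reference, here is a minimal sketch of where that property sits in a Copy activity's JSON definition. The activity name, source/sink types, linked service name, and blob path below are placeholders for illustration; the redirect settings are optional but let you log the skipped rows so you can see which of the missing records were rejected:

```json
{
    "name": "CopySourceTableToSql",
    "type": "Copy",
    "typeProperties": {
        "source": { "type": "AzureSqlSource" },
        "sink": { "type": "AzureSqlSink" },
        "enableSkipIncompatibleRow": true,
        "redirectIncompatibleRowSettings": {
            "linkedServiceName": "MyBlobStorageLinkedService",
            "path": "copyactivity-skipped-rows"
        }
    }
}
```

After a run, the activity's monitoring output reports rows read, rows copied, and rows skipped, which should account for the gap between 1,500 and 872.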