Forum Discussion
ADF throwing error while connecting thru SFTP
Priya1 I know you said the file itself doesn't have a data issue, but the error you're seeing would typically indicate to me that there's a schema or data contamination issue.
These would be my typical recommendations for troubleshooting the issue:
- Double-check the schema on your copy activity.
- Validate your delimiter, quote character, and escape character in the dataset configuration.
- Inspect line 88186 of the file itself.
- I recommend opening it in Notepad++ and showing all characters (View -> Show Symbol -> Show All Characters).
- You'd be looking for unquoted delimiters, unescaped quote characters, or unquoted carriage return or line feed characters. (In Notepad++, these show up as CR and LF, respectively.)
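Alongside Notepad++, the checks above can also be scripted. Here's a minimal Python sketch for pulling a suspect line out of a large CSV and flagging the usual culprits; the file path is a placeholder, and the delimiter/quote defaults are assumptions you'd adjust to match your dataset settings:

```python
def read_raw_line(path: str, lineno: int) -> bytes:
    """Return one line of a file in binary mode, so stray CR bytes survive.

    Note: lines are split on LF, so an *embedded* line feed inside a field
    splits the record -- a truncated-looking line here is itself a clue.
    """
    with open(path, "rb") as f:
        for i, raw in enumerate(f, start=1):
            if i == lineno:
                return raw
    raise ValueError(f"file has fewer than {lineno} lines")

def inspect_line(raw: bytes, delimiter: bytes = b",", quote: bytes = b'"') -> dict:
    """Flag characters in a raw CSV line that commonly break parsers."""
    return {
        "field_count": raw.count(delimiter) + 1,  # naive count; ignores quoting
        "quote_count": raw.count(quote),          # an odd count means unbalanced quotes
        "has_cr": b"\r" in raw,                   # stray carriage return (CR)
    }
```

You'd call something like `inspect_line(read_raw_line("data.csv", 88186))` and compare the field count against what your copy activity's schema expects.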
- Priya1 | Jun 21, 2024 | Copper Contributor
patrickdeline Thanks for the reply.
Like I said, I processed the same file by uploading it to Azure Blob Storage, and it worked fine without any data issue errors.
Are there any restrictions on file size or row count when using the SFTP linked service in ADF?
Thanks
- patrickdeline | Jun 21, 2024 | Copper Contributor
Priya1 I wouldn't consider myself an expert, so take my thoughts with a grain of salt. There are a number of limitations with FTP & SFTP linked service connections, but none would generate that type of error, as far as I'm aware.
If you're successfully ingesting the same file by uploading it manually, then a schema configuration issue is at the top of the list of possible culprits. I would compare and contrast the source dataset in Data Factory with the way you're parsing the file manually (however you're reading it from Blob Storage).
What you're describing sounds like an unescaped delimiter or quote character. If the data itself is correct, then I'd bet money the issue is in your source CSV dataset connection settings for either the quote or escape character.
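To illustrate why a quote/escape mismatch produces this kind of error, here's a small Python sketch using the standard `csv` module as a stand-in for ADF's delimited-text parser. The sample line is made up; it assumes a file whose quotes were escaped with backslashes while the dataset is left at a double-quote escape character:

```python
import csv
import io

# A line whose embedded quotes were escaped with backslash (common in exports):
line = '1,"she said \\"hi\\", then left"\n'

# Dataset configured to match the file (escape character = backslash): 2 columns.
ok = next(csv.reader(io.StringIO(line), quotechar='"',
                     doublequote=False, escapechar='\\'))

# Dataset left at a double-quote escape character: the backslashes are treated
# as data, the closing quote lands early, and the row splits into 3 columns --
# exactly the column-count mismatch a copy activity would then report.
bad = next(csv.reader(io.StringIO(line), quotechar='"', doublequote=True))
```

Same bytes, two different row shapes, purely from the parser settings. That's why the file works when your Blob-side pipeline happens to use the right settings and fails when the SFTP dataset's defaults don't match.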