Forum Discussion
Error importing .csv file
Details, although pesky, are truly important for understanding what we can't see on your computer.
At what point in the import wizard does this error message get raised?
Have you inspected the entire csv file to ensure it's completely consistent in layout and contents?
How many columns does the csv file have?
How many characters in a typical row in the csv file?
Details are important; I'm just not sure which details are needed. Please see my answers below:
At what point in the import wizard does this error message get raised?
- This error message appears when I click Finish at the end of the Import Text Wizard
Have you inspected the entire csv file to ensure it's completely consistent in layout and contents?
- There are some 13,000,000 rows of data, so I have not checked the entire file, but in my random spot checks I cannot find any errors. (A scripted scan, like the sketch after this list, could check every row.)
How many columns does the csv file have?
- There are 18 columns
How many characters in a typical row in the csv file?
- Looks to average between 50 and 80 characters
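For reference, spot checks can easily miss a single bad row out of 13,000,000, but a short script can scan the whole file in a minute or two. Here is a minimal sketch in Python; it assumes a comma-delimited, UTF-8 file with a header row and 18 columns, and the filename is a placeholder:

```python
import csv

# Minimal sketch: flag every record whose field count differs from the
# expected 18 columns. Assumes a comma-delimited, UTF-8 file with a
# header row; "import_data.csv" is a placeholder filename.
EXPECTED_COLUMNS = 18

with open("import_data.csv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f)
    header = next(reader)
    if len(header) != EXPECTED_COLUMNS:
        print(f"Header has {len(header)} fields, expected {EXPECTED_COLUMNS}")
    total = bad = 0
    for record_no, row in enumerate(reader, start=2):
        total += 1
        if len(row) != EXPECTED_COLUMNS:
            bad += 1
            if bad <= 20:  # report only the first few offenders
                print(f"Record {record_no}: {len(row)} fields")
    print(f"Checked {total:,} data rows; {bad:,} inconsistent.")
```

Because the csv module honors quoting, a mismatched field count here usually means a stray delimiter or an unbalanced quote, which is exactly the kind of inconsistency an import wizard can choke on.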
Thanks for the help!
- George_Hepworth, Apr 06, 2022, Silver Contributor
Thanks. So the error comes after you have gone through the wizard and click Finish to start the actual import.
13,000,000 rows? In a csv file... I would have to do some research, but that sounds like it might be a factor. I am also suspicious about that number, because the maximum number of rows in an Excel worksheet is 1,048,576; this csv file would be at least 12 times too large to have been saved from an Excel worksheet.
Where is this file generated originally, by the way?
80 characters divided across 18 columns works out to about 4 to 5 characters per field, so the rows themselves are not large on that dimension.
In fact, thinking about it a bit more, I'm wondering if Access is really the right tool for a 13,000,000 row CSV file; an Access database file is capped at 2 GB, and that many rows, plus indexes, will push toward the limit. Perhaps SQL Server, or another server database, would be more appropriate.
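If a server database is an option, loading the file in batches from a script sidesteps the wizard entirely. A minimal sketch using Python and pyodbc follows; the connection string, the table name big_import, the batch size, and the filename are all placeholders, and it assumes a target table whose 18 columns accept the file's values as text:

```python
import csv
import pyodbc

# All names here are placeholders: adjust the connection string, table
# name, filename, and batch size for your environment.
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
BATCH_SIZE = 50_000
INSERT_SQL = "INSERT INTO big_import VALUES (" + ",".join(["?"] * 18) + ")"

conn = pyodbc.connect(CONN_STR)
cursor = conn.cursor()
cursor.fast_executemany = True  # pyodbc's fast path for bulk inserts

with open("import_data.csv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) == BATCH_SIZE:
            cursor.executemany(INSERT_SQL, batch)
            conn.commit()  # commit per batch so a failure loses little
            batch.clear()
    if batch:  # flush the final partial batch
        cursor.executemany(INSERT_SQL, batch)
        conn.commit()
conn.close()
```

Committing per batch keeps the transaction log small and means a bad row partway through only loses the current batch, not hours of work.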
- George_Hepworth, Apr 06, 2022, Silver Contributor
I'm finding references to imports into SQL Server failing at around that number of rows (13,000,000) from a csv. It makes me wonder if there isn't a way to batch that data into smaller, more manageable chunks.
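Something along these lines could do the batching. It is an untested sketch in Python (the source filename and chunk size are placeholders): it splits the big csv into numbered chunk files, repeating the header row in each, so every chunk can be imported on its own through the wizard:

```python
import csv

# Minimal sketch: split a large csv into numbered chunk files, copying
# the header row into each. "import_data.csv" and ROWS_PER_CHUNK are
# placeholders; 1,000,000 rows also keeps each chunk under Excel's
# worksheet limit, should anyone need to open one.
ROWS_PER_CHUNK = 1_000_000

with open("import_data.csv", newline="", encoding="utf-8") as src:
    reader = csv.reader(src)
    header = next(reader)
    chunk_no = 0
    rows_in_chunk = ROWS_PER_CHUNK  # forces a new file on the first row
    out = None
    writer = None
    for row in reader:
        if rows_in_chunk == ROWS_PER_CHUNK:
            if out is not None:
                out.close()
            chunk_no += 1
            out = open(f"import_data_part{chunk_no:02d}.csv",
                       "w", newline="", encoding="utf-8")
            writer = csv.writer(out)
            writer.writerow(header)
            rows_in_chunk = 0
        writer.writerow(row)
        rows_in_chunk += 1
    if out is not None:
        out.close()

print(f"Wrote {chunk_no} chunk files.")
```

At 1,000,000 rows per chunk, 13,000,000 rows becomes 13 files; if one chunk fails to import, that also narrows the search for a bad row to a single file.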