Forum Discussion
How to import text file with 20,000 columns?
Hi Ralph,
Power Query has exactly the same limit on the number of columns: https://support.office.com/en-us/article/power-query-specifications-and-limits-5fb2807c-1b16-4257-aa5b-6793f051a9f4
Perhaps you could import your txt file by dividing it into columns not by delimiter but by Fixed width: do 3-4 separate imports, split each of them into columns, and then combine the results. But that very much depends on your data structure.
- Ralph Puchalski, Jun 21, 2018, Copper Contributor
Dear Sergei,
If the original data had been transposed prior to saving in the text file, there would be no problem; 20,000 rows x 500 columns is easy to import. The data set would be ready to mine.
I tried your recommendation, but it may not be suitable for the structure of my data set. I am able to import big blocks of merged columns using the Text Import Wizard with fixed widths, but defining the fixed widths is hard to do in the tiny preview window. Deleting some blocks so that the total stays under Excel's 16,384-column limit requires some guessing, and matching the data columns (not in quotes) to the column names (which are in quotes) is even more difficult because they are not aligned. Separating the blocks of columns into individual columns in Excel worked well, except for the cells on the left and right edges of each block, which lost data to adjacent blocks. There must be another solution.
It might be best to (1) split the original text file, import the halves into Excel as two different sheets, transpose each, and then combine them; (2) use a macro in Excel to do the import; or (3) if possible, use a PivotTable to connect to the remote text file with 20K columns and import just 10K columns into the destination sheet's PivotTable, or transpose them to rows, though I am not sure how to automate this. The main point of (3) is to have the Excel file connect to a source text file or database and extract only what is requested, so the 16K column limit is never exceeded.
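For what it's worth, the transpose step in option (1) can be done entirely outside Excel. A minimal Python sketch, assuming a tab-delimited file (the file names here are placeholders, not from the thread):

```python
import csv

# Hypothetical file names -- substitute your own paths.
SRC = "wide_data.txt"   # e.g. 500 rows x 20,000 tab-delimited columns
DST = "tall_data.txt"   # transposed result: 20,000 rows x 500 columns

def transpose_delimited(src, dst, delimiter="\t"):
    """Read the whole file, swap rows and columns, write the result.

    The entire table is held in memory, so this suits files that are
    wide rather than enormous overall.  Rows are assumed to all have
    the same number of fields; zip() silently truncates ragged rows.
    """
    with open(src, newline="") as f:
        rows = list(csv.reader(f, delimiter=delimiter))
    with open(dst, "w", newline="") as f:
        csv.writer(f, delimiter=delimiter).writerows(zip(*rows))
```

After running `transpose_delimited(SRC, DST)`, the output file has one row per original column and imports into Excel without hitting the column limit.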
What do you recommend?
Thanks,
Ralph
- hirandesilva, Feb 16, 2021, Copper Contributor
Ralph Puchalski Why can't the data be received normalized from the source? Wherever the data exists, it must be stored in tables that are normalized.
Now there's an opportunity for adding value ...
- SergeiBaklan, Jun 21, 2018, Diamond Contributor
Hi Ralph,
From my point of view, the best way is to connect directly to the database and filter the data at the query level. Most probably some coding will be required; I'm not sure you can filter the data through the UI due to the column limit.
If you have Power Pivot in your version of Excel, you may try to connect to the database or to the text file directly from Power Pivot; the data model has a different structure in the background and no direct limit on columns. If that works, the result could be landed on an Excel sheet as a normal table, not as a PivotTable - provided you reduce the number of columns within the data model first.
- Ralph Puchalski, Oct 17, 2019, Copper Contributor
Hi Sergei,
Yes, I upgraded to Excel 2019, so I have access to Power Pivot, but I can't see an option for importing only part of the text file into the data model, and I don't understand how to divide the data within the model even if I knew how to import the entire text file into it.
Can you explain or point me to a resource that describes this?
Other possible options include dividing the text file in a text editor, but I don't know how to use an editor like Word to begin dividing a file with e.g. 24,000 columns into two files.
Can R or the command line be used to transpose or divide text files?
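For reference, either would work; here is a minimal Python sketch of the dividing approach, streaming a wide tab-delimited file into two narrower files (the file names and split position are assumptions, not from the thread):

```python
import csv

def split_columns(src, dst_left, dst_right, at, delimiter="\t"):
    """Stream src row by row, writing columns [0, at) to dst_left and
    columns [at, end) to dst_right.  Only one row is held in memory
    at a time, so row count is not a concern -- only width matters.
    """
    with open(src, newline="") as f, \
         open(dst_left, "w", newline="") as fl, \
         open(dst_right, "w", newline="") as fr:
        left = csv.writer(fl, delimiter=delimiter)
        right = csv.writer(fr, delimiter=delimiter)
        for row in csv.reader(f, delimiter=delimiter):
            left.writerow(row[:at])
            right.writerow(row[at:])

# Hypothetical usage: split a 24,000-column file at column 12,000 so
# each half stays under Excel's 16,384-column limit.
# split_columns("huge.txt", "left.txt", "right.txt", at=12_000)
```

Each output file can then be imported into its own sheet, transposed, and combined as in option (1) above.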
Thank you,
Ralph