Forum Discussion
Max_Mustermann
Aug 20, 2024, Copper Contributor
Power Query losing Columns
Hello there, I have a problem importing some folders into Power Query. The interesting part is that the different files in said folder seem to have a varying number of columns - and Power Query onl...
- SergeiBaklan, Aug 20, 2024
In case of CSV, check the Transform Sample File query; its first step looks like
= Csv.Document(Parameter1,[Delimiter=";", Columns=10, Encoding=65001, QuoteStyle=QuoteStyle.None])
Change Columns=10 to Columns=20 or so, i.e. the maximum possible number of columns.
Then, in the combined file, remove all columns with no data. As a variant, that could be a function which removes all empty rows and columns, such as:
(Source as table) as table =>
let
    // Keep only columns that contain at least one non-null value
    RemoveEmptyColumns = Table.SelectColumns(
        Source,
        List.Select(
            Table.ColumnNames(Source),
            each List.NonNullCount(Table.Column(Source, _)) <> 0
        )
    ),
    // Keep only rows that contain at least one value other than "" or null
    RemoveEmptyRows = Table.SelectRows(
        RemoveEmptyColumns,
        each not List.IsEmpty(
            List.RemoveMatchingItems(Record.FieldValues(_), {"", null})
        )
    )
in
    RemoveEmptyRows
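A minimal sketch of how such a helper could be invoked from the combined query (the names fnRemoveEmpty and #"Combined Files" are illustrative, not taken from your workbook):

// Assumes the function above is saved as a query named fnRemoveEmpty
let
    Source = #"Combined Files",        // the query that combines the folder's files
    Cleaned = fnRemoveEmpty(Source)    // drop fully empty rows and columns
in
    Cleaned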
Not sure about PDF; check the connector options for something similar.
Max_Mustermann
Aug 21, 2024, Copper Contributor
Hey SergeiBaklan, again thank you for the input. My first problem is that I don't know where to put the first part:
= Csv.Document(Parameter1,[Delimiter=",", Columns=null, Encoding=65001, QuoteStyle=QuoteStyle.None])
The rest became quite messy, as I had to partly translate it since my Office runs in German. It kind of seemed to work in the end - excluding the last step. But in the second-to-last step, "names", I already saw that Power Query fetched 12 columns again, meaning it caught one too few.
Maybe I already broke it by neglecting the first part? The error says it is unable to convert a source CSV into a table...
SergeiBaklan
Aug 21, 2024, MVP
Please check the queries in the attached file. It combines 3 CSV files located in the local folder c:\Test temp.
Or do you work with a SharePoint folder?
- Max_Mustermann, Aug 21, 2024, Copper Contributor
I found your test file working perfectly after putting the 3 imported files in the source folder. After replacing the three test inputs with the data I actually want to use, it somehow worked without giving me any errors back, which was a little confusing, but it also seems to have squished all the data into a single column - making me lose the other 12.
Oddly, I can now find all 13 column headers, squished together, too. It looks the same as in the import data: some files have 12 columns, some have 13. Maybe something with delimiters...
- SergeiBaklan, Aug 21, 2024, MVP
Yes, in your case the delimiter should be a semicolon:
[Delimiter=";", Columns=null, Encoding=65001,
perhaps with another encoding. In that part, take the generated script and change only the number of columns to null; don't touch the rest.
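For reference, a sketch of what the complete first step of the Transform Sample File query could look like after that change (Parameter1 is the sample-file parameter Power Query generates; Encoding=65001 means UTF-8 and is an assumption that may differ for your files):

= Csv.Document(
    Parameter1,
    [
        Delimiter = ";",                 // semicolon, matching your source files
        Columns = null,                  // read however many columns each file has
        Encoding = 65001,                // UTF-8; adjust if your files use another encoding
        QuoteStyle = QuoteStyle.None
    ]
)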
If that doesn't work, perhaps you could share a couple of your CSV files, keeping only a few rows and removing sensitive information.
- Max_Mustermann, Aug 21, 2024, Copper Contributor
I finally found out where the "= Csv.Doc..." part belonged (picture 1). I changed the delimiter (the encoding seems fine, but if not, I can fix that myself, I guess). Power Query then started adding some steps and implementing the example file table (picture 2):
The example file still only has 12 columns, which makes sense, I guess. But if you take a look at the final combined table, it has 13 columns now, which is the result I wanted to see.
I lack the time to experiment with that right now, but I am very optimistic that my problem is solved. Again, my thanks for your incredible support.