Forum Discussion
Power Query losing Columns
- Aug 20, 2024
In case of CSV, check the Transform Sample File query; the first step looks like
= Csv.Document(Parameter1,[Delimiter=";", Columns=10, Encoding=65001, QuoteStyle=QuoteStyle.None])
Change Columns=10 to Columns=20 or so, i.e. the maximum possible number of columns.
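For example, assuming 20 columns is enough to cover the widest file in the folder, the adjusted step could look like
= Csv.Document(Parameter1,[Delimiter=";", Columns=20, Encoding=65001, QuoteStyle=QuoteStyle.None])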
Then, in the combined file, remove all columns with no data. Alternatively, that could be a function which removes all empty rows and columns, such as
(Source as table) as table =>
let
    RemoveEmptyColumns = Table.SelectColumns(
        Source,
        List.Select(
            Table.ColumnNames(Source),
            each List.NonNullCount(Table.Column(Source, _)) <> 0
        )
    ),
    RemoveEmptyRows = Table.SelectRows(
        RemoveEmptyColumns,
        each not List.IsEmpty(
            List.RemoveMatchingItems(Record.FieldValues(_), {"", null})
        )
    )
in
    RemoveEmptyRows
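To use it, save the function as a separate query and invoke it as a step on the combined table. The function name fxRemoveEmpty and the referenced previous step below are only placeholders for whatever names appear in your workbook:
= fxRemoveEmpty(#"Expanded Table Column1")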
Not sure about the PDF connector; check its options for something similar.
I finally found out where the "= Csv.Doc..." part belonged (picture 1). I changed the delimiter (the encoding seems fine, but if not, I can fix that myself, I guess). Power Query then started adding some steps and building the example file table (picture 2):
The example file still only has 12 columns, which makes sense, I guess. If you take a look at the final combined table, it now has 13 columns, which seems like the result I wanted to see.
I lack the time to experiment with that right now, but I am very optimistic that my problem is solved. Again, my thanks for your incredible support.
Max_Mustermann, you are welcome.
Please remove "Change Type" step in Transform sample file query. It works with hardcoded column names which is not good if you work with variable columns.
Type changes could instead be applied to the final combined file, and there is a trick to automate that, i.e. avoid hardcoding the column names, but it is a bit more complex.
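As a minimal sketch of such a trick (my assumption, not the only possible approach), build the transformation list from Table.ColumnNames instead of typing the names in, e.g. to set every column to text; Source here stands for whichever step precedes the type change:
= Table.TransformColumnTypes(
    Source,
    List.Transform(Table.ColumnNames(Source), each {_, type text})
)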