Power Query - Avoid "Helper Queries" (+10 Cool Tricks)

  • Published: 13 Feb 2024
  • Learn how to avoid creating "Helper Queries" when extracting data from binary files.
    Plus, see 10 incredibly useful Power Query & "M" code tricks to save valuable time and perform unthinkable data transformation acts.
    "M" code examples (file download link):
    www.bcti.com//wp-content/YT_D...
    00:27 When & why Helper Queries are created
    01:29 Video Objectives
    02:11 Avoiding Helper Queries when Working with CSV files
    04:18 Filtering by File Extension (i.e., file type)
    05:03 Filtering by Path (i.e., file location)
    05:44 Removing unnecessary Meta Data
    05:54 Utilizing "Helper Queries"
    07:03 Removing unwanted rows using "good" errors
    07:36 Avoiding "Helper Queries" when working with text files
    07:50 Extract Binary content using a function
    09:06 Filling in the missing steps
    10:26 Avoiding "Helper Queries" when working with Binary files
    12:57 Filtering nested tables before data extraction
    15:07 Sorting columns during column extraction
    15:54 Renaming columns during column extraction
    16:37 Removing total rows using "good" errors
    17:33 Find / Replace values
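    The helper-query-free pattern the video builds can be sketched in "M" roughly as follows (a sketch only; the folder path and column names here are placeholders, not the ones used in the video):
    ==============================
    let
        // Read the file list (with binary [Content]) straight from a folder
        Source = Folder.Files("C:\Data\CSV Files"),
        // Keep only what we need from the folder metadata
        Keep_Columns = Table.SelectColumns(Source, {"Name", "Content"}),
        // Parse each binary directly -- no "Transform File" helper queries are generated
        Add_Data = Table.AddColumn(Keep_Columns, "Data", each Csv.Document([Content])),
        Remove_Binaries = Table.RemoveColumns(Add_Data, {"Content"}),
        Expand_Data = Table.ExpandTableColumn(Remove_Binaries, "Data", {"Column1", "Column2"})
    in
        Expand_Data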
  • Science

Comments • 110

  • @robbe58
    @robbe58 4 months ago +11

    What a way to avoid all those extra query steps when using files from a folder.
    NOT having to see all those steps Power Query adds is fantastic.
    You are a very skilled and intelligent guy.
    So keep digging into the Excel world to explain all those useful methods/tips.
    Thank you very much for sharing them.

  • @serdip
    @serdip 2 months ago +3

    Great video! Thanks for sharing this very practical information. I have performed multi-file import operations similar to what was demonstrated in the lecture. However, I just retain the [Name] column from the initial metadata table and don't require the additional steps outlined in the presentation. I do remove the file extensions, of course. It propagates through the subsequent steps, as far as I can tell.
    I have been using and programming Microsoft Excel for 27 years but only started learning about Power Query some six months ago. It's a game changer! It's so cool that PQ can replace many of the VBA routines I have developed over the years to clean and transform data, all with just a few clicks. Throw in the added power of customizing the generated M Code script - I have created dozens of general-purpose PQ custom functions plus my own Excel VBA Add-in to help load multiple tables and ranges in one go - and my data cleansing capabilities have now reached the next level.
    I will *never* interact with Excel the same way again!
    Thank you kindly.

    • @bcti-bcti
      @bcti-bcti  2 months ago +1

      I, like you, retain the [Name] column from the metadata. This was to demonstrate how to achieve the goal if you didn't take that approach. I used to rely very heavily on VBA for my cleanup and automation solutions. Now it only gets used for situations where Power Query can't do the job. My need for VBA has declined about 90% since Power Query came into being. Thanks for watching.

  • @ExcelWithChris
    @ExcelWithChris 3 months ago +10

    Brilliant! I've been using Power Query for a few years and always struggled with those helper queries, sometimes ending up with 20 or even more. And it creates major issues if you just copy a query and maybe change only the source. It still works, but I am always scared something will go wrong. Thanks so much. More of these, please.

    • @bcti-bcti
      @bcti-bcti  3 months ago

      Thanks for taking the time to watch. Glad it helped. 👍👍

    • @phandongexcel
      @phandongexcel 3 months ago

      I like this video ❤❤❤❤

    • @manojkrishna5391
      @manojkrishna5391 2 months ago

      Fantastic

  • @iankr
    @iankr 4 months ago +1

    Great video, and very clearly explained, as always. I also use that technique to avoid those confusing Helper Queries!

  • @mightydrew1970
    @mightydrew1970 3 months ago +2

    Great stuff, I'll use that next time I touch a folder full of files. The only thing I'll do different is to filter out "grand total" and surplus header rows first. Filtering just based on error may (silently) remove and hide bad entries in the csv
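    (A sketch of that pre-filtering idea in "M" -- the step name "Expand_Tables", the column "Column1", and the literal "Grand Total" are assumptions for illustration:)
    ==============================
    // Drop surplus header rows and "grand total" rows by matching their text
    // BEFORE setting data types, so genuinely bad rows still surface as errors
    Remove_Junk_Rows = Table.SelectRows(Expand_Tables, each
        [Column1] <> "TranDate" and not Text.StartsWith([Column1] ?? "", "Grand Total"))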

  • @BrvKor
    @BrvKor 1 month ago +2

    Simple, clear, focused --> understood and implemented in my work. Excellent presentation; your clear use of words and examples is among the best I have seen. Thank you for the tutorial.

    • @bcti-bcti
      @bcti-bcti  1 month ago

      Thank you so much for your VERY kind and supportive words. Thanks for watching!!!

  • @SaladSharkGaming
    @SaladSharkGaming 14 days ago

    I've been wanting to get around these helper queries for ages. Your guide is incredibly straightforward, and I can't wait to apply this new methodology in my future queries!

    • @bcti-bcti
      @bcti-bcti  13 days ago

      Thank you so much for saying nice things. Thanks for watching!!!

  • @paser2
    @paser2 1 month ago +1

    Just what I was looking for - getting rid of helper queries and hard coded steps. I love the fool proofing tips. Thank you.

    • @bcti-bcti
      @bcti-bcti  1 month ago

      You're very welcome. 👍👍

  • @tinhoyhu
    @tinhoyhu 4 months ago +2

    I like this format of going through long examples with a lot of embedded tricks.
    That Detect Data Types feature will save me a lot of time.

    • @bcti-bcti
      @bcti-bcti  4 months ago

      So glad you're enjoying them. I know it's not a format for everyone. Thanks for watching.

  • @qwerty1945yt
    @qwerty1945yt 3 months ago +1

    this is AWESOME! and, learned fundamentals in the process! thanks!!

    • @bcti-bcti
      @bcti-bcti  3 months ago

      You are very much welcome. Thanks for watching.

  • @Donkeys_Dad_Adam
    @Donkeys_Dad_Adam 3 months ago +2

    Holy Cow!! This was an incredible mind blowing bit of knowledge! I for one have always HATED the helper queries because when you swap out entire sets of files in a folder, the helper queries are still looking for the filename that they first referenced and when that file is gone, they break. I have some pretty heavy queries that have too many steps. I know I can clean those up much better now. Thank you.

    • @bcti-bcti
      @bcti-bcti  3 months ago

      So glad it will help. I prefer this method (in most cases.) Thanks for watching!!

    • @Donkeys_Dad_Adam
      @Donkeys_Dad_Adam 3 months ago +2

      @@bcti-bcti At work, I was asked to automate the updating and consolidation of spec files when part numbers change. Prior process was to assign someone to manually go through hundreds of files, with tens of thousands of rows and MANUALLY copy/paste/Q.C. the data. Whenever specs were updated (several times per year), this error prone process would take a full time user about a month. Using PQ, I got that down to about 7 minutes... Now that I've seen this, I can make that and other processes even better!

    • @bcti-bcti
      @bcti-bcti  3 months ago +3

      @@Donkeys_Dad_Adam FANTASTIC!!!! It’s always great to hear a PQ success story.

  • @user-dn5gd1rn9f
    @user-dn5gd1rn9f 4 months ago

    This was excellent!. Thank you! Love your channel!

    • @bcti-bcti
      @bcti-bcti  4 months ago

      Thank you for saying so. And thanks for watching.

  • @alman34
    @alman34 3 months ago

    This has been super helpful. I didn't know about the combined function. This is saving me a bunch of time. Thank you!

    • @bcti-bcti
      @bcti-bcti  3 months ago

      Glad we could be of service. Thanks for your viewership!

  • @user-lc8dl5gl9v
    @user-lc8dl5gl9v 3 months ago

    Excellent - and a great introduction for anyone looking to learn how to 'program' with power query.

    • @bcti-bcti
      @bcti-bcti  3 months ago

      Thank you so much for taking the time to watch and comment.

  • @Daithi354
    @Daithi354 3 months ago

    Honestly, this was very well explained, at a good pace, with lots of cool tips and tricks. Keep these vids coming, subscribed!

    • @bcti-bcti
      @bcti-bcti  3 months ago

      Thank you so much. We appreciate your viewership.

  • @DanKnight
    @DanKnight 4 months ago

    Wonderful! Excellent tips, except now it's increased my workload, as I'm going to have to go back and rework some queries to avoid the "helpers" and streamline multiple steps.
    Well done, well explained and illustrated.

    • @bcti-bcti
      @bcti-bcti  4 months ago

      Yeah, but it's a "good" kind of work. Very satisfying. Thanks for watching.

  • @anilb5836
    @anilb5836 3 months ago

    Hats off, sir. I have watched so many YouTube videos and never got an explanation like yours.
    Thanks for sharing such useful information.

    • @bcti-bcti
      @bcti-bcti  3 months ago

      That is exceedingly nice of you to say. Thanks for taking the time to watch.

  • @flmiami1
    @flmiami1 1 month ago

    Great and clean method! Thanks a lot.

    • @bcti-bcti
      @bcti-bcti  1 month ago

      You're welcome. Thanks for watching.

  • @FranzL608
    @FranzL608 12 days ago

    I hated these helper files from the beginning. So far I have tried to avoid them by creating a function like:
    let GetFiles = (Path, Name) =>
        let
            Source = Excel.Workbook(File.Contents(Path & Name), null, true),
            Sheet1_Sheet = Source{[Item="Sheet1",Kind="Sheet"]}[Data],
            #"Promoted Headers" = …,
            #"xxx" = …
        in
            #"xxx"
    in
        GetFiles
    and applied this function to the files in a folder.
    Your solution seems smarter, as it even avoids having the function code. Brilliant. I will certainly give it a try. Thanks for sharing your knowledge.

    • @bcti-bcti
      @bcti-bcti  12 days ago

      I'm not a fan either. Thanks for watching.

  • @yousrymaarouf2931
    @yousrymaarouf2931 4 months ago +1

    Excellent

  • @gabrielgordon
    @gabrielgordon 4 months ago

    Thank you! this is so valuable!

    • @bcti-bcti
      @bcti-bcti  4 months ago

      Glad it helps. Thanks for watching.

  • @RichardJones73
    @RichardJones73 4 months ago +1

    Superb explanation, and now I can get rid of my ugly, confusing helper queries. They always used to drive me mad, and now I can control the data much more easily.
    EDIT: I would add, though, that when I do the Folder.Contents thing I like to have the steps expanded out in the Applied Steps window so that I can easily see the path/folders I've drilled down into. To do this, add a line in the Advanced Editor that sets a variable, e.g. file = "expand", put a comma on the end, and you will find that the steps following it are expanded out when you drill down into folders.

    • @bcti-bcti
      @bcti-bcti  4 months ago

      So glad to hear it helps. Thank you so much for watching and commenting. It really helps.

  • @HanoCZ
    @HanoCZ 3 months ago

    Simply brilliant. I learned a trick or two, even though I considered myself skilled in PQ.

    • @bcti-bcti
      @bcti-bcti  3 months ago +1

      I like to think I know more than most about PQ, but I still have a LOT to learn. It’s always a great feeling to learn something new. Thanks for watching.

  • @zzota
    @zzota 3 months ago

    Amazing. Thank you for this, very helpful.

    • @bcti-bcti
      @bcti-bcti  3 months ago

      You are most welcome. Thanks for watching!!!

  • @Mephistopheles019
    @Mephistopheles019 2 months ago

    A gentleman and a scholar. Thank you so much, dude. Now I get to spend the day optimizing/rebuilding all my tools.
    Question for you: does this speed up the performance due to not having all that overhead?

    • @bcti-bcti
      @bcti-bcti  2 months ago +1

      I haven't run any scientific studies, but I can't help but think that things would run a bit quicker without the extra overhead. I'm sure someone could demonstrate a scenario where having the helper queries improves performance, so I guess it's just a matter of what you're willing to tolerate. Thanks for watching!

  • @thelastfry23
    @thelastfry23 4 months ago +1

    Great video. Not sure how you keep putting out videos on projects I just finished using a better methodology, but keep them coming!

    • @bcti-bcti
      @bcti-bcti  4 months ago +1

      So what's the NEXT project I can assist with? 🤣 Thank you for taking the time to watch.

  • @jerry.david1
    @jerry.david1 3 months ago

    What a great video!!

  • @7213261
    @7213261 3 months ago

    Thanks, cool trick to avoid subfolders!

    • @bcti-bcti
      @bcti-bcti  3 months ago

      Yep. Sure is. Thanks for watching.

  • @teekayjunior2517
    @teekayjunior2517 3 months ago

    I love your channel and hope it thrives! A specific question from this video: I have a file to which I've applied this technique. It works, but the refresh is exceedingly slow, even though it's a very small file at this point. The only thing I can think of that would take a long time is filtering through the SharePoint directory to get to the target folder. The root folder is parent to a lot of subfolders and sub-subfolders and so on. In any case, anything you can do to help me speed this up would be appreciated. We're talking 10 minutes or more every time a change is made.

    • @bcti-bcti
      @bcti-bcti  3 months ago

      Thank you for the nice encouragement. We hope it thrives, too. If you want to post here, or email me your full query (copied from the Advanced Editor), I can take a look and see if there's anything that may come to mind when it comes to optimizing the performance.
      training@bcti.com

  • @kebincui
    @kebincui 4 months ago

    Fabulous video❤

  • @Hello-bn2yc
    @Hello-bn2yc 11 days ago

    thank you very much

    • @bcti-bcti
      @bcti-bcti  10 days ago

      You are quite welcome!

  • @jamihoneycutt3025
    @jamihoneycutt3025 15 days ago

    Great video and excellent training!
    What do you do when your data is not in table form? I am using a date created column from the file selection to separate the data that is on Sheet1 of the individual files. When I click to expand the Data column containing the tables, I get "Column 1", "Column 2", etc. When I promote headers, I lose the first date of the date created because it is then promoted to a header as well. How do you correct for this?

    • @bcti-bcti
      @bcti-bcti  13 days ago

      Without seeing your data, I would think that you would just have to rename the date column to something more appropriate. Are you losing the entire record when you promote the header row, or just a single heading for a single column?

  • @Gef1ManArmy
    @Gef1ManArmy 4 months ago

    Can I ask - why do you start by connecting to the parent folder and then go through the steps of filtering down to the subfolder? Is it in case people have their files across multiple folders? If not, wouldn't it be easier to connect straight to the subfolder?
    Thanks for your informative videos.

    • @bcti-bcti
      @bcti-bcti  4 months ago +3

      Yes, exactly. I would just start at the sub-folder level, but this gives the operator the ability to "catch" more files that may be spread across multiple folders. It also allows the video to demonstrate "path filtering", something some users may have never thought to do. Thanks for watching.

  • @LilLinh
    @LilLinh 10 days ago

    Great tips

    • @bcti-bcti
      @bcti-bcti  10 days ago

      Thanks. We appreciate you taking the time to watch.

  • @SamehRSameh
    @SamehRSameh 3 months ago

    Marvelous 🎉🎉

    • @bcti-bcti
      @bcti-bcti  3 months ago

      Thanks for watching!!!

  • @Isambardify
    @Isambardify 3 months ago

    Will the model load everything in the parent folder before filtering down or is it intelligent enough to only request the data it needs?

    • @bcti-bcti
      @bcti-bcti  3 months ago

      I can't be entirely certain as to your question. Can you rephrase it with a procedural example? Thanks for watching!

  • @srider33
    @srider33 2 months ago

    Good video. While I agree that those helper queries add clutter, is there any detail on performance for larger datasets with this approach?

    • @bcti-bcti
      @bcti-bcti  2 months ago

      Thanks. I haven’t done any large scale performance tests, so I’m uncertain as to the benefit/detriment of either approach. Thanks for watching.

  • @djl8710
    @djl8710 4 months ago +1

    Cool, I hate all the helper junk, way too much clutter. Nicely done!

    • @bcti-bcti
      @bcti-bcti  4 months ago +1

      Thanks. I couldn't agree more.

  • @McIlravyInc
    @McIlravyInc 4 months ago

    I like the cleaner combining of files, but I often need the name of the source file for each row. How would you approach that? Thanks.

    • @bcti-bcti
      @bcti-bcti  4 months ago +1

      You could keep the [Name] and [Content] columns, that way you could use the [Name] as a transaction-level "stamp" so you always know where each transaction came from. See the below M code for an example (this is done with the CSV files, but can easily be done the same way with any files.)
      ==============================
      let
          // Connect to the parent folder holding all files and subfolders
          Source = Folder.Files("C:\Camtasia Projects\Power Query - Avoid Scary Queries"),
          // Standardize the casing of the file extensions (using UPPERCASE)
          Uppercase_File_Extensions = Table.TransformColumns(Source, {{"Extension", Text.Upper, type text}}),
          // Filter to keep only files in the "CSV Files" folder and subfolders
          Filter_By_Path = Table.SelectRows(Uppercase_File_Extensions, each Text.Contains([Folder Path], "\CSV Files\")),
          // Filter to keep only CSV files
          Filter_By_CSV_Files = Table.SelectRows(Filter_By_Path, each Text.Contains([Extension], ".CSV")),
          // Remove all columns of folder metadata EXCEPT the [Name] and [Content] columns
          Keep_Only_Content = Table.SelectColumns(Filter_By_CSV_Files, {"Name", "Content"}),
          // Standardize the casing of the filenames (UPPERCASE)
          #"Uppercased Text" = Table.TransformColumns(Keep_Only_Content, {{"Name", Text.Upper, type text}}),
          // Extract the tables from the Binary files
          Extract_Tables = Table.AddColumn(#"Uppercased Text", "CSV_Data", each Csv.Document([Content])),
          // Delete the [Content] column (column of Binaries)
          Delete_Binaries = Table.RemoveColumns(Extract_Tables, {"Content"}),
          // Expand the table columns of data from the [CSV_Data] column
          Expand_Tables = Table.ExpandTableColumn(Delete_Binaries, "CSV_Data", {"Column1", "Column2", "Column3", "Column4"}, {"Column1", "Column2", "Column3", "Column4"}),
          // Promote the first row of data to a Header Row position
          Promoted_Headers = Table.PromoteHeaders(Expand_Tables, [PromoteAllScalars=true]),
          // Set the Data Types for the columns
          Set_Data_Types = Table.TransformColumnTypes(Promoted_Headers, {{"TranDate", type date}, {"Account", Int64.Type}, {"Dept", Int64.Type}, {"Sum of Amount", Currency.Type}}),
          // Remove any redundant headers and total rows (rows with text are rendered as errors by the previous step)
          Remove_Error_Rows = Table.RemoveRowsWithErrors(Set_Data_Types, {"TranDate", "Account", "Dept", "Sum of Amount"})
      in
          Remove_Error_Rows

  • @bimanroy8865
    @bimanroy8865 3 months ago

    Do your input excel files have- tables named after region?

    • @bcti-bcti
      @bcti-bcti  3 months ago

      Yes; each Excel file contains a table named according to the state the data relates to.

  • @patrickharilantoraherinjat2994
    @patrickharilantoraherinjat2994 3 months ago

    Before expanding the tables, you can promote headers for each table at once!
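    (That per-table promotion could look something like this in "M" -- the names "Extract_Tables" and "CSV_Data" are borrowed from the CSV example elsewhere in this thread and are assumptions:)
    ==============================
    // Promote the first row of every nested table to headers in one pass,
    // before the tables are expanded into columns
    Promote_Nested_Headers = Table.TransformColumns(Extract_Tables,
        {{"CSV_Data", Table.PromoteHeaders, type table}})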

    • @bcti-bcti
      @bcti-bcti  3 months ago

      Very true. Thanks for the input. And thanks for watching.

  • @michaelt312
    @michaelt312 4 months ago +1

    In your Expand_CSV step, couldn't you change the header names there? This would avoid the need to promote the first row to name the columns?

    • @bcti-bcti
      @bcti-bcti  4 months ago +1

      Can you provide a sample of M-code to demonstrate your combined strategy? Thanks.

    • @michaelt312
      @michaelt312 4 months ago +1

      8:55 I'm on my phone so just typing out this line as it is what I'm referencing. I'll type out the full M when I get to my computer later today. Please forgive any typos. I'm currently on a crowded L heading into downtown Chicago. I think you will be able to understand from this.
      = Table.ExpandTableColumn(#"Removed Columns", "CSV_Data", {"Column1", "Column2", "Column3", "Column4"}, {"TranDate", "Account", "Dept", "Sum of Amount"})

    • @bcti-bcti
      @bcti-bcti  4 months ago +1

      I thought you were alluding to that, but I wasn't sure whether you were doing something else. Yes, that would be a great step-saver. I should have done that in the video. Thanks for the input, and thanks for watching. @@michaelt312

    • @michaelt312
      @michaelt312 4 months ago +2

      @@bcti-bcti , your channel is awesome. I tripped on this process when I was trying to clean some code. It is now my go-to in every build.

  • @asher0987
    @asher0987 3 days ago

    That's a wonderful solution. What if the data is in .xml?

    • @bcti-bcti
      @bcti-bcti  2 days ago

      Once you have the folder data reduced to just a [Content] column, you could create a custom column using the following formula:
      =Xml.Tables([Content])
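      (In context, that might look like the following "M" step -- the step name "Keep_Only_Content" mirrors the CSV example above and is an assumption:)
      ==============================
      // Parse each XML binary into its nested tables
      Extract_XML = Table.AddColumn(Keep_Only_Content, "XML_Data", each Xml.Tables([Content]))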

  • @bimanroy8865
    @bimanroy8865 3 months ago

    If there is no error, how can the header rows be removed for all the input files except one? In this case, there was one field which was numeric, but there may be cases where all the columns are of the same type.

    • @bcti-bcti
      @bcti-bcti  3 months ago

      Then a different strategy will need to be employed. Every scenario is different and requires its own special adjustments. Thanks for watching.

  • @AlvaroFreireAF1
    @AlvaroFreireAF1 3 months ago +1

    Great job!! However, in my case it was not so simple since my CSV files have a more complex encoding.
    When I first tried to use the Csv.Document function, it threw an error. To work around the problem, I first expanded using the normal process and copied the formula from the Transform Sample File located in the helper-queries folder.
    After that, just replace the expression “Parameter1” with “Content”.
    This was the original formula:
    Csv.Document(Parameter1,[Delimiter=";",Columns=17, Encoding=1252, QuoteStyle=QuoteStyle.None])
    This is the new one:
    Csv.Document([Content],[Delimiter=";", Columns=17, Encoding=1252, QuoteStyle=QuoteStyle.None])

    • @bcti-bcti
      @bcti-bcti  3 months ago +2

      Great idea! As this is not a 100% perfect solution, there are always going to be scenarios that require creativity. Thanks for watching!!!

    • @ladytigereye6145
      @ladytigereye6145 3 months ago +1

      I have the exact same problem and your solution works for me. I can't say how much I want to thank you, saving me so much time and nerves! 😄🤗

    • @AlvaroFreireAF1
      @AlvaroFreireAF1 3 months ago

      @@ladytigereye6145 🤗

    • @bcti-bcti
      @bcti-bcti  3 months ago

      @@ladytigereye6145 so glad it helped. Thanks for taking the time to watch.

    • @ladytigereye6145
      @ladytigereye6145 3 months ago +1

      @@bcti-bcti The crazy thing about those "helper queries" is: If you use them, it prevents you from building a Power BI template app. Power BI template apps don't allow parameters of the type "any" or "binary". So without eliminating those helper queries from the data model one is not able to use an app to publish Power BI reports built from that whole data model. That's a very annoying relation I had to uncover. 😅 Thanks for helping me solve this issue. My app is running now. 😁

  • @YPartbee
    @YPartbee 3 months ago

    Nice 😃👍

    • @bcti-bcti
      @bcti-bcti  3 months ago

      Thanks for taking the time to watch.

  • @ExcelWithChris
    @ExcelWithChris 1 month ago

    At 13:25 I want to extract a specific named range in the file, but it is not showing in the list at the bottom. How can I do that? What I am doing is getting a file from a SharePoint folder, then going through all the steps to get a named range out of it, resulting in hundreds of Helper Queries. I want to do what you are doing here: get the named range out of the file without the helper queries. But the list does not show the named ranges in the file???

    • @bcti-bcti
      @bcti-bcti  1 month ago +1

      I'm tied up for a couple days, but let me look into this and see what I can figure out.

    • @ExcelWithChris
      @ExcelWithChris 1 month ago

      @@bcti-bcti Thanks a million. Would help a lot. Currently I import 15 files and it creates a lot of Helper Queries.

    • @bcti-bcti
      @bcti-bcti  1 month ago +1

      @@ExcelWithChris I can assist with this, but it would be easier if I were to call you. We could share screens and I could walk you thru the process. That way, if there's a glitch, I can see what is happening and we can figure it out. You can email me at "training@bcti.com". cheers.

  • @anz672
    @anz672 3 months ago

    Powerful

    • @bcti-bcti
      @bcti-bcti  3 months ago

      Thanks for watching.

  • @sharmarudra
    @sharmarudra 1 month ago

    How to retain the file names?

    • @bcti-bcti
      @bcti-bcti  1 month ago

      Can you provide additional information regarding your question? I am uncertain as to what you are asking. Thank you.

  • @ARSEABOUTFACE
    @ARSEABOUTFACE 3 months ago

    One of the better instructional videos. Very good pace and excellent explanations. Well done.

    • @bcti-bcti
      @bcti-bcti  3 months ago

      Thank you for saying so. We appreciate you taking the time to watch.

  • @Mephistopheles019
    @Mephistopheles019 2 months ago

    Follow up question for you-what if the data type is txt?

    • @bcti-bcti
      @bcti-bcti  2 months ago

      All steps should be the same for .TXT files, as they use the same connector for decoding. The only issue may be when the file's delimiter is a TAB instead of a COMMA. If that is the case, you will have to modify the Csv.Document function as follows:
      Csv.Document([Content], [Delimiter="#(tab)"])
      The #(tab) escape sequence stands for a TAB character inside an "M" text literal.
      NOTE: This sidesteps a snag: the "Custom Column" dialog box will not allow you to type a literal TAB character in the formula; it interprets TAB as an instruction to move to the next field in the dialog box. (If you prefer a literal TAB between the double-quotes, type the command in something like Notepad, then copy/paste it into the "Custom Column" dialog box.)
      Hope this helps. Thanks for watching.