Automatically Remove Top Junk Rows & Combine Data from Multiple Excel Files

  • Published: 23 Nov 2024

Comments • 139

  • @stuartproctor6051
    @stuartproctor6051 6 days ago

    Wow, this came up in my feed and it’s exactly what I’ve been struggling with all week!!! You’re an absolute star, thank you! 🙌🏻☺️👌🏻

  • @sharadpunita
    @sharadpunita 2 months ago +2

    I am 70 and have no practical use for learning Excel/Power Query! I keep watching your videos as I like the way you solve problems. I have learnt so much from your teaching. Can I pay some token for whatever I have learnt?

  • @nelson_k_d
    @nelson_k_d 6 months ago +4

    I saved the entire syntax as follows:
    Add Custom Column = Table.PromoteHeaders(Table.Skip([Data], each not List.ContainsAny( Record.ToList(_), {"Header1", "Header2"} )))
    Works wonders!!
    Thanks a ton bro!!
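
    For context, here is a minimal sketch of how that custom-column formula could sit inside a full combine query. The folder path, the assumption that the folder holds only .xlsx files, and the header labels "Date" and "Profit" are all placeholders - adjust them to your data.

    let
        // Hypothetical folder of Excel workbooks
        Source    = Folder.Files("C:\Reports"),
        Workbooks = Table.TransformColumns(Source, {"Content", Excel.Workbook}),
        Expanded  = Table.ExpandTableColumn(Workbooks, "Content", {"Name", "Data"}, {"Sheet", "Data"}),
        // Skip rows from the top of each sheet until a row containing the real header labels
        // appears, then promote that row to headers
        Cleaned   = Table.AddColumn(Expanded, "CleanData", each
            Table.PromoteHeaders(
                Table.Skip([Data], each not List.ContainsAny(Record.ToList(_), {"Date", "Profit"})))),
        Combined  = Table.Combine(Cleaned[CleanData])
    in
        Combined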

  • @OmisileKehindeOlugbenga
    @OmisileKehindeOlugbenga 1 year ago +5

    Pretty awesome! Thanks a lot for this. Record, Table and List object manipulation in one video for one task without using "Remove Other Columns"

  • @ramonillarramendi3191
    @ramonillarramendi3191 1 year ago

    Wow.. Amazing. Been struggling with dynamically removing the junk and with custom headers for a while now. This works like a charm. Thanks a mill.

  • @josepepe741
    @josepepe741 5 months ago

    It is a huge pleasure to watch your videos. Moving from advanced Excel user to Power Query person. Thanks a lot.

  • @retamapark
    @retamapark 11 months ago

    Thanks, Chandeep! I knew you had posted this, and I had this problem today. I was fooling around with different other approaches that were a mess. This was perfect!

  • @goodnewskasparyaodzramedo9097
    @goodnewskasparyaodzramedo9097 1 year ago

    This is wonderful @Goodly. I watch all your videos - from the logic of the problem through to the actual solution. God continually bless you, you are a messiah!

  • @odallamico
    @odallamico 6 months ago

    Dear, you are a genius. You make M language look so easy. I appreciate your videos, my respects to you.

  • @Phippsy23
    @Phippsy23 1 year ago +1

    Amazing, I've done this with a fixed Skip value but this is on another level! Thanks

  • @nitish9111
    @nitish9111 1 month ago

    Thanks Chandeep! I was using Index for this one, but you make it so easy. Learning from your videos is amazing! Keep up the good work!

  • @aahanavikram07
    @aahanavikram07 1 year ago +2

    You hear the problem, it seems 😂
    I was using the filter method, removing null values, and a lot of other filter steps.
    Thanks for making the work easier and cleaner ❤

  • @justinwduff
    @justinwduff 1 year ago

    Thank you, SO MUCH! Had about 200 files to combine with various junk rows up top and now I can do it :D

    • @GoodlyChandeep
      @GoodlyChandeep  1 year ago

      Woah.. that's a lot of files.
      I am glad I could help

  • @tARasKoni
    @tARasKoni 1 year ago

    M-Masterpiece!
    Packing a lot of slick tricks in one video.
    Thank you Chandeep!

  • @JeevanC-l3k
    @JeevanC-l3k 9 months ago

    I am actually learning Power Query; it's excellent. I like the way you teach. Thank you so much for this video.

  • @rhaps2008
    @rhaps2008 1 year ago

    Amazing. I have had to struggle through this exact issue, manually removing those junk rows. You're a life saver; I will be using this in the future.

  • @decentmendreams
    @decentmendreams 1 year ago

    Your methods become so refined over time. Awesome job

  • @inaction2024
    @inaction2024 1 year ago

    This is a lifesaving technique. Thank you for sharing with us.

  • @roywilson9580
    @roywilson9580 1 year ago +1

    Thanks for the video. Much better than only skipping rows by one hardcoded value; it makes sense to use this if your column order is not the same across data tables.

  • @Sarvabhauma
    @Sarvabhauma 4 months ago

    Thanks @Chandeep, based on your trick I am able to remove the top rows. I also wanted to do it for the bottom rows - there are a lot of junk bottom rows in my sheet - so I applied the same trick, but added an index column, sorted it in descending order, removed the junk rows, and then sorted the index back in ascending order. Anyway, your tricks are fantastic.
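
    A minimal sketch of that bottom-row variant, using Table.Reverse instead of an index column: flip the table, skip the (now leading) junk rows while they are completely blank, and flip back. "Table1" and the all-blank test are assumptions; Table.RemoveLastN also accepts a condition, which may be even simpler.

    let
        Source   = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],   // hypothetical table
        Reversed = Table.Reverse(Source),
        // Skip the junk rows (assumed to be entirely null/empty) from the reversed top
        Trimmed  = Table.Skip(Reversed, each List.MatchesAll(Record.ToList(_), each _ = null or _ = "")),
        Restored = Table.Reverse(Trimmed)
    in
        Restored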

  • @Chillman666
    @Chillman666 1 year ago

    You have no idea how much Power Query has helped me to automate my tasks. Also I have been struggling with this problem. A big THANK YOU ❤!

  • @rajanpradeepankarath8846
    @rajanpradeepankarath8846 1 year ago

    Yeah, that is pretty damn awesome, Chandeep. This is an everyday challenge.

  • @vl21i
    @vl21i 1 year ago

    Super Awesome, Chandeep.
    Very powerful formulas that you are teaching in a simple and easily understandable way!
    Power Query and DAX have a lot of hidden treasures.

  • @joukenienhuis6888
    @joukenienhuis6888 1 year ago

    Thanks again for such a great, clear video about the next step in Power Query. I am new to Power Query, but I am experimenting with DAX and you are giving a great explanation.

  • @rohitwaradpande2492
    @rohitwaradpande2492 1 year ago

    This was an awesome video. Thanks for the same. I liked the trick that you used to remove the blanks dynamically.

  • @MikeMcGlynn
    @MikeMcGlynn 4 months ago +2

    Hello, trying to see if anyone else ran into this issue:
    - Followed all steps and they worked as they should until ~7:00, when we're supposed to transform the content from Binary to a Table. My files are all CSV, so the formula I ended up using was =Table.TransformColumns(Source,{"Content",Csv.Document}) instead of the =Table.TransformColumns(Source,{"Content",Excel.Workbook}) used in the video. It converted to Tables fine.
    - Then on the next step (7:40), where you're supposed to expand the tables and see all of the sheets, when I do this the header options are just Column 1, Column 2, etc. instead of the actual headers... AND every row of all the files shows up instead of just the sheet name.
    - When I follow the steps afterwards, I get the error: "DataFormat.Error: There were more columns in the result than expected"
    Any idea what's going on?
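
    A hedged sketch of the CSV variant described in this comment: Csv.Document returns one table per file, so there is no sheet level to expand - each file's table can be cleaned directly. The folder path and header labels are placeholders; if the junk rows have fewer delimiters than the data, passing an explicit column count to Csv.Document (e.g. Csv.Document(_, [Delimiter=",", Columns=10])) may also help avoid the "more columns in the result than expected" error.

    let
        // Hypothetical folder of CSV files; header labels are placeholders
        Source   = Folder.Files("C:\Reports"),
        AsTables = Table.TransformColumns(Source, {"Content", Csv.Document}),
        Cleaned  = Table.AddColumn(AsTables, "CleanData", each
            Table.PromoteHeaders(
                Table.Skip([Content], each not List.ContainsAny(Record.ToList(_), {"Date", "Profit"})))),
        Combined = Table.Combine(Cleaned[CleanData])
    in
        Combined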

  • @ursvenky6394
    @ursvenky6394 1 year ago

    Awesome video. I was struggling earlier; I had to do this work using a macro. This is very cool. Thanks Goodly

  • @setantadundalk
    @setantadundalk 1 year ago

    Very sleek. I had this very same issue, but I used List.Generate to loop through each record, which I suppose would take slightly more processing time, but nothing you would ever notice.

  • @santiagovillamoreno6821
    @santiagovillamoreno6821 1 year ago

    Beautiful Power Query techniques!!

  • @manojsakthi6201
    @manojsakthi6201 1 year ago

    It's very awesome. I had a similar issue and had to work around it, but this looks pretty good.

  • @bohdanduda652
    @bohdanduda652 1 year ago +4

    Another way we can do it is to add an additional column using List.PositionOf and, based on that, calculate in which position we have Date and Profit.
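
    A sketch of this List.PositionOf alternative, assuming the header label "Date" sits in the sheet's first column (the table and column names are placeholders): find the header row's position and skip that many rows before promoting.

    let
        Source    = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],   // hypothetical table
        // 0-based position of the row whose first column holds the header label "Date"
        HeaderPos = List.PositionOf(Table.Column(Source, "Column1"), "Date"),
        Promoted  = Table.PromoteHeaders(Table.Skip(Source, HeaderPos))
    in
        Promoted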

  • @pmsocho
    @pmsocho 9 months ago

    Awesome. Great logic. Thanks for the video.

  • @iankr
    @iankr 1 year ago

    Brilliant! Many thanks, Mr Goodly.

  • @ShubhamSharma-ls6hj
    @ShubhamSharma-ls6hj 1 year ago

    Very well explained, thank you so much brother.

  • @davidjosevarelagarcia7011
    @davidjosevarelagarcia7011 1 year ago

    Great, excellent. Simple as that. Thanks.

  • @stephenandrews1291
    @stephenandrews1291 1 year ago

    This is awesome; thank you! One question - I need to add a column into the combined file that shows the original source filename for each record... where in the flow, and how, is it best to do that please?
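
    One possible way (a sketch, not necessarily the video's exact flow): keep the Name column from Folder.Files next to each cleaned table and expand the tables, instead of combining them on their own. The folder path and header labels are placeholders.

    let
        Source   = Folder.Files("C:\Reports"),
        Books    = Table.TransformColumns(Source, {"Content", Excel.Workbook}),
        Expanded = Table.ExpandTableColumn(Books, "Content", {"Data"}),
        Cleaned  = Table.AddColumn(Expanded, "CleanData", each
            Table.PromoteHeaders(
                Table.Skip([Data], each not List.ContainsAny(Record.ToList(_), {"Date", "Profit"})))),
        // Keep the source file name alongside each cleaned table, then expand
        Kept     = Table.SelectColumns(Cleaned, {"Name", "CleanData"}),
        Result   = Table.ExpandTableColumn(Kept, "CleanData", Table.ColumnNames(Kept[CleanData]{0}))
    in
        Result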

  • @syrophenikan
    @syrophenikan 1 year ago +1

    Fantastic! This is definitely going into my daily routine.

  • @cesarsaldana3429
    @cesarsaldana3429 4 months ago

    Amazing!!! Greetings from Mexico.

  • @jianlinchen7978
    @jianlinchen7978 7 months ago

    Really very helpful. Thanks. It is a good idea.

  • @rahulmahajan5932
    @rahulmahajan5932 1 year ago

    Awestruck... No words to express how fantabulous it is.

  • @Adam_K_W
    @Adam_K_W 6 months ago

    This is great! What is the best way to do this when your source files are not formatted as Tables, but are simply Excel Worksheets?
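
    Excel.Workbook lists worksheets, defined Tables, and named ranges together and tags them in its Kind column, so one approach (a sketch, with a hypothetical folder path) is to filter to Kind = "Sheet" after expanding and then apply the same skip-and-promote cleanup to each sheet's Data table.

    let
        Source     = Folder.Files("C:\Reports"),
        Books      = Table.TransformColumns(Source, {"Content", Excel.Workbook}),
        Expanded   = Table.ExpandTableColumn(Books, "Content", {"Name", "Kind", "Data"}, {"Object", "Kind", "Data"}),
        // Keep only worksheet objects; drop defined Tables and named ranges
        SheetsOnly = Table.SelectRows(Expanded, each [Kind] = "Sheet")
    in
        SheetsOnly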

  • @SAIKISH2005
    @SAIKISH2005 4 months ago

    Marvelous work ji

  • @Rice0987
    @Rice0987 1 year ago

    My gosh... It's just simply brilliant!💎
    Thanks a bunch for your priceless help!🤗👦

  • @zahir585
    @zahir585 1 year ago

    Genius 🔥 thank you my friend sooo helpful ❤️❤️❤️

  • @chrism9037
    @chrism9037 1 year ago

    Fantastic Chandeep, thank you!

  • @FarkadBarri
    @FarkadBarri 4 months ago

    This is awesome! I have one challenge: one of the first rows contains the date of the report and I want it to be within the data columns. How to do that :D!

  • @emilmubarakshin49
    @emilmubarakshin49 1 year ago

    Fantastic video and amazing explanation!

  • @cherianiype
    @cherianiype 1 year ago

    ooff!! SHABASH! Terrific video Chandeep! Superb!! Sixer Maar diya!

  • @wensesvincen4877
    @wensesvincen4877 1 year ago

    Hi Goodly, wonderful and powerful trick... Keep up the good work. What if you are importing from PDF files? Trying to convert the binary gives different results.

  • @swathimanda5342
    @swathimanda5342 1 year ago

    Hi Goodly,
    That was an amazing video. I learned a lot from your videos for my daily tasks with excel. It saves lot of my time. God bless you.
    I have a question, please, if you can answer it: when I convert PDF to Excel, most of the column values are not aligned into one column but land on either side.
    Ex: column B dates should be in column B, but on a few rows they will be in A or C.
    How can I align them into just one column B?
    Please advise.
    Thank you for all your great videos. 🙏

  • @noorbisharmohamed
    @noorbisharmohamed 1 year ago

    Fantastic! But can we use "is blank" instead of "not ContainsAny" as the condition?
    Or promote headers if the record contains any of "Date", "Amount", etc.?

  • @SandhyaSingh-qk6up
    @SandhyaSingh-qk6up 1 year ago +1

    Can this also be solved by using index number and custom function?

  • @Ratnakumarwrites
    @Ratnakumarwrites 1 year ago

    Super Video Chandeep.

  • @BharathiM-t8y
    @BharathiM-t8y 10 months ago

    admiring the brilliance

  • @s1ngularityxd64
    @s1ngularityxd64 1 month ago

    excellent🙂

  • @tiagocarvalhal4502
    @tiagocarvalhal4502 1 year ago

    Incredible. Thanks :)

  • @swapna_learner
    @swapna_learner 1 month ago

    Could we have similar logic for bottom rows?

  • @arbazahmad7177
    @arbazahmad7177 1 year ago

    Fantastic 🎉.. Thanks 😊

  • @excel-in-g
    @excel-in-g 1 year ago

    As always, very neat & clear stuff. 👍
    I was wondering if one can't use Table.FindText?
    Like, each Table.Skip(_, Table.PositionOf(_, Table.FindText(_, "Profit"){0}))
    But it only tests for 1 column header here.
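
    Cleaned up, that idea could look like the sketch below (hypothetical table and header label): Table.FindText returns the rows containing the text, Table.PositionOf gives the first match's position, and Table.Skip drops everything above it. As noted, it only checks a single header value.

    let
        Source    = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
        // First row anywhere in the table that contains the text "Profit"
        HeaderRow = Table.FindText(Source, "Profit"){0},
        Promoted  = Table.PromoteHeaders(Table.Skip(Source, Table.PositionOf(Source, HeaderRow)))
    in
        Promoted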

  • @andreass.3130
    @andreass.3130 1 year ago +1

    Hi Goodly, thanks for all your great videos. Isn't there a simpler way to do it here? In the example file you create a conditional column (If Column1 = Date Then True). Then you fill the conditional column downwards. Now you have True for all rows you need and a null value above the desired header row, so you can filter for True. Shouldn't that be dynamic as well?

    • @DAXifiedSatish
      @DAXifiedSatish 1 year ago

      No. That way, if you filter for True you will get the junk rows and all the data except the headers, and if you filter for False you will get only the headers.

  • @patrickharilantoraherinjat2994

    Great video Goodly! What if I have junk rows and also junk columns - is it possible to combine them? Thanks

    • @GoodlyChandeep
      @GoodlyChandeep  1 year ago

      Thanks Patrick.
      Maybe this video will help: ruclips.net/video/1fn8fXYw6M4/видео.html

  • @gennarocimmino
    @gennarocimmino 1 year ago

    I would say simply amazing !!!!

  • @starprinceofficial9603
    @starprinceofficial9603 7 months ago

    My question: if we need to bring the data that appears above the column headers in as a new column before promoting the headers, how do we go about it?

  • @RogerStocker
    @RogerStocker 1 year ago

    Unbelievably crazy, as usual.

  • @stefankirst3234
    @stefankirst3234 1 year ago

    Really awesome!

  • @marylandcarvajal9909
    @marylandcarvajal9909 17 days ago

    Amazing❤! 🎉

  • @babakmohammadi9931
    @babakmohammadi9931 1 year ago

    Could you possibly tell me what I have to do with CSV files for "Table.TransformColumns(Source, {"Content", Excel.Workbook})", as that doesn't work for CSV?

  • @MrKamranhaider0
    @MrKamranhaider0 1 year ago

    JUST AMAZING, SUPERB

  • @TomStewart-x1t
    @TomStewart-x1t 5 months ago

    I have a similar problem with a CSV file; there are title characters before the column headings. How could I remove those?

  • @markdonaghy5397
    @markdonaghy5397 3 months ago

    I have an interesting use case. I have a base file with headers, but the fifth file has had one additional column with headers added. As a result, the process you have described breaks, and for the last table I get the error "DataFormat.Error: There were more columns in the result than expected" in the CSV data column.
    I have been wrestling with this for a week and my research does not show any way to manage this, although appending the tables would seem to handle it.

    • @GoodlyChandeep
      @GoodlyChandeep  3 months ago

      send me your sample data and description of the problem - goodly.wordpress@gmail.com

  • @mohamedheltoukhy2355
    @mohamedheltoukhy2355 1 year ago

    Amazing
    Thanks a lot

  • @zeinh5139
    @zeinh5139 1 year ago

    Wow ! thank you

  • @mohitchaturvedi8931
    @mohitchaturvedi8931 1 year ago

    Delighted - this is the problem of every hour.
    Many times data comes with merged headers, which you have already sorted out.

  • @devendrareddynamballa1053
    @devendrareddynamballa1053 7 months ago

    What if we want to add back the removed rows after promoting the headers?

  • @huyuc6614
    @huyuc6614 11 months ago

    Hi, I'm stuck on the transform column step. My file is a .csv file, not .xlsx, so when I use TransformColumns it shows an error on the content. Do you know how to fix it?

  • @eslamfahmy87
    @eslamfahmy87 1 year ago

    If the Name & Kind of the data extracted into PQ are inconsistent and I need to filter out all the sheets I don't need,
    how can I work with that sheet?

  • @itgyantricks7218
    @itgyantricks7218 1 year ago

    excellent

  • @cooolbreeze
    @cooolbreeze 1 year ago

    Pretty cool. Seems the only thing that would limit what one can do with PQ is one's imagination. Question, why List.ContainsAny instead of List.ContainsAll?
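
    For illustration (tbl and the header labels are placeholders): with List.ContainsAny the skipping stops at the first row that holds at least one of the listed labels, while List.ContainsAll only stops at a row that holds all of them - stricter, and safer if a junk row could coincidentally contain one of the labels.

    // Stops at the first row holding ANY of the listed header labels
    Table.Skip(tbl, each not List.ContainsAny(Record.ToList(_), {"Date", "Profit"}))

    // Stops only at a row holding ALL of the listed header labels (stricter)
    Table.Skip(tbl, each not List.ContainsAll(Record.ToList(_), {"Date", "Profit"}))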

  • @hassanjatta4257
    @hassanjatta4257 1 year ago

    Awesome!!!

  • @zdzislawkes
    @zdzislawkes 3 months ago

    Hi, it's a very smart solution. I'm looking for instructions on how to combine tables without losing the columns that exist in the previous steps - in this case I would like the Name column to remain?

    • @GoodlyChandeep
      @GoodlyChandeep  3 months ago

      you'll find the answer in this video
      m.ruclips.net/video/oExuBdnHtrk/видео.html

    • @zdzislawkes
      @zdzislawkes 3 months ago

      @@GoodlyChandeep Thank you very much. This is what I am looking for. Can you say what you think about this solution: ruclips.net/video/rCYn_onMP0I/видео.htmlsi=QYmkwRM2Cl1FuoCu

  • @j_baisley_
    @j_baisley_ 10 months ago

    Damn awesome is right 👍

  • @williamarthur4801
    @williamarthur4801 1 year ago

    This may sound like a stupid question, and I'm sure it's something basic, but why do you get
    Name = Sheet, Data = Table, while I always get Name = Table, Name = Sheet and adjacent
    Data = Table, Data = Table? Oh, and loved the use of a condition for Skip, which I'd never thought of,
    even though, now having looked, it does say count or condition.

  • @rogeriopsvalle
    @rogeriopsvalle 1 year ago

    Hi Goodly.
    Thanks for all videos. They are just great.
    I need to combine two of your tricks in just one.
    I have many sheets with junk lines (same number of junk lines for all sheets) and these same sheets have inconsistent columns.
    How do I do that?
    Thanks in advance

    • @GoodlyChandeep
      @GoodlyChandeep  1 year ago

      Your code to remove the junk headers can probably be
      Table.Skip(Table, each List.MatchesAny(Record.ToList(_), each _ = null or _ = ""))
      After this you can promote the headers and then follow the inconsistent-headers video.

  • @abhijitmodak3461
    @abhijitmodak3461 11 months ago

    Thanks!

  • @jimmckie3574
    @jimmckie3574 3 months ago

    Brilliant

  • @denissipchenko2455
    @denissipchenko2455 1 year ago +1

    Thanks for sharing!
    But it looks like there should be List.ContainsAll instead of List.ContainsAny.

  • @josealvesferreira1683
    @josealvesferreira1683 1 year ago

    very nice

  • @eslamfahmy87
    @eslamfahmy87 1 year ago

    Actually, I think you are the only YouTube instructor who is preparing in-depth & creative PQ examples.
    Really fantastic 👏, elegant & so simple 😊

  • @bulbulahmed4414
    @bulbulahmed4414 1 year ago

    Amazing

  • @markp8600
    @markp8600 1 year ago

    Fabulous video, but it is difficult to see the "Applied Steps" at the end of the video. Question: what if the junk rows and "junk row data" are spread inconsistently through the spreadsheet? For example, the fund or department information may change, resulting in blank rows between datasets and header rows when a new fund source or department is identified in the report. The example shows how to remove junk rows at the top of the report for 3 reports. Do you have any videos where the junk rows would be sporadically located throughout the report?

    • @GoodlyChandeep
      @GoodlyChandeep  1 year ago +1

      Mark, I'll have to take a look at the data to give you possible ways of solving it.
      See if you can pick up any tricks from this long video - ruclips.net/video/_ZKT1raC4P0/видео.html
      I've shared horizontal and vertical looping techniques in this video.

    • @markp8600
      @markp8600 1 year ago

      @@GoodlyChandeep - Hi Chandeep, I could share a file with you on your website. I could upload on the PQ training course site. Would that work? Thanks for the great insights!

    • @markp8600
      @markp8600 1 year ago

      @@GoodlyChandeep Perfect! Thank you so much Chandeep. Your help and guidance are greatly appreciated. Cheers!

  • @retamapark
    @retamapark 11 months ago

    Not as sophisticated, but say you had a table with two columns, Col1 and Col2. There are junk rows at the top. The row with the headers has the values "Date" and "Amount". This seems to work and is easy to implement.
    let
    Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
    Custom1 = Table.Skip(Source, each [Col1] <> "Date" and [Col2] <> "Amount")
    in
    Custom1

    • @retamapark
      @retamapark 11 months ago

      fnTableSkipDynamic(Source, "Col1", "Date", "Col2", "Amount")
      Function to do this: input column names as text, header values as text.
      (sourcetablename as table, col1name as text, header1value as text, col2name as text, header2value as text) =>
      let
      return = Table.Skip(sourcetablename,
      each
      ( Record.Field(_, col1name) <> header1value)
      and
      ( Record.Field(_, col2name) <> header2value)
      )

      in
      return

    • @GoodlyChandeep
      @GoodlyChandeep  11 months ago

      List.MatchesAll can look for both the headers. 👍

  • @ericliu79
    @ericliu79 3 months ago

    Great!

  • @oOBlindyOo
    @oOBlindyOo 5 months ago

    My source data is from PDF, not Excel.
    So the formula =Table.TransformColumns(Source,{"Content", Excel.Workbook}) isn't working... any ideas?
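
    For a PDF source, the binary generally needs Pdf.Tables (available with the PDF connector) rather than Excel.Workbook. A sketch, assuming a folder of PDFs and placeholder header labels; Pdf.Tables typically returns both detected tables and whole pages, tagged in its Kind column.

    let
        Source     = Folder.Files("C:\Reports"),                        // hypothetical folder of PDFs
        Parsed     = Table.TransformColumns(Source, {"Content", Pdf.Tables}),
        Expanded   = Table.ExpandTableColumn(Parsed, "Content", {"Name", "Kind", "Data"}, {"Object", "Kind", "Data"}),
        // Keep the detected tables; drop the page-level objects
        TablesOnly = Table.SelectRows(Expanded, each [Kind] = "Table"),
        Cleaned    = Table.AddColumn(TablesOnly, "CleanData", each
            Table.PromoteHeaders(
                Table.Skip([Data], each not List.ContainsAny(Record.ToList(_), {"Date", "Profit"}))))
    in
        Cleaned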

  • @ankitachauhan5834
    @ankitachauhan5834 6 months ago

    How to do the same thing but for a CSV file?

  • @PandaVlademirBerlin
    @PandaVlademirBerlin 7 months ago

    What if I need the source file column?

  • @carolshipley7903
    @carolshipley7903 5 months ago

    Hi. Can you give me a way of doing this if there is more than one worksheet in the Excel file and you only want to clean just one worksheet?

    • @GoodlyChandeep
      @GoodlyChandeep  5 months ago

      If you're connecting to a single Excel file, you'll have to apply a filter before the Navigation step to restrict the sheet names to the ones you want.

  • @ankursharma6157
    @ankursharma6157 1 year ago +1

    Token of Gratitude!

  • @cdmkk2211
    @cdmkk2211 10 months ago

    How can I add the file name as a column in this?

  • @alokkumarsahu228
    @alokkumarsahu228 19 days ago

    I am working with a CSV file and the syntax is not working.

  • @ajay249
    @ajay249 1 year ago

    Goodly is just too Godly.