Table.Split and Table.SplitAt - Power Query

  • Published: 1 Jan 2025

Comments •

  • @asimonr
    @asimonr 1 year ago

    Thanks Alejandra, it couldn't be clearer. Excellent work!!!!

  • @franciscom.paredesarias2356
    @franciscom.paredesarias2356 1 year ago

    Many thanks @Alejandra Horvath, always something new to learn from your posts. Greetings from Chile

  • @_GB84
    @_GB84 11 months ago

    You're the best! Thank you for helping me solve this problem!

    • @alejandrahorvath
      @alejandrahorvath  11 months ago

      Glad you found it useful Gary! Thanks for watching 😊

  • @kestaanalytics
    @kestaanalytics 7 months ago

    Thank you. I got the answer I needed in the first 3 minutes of watching your video.

  • @OdelLeal
    @OdelLeal 1 year ago

    Alejandra, excellent results.

  • @roywilson9580
    @roywilson9580 1 year ago +1

    It is great to see you back posting videos!
    Thanks for the interesting and useful video. Nice use of Table.SplitAt. I would make a small change to the steps to avoid hardcoding the split point of the tables by inserting a few steps: adding an index column, filtering down to a known value found in the row to split at, and drilling down to the index number:
    ......
    AddIndex = Table.AddIndexColumn(Table001, "Index", 1),
    DrillDown = Table.SelectRows(AddIndex, each [Column13] = null)[Index]{0},
    PrdcDataSet = List.Transform(Table.SplitAt(AddIndex, DrillDown), each
        let
            a = Table.RemoveColumns(_, "Index"),
            b = Table.PromoteHeaders(a),
            c = Table.SelectRows(b, each [#"Product #"] <> null),
            d = Table.UnpivotOtherColumns(c, {"Product #"}, "Year - Qtr", "Amount"),
            e = Table.SplitColumn(d, "Year - Qtr", Splitter.SplitTextByAnyDelimiter({" - "}), {"Year", "Qtr"})
        in
            e)
    .........
    Not overly important on such a small dataset, but for a large dataset it might save a lot of scrolling around.
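
    The dynamic-split idea in the comment above can be sketched end to end on a tiny made-up table (the sample table and column names here are invented for illustration; `Table.SplitAt(table, n)` returns a list of two tables, the first containing the first n rows):

    ```m
    let
        // Invented sample: two stacked blocks separated by a row of nulls
        Source = #table({"Column1", "Column2"},
            {{"A", 1}, {"B", 2}, {null, null}, {"X", 3}, {"Y", 4}}),
        // 1-based index so the separator row's position can be found, not hardcoded
        AddIndex = Table.AddIndexColumn(Source, "Index", 1),
        // Position of the first all-null separator row
        SplitPoint = Table.SelectRows(AddIndex, each [Column2] = null)[Index]{0},
        // First table: rows before the separator; second: separator row onward
        Parts = Table.SplitAt(AddIndex, SplitPoint - 1),
        // Drop the helper column from both parts
        Clean = List.Transform(Parts, each Table.RemoveColumns(_, "Index"))
    in
        Clean
    ```

    Pasting this into a blank query in the Advanced Editor should yield a list of two tables; the second still contains the separator row, which can then be filtered out as in the comment above.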

    • @alejandrahorvath
      @alejandrahorvath  1 year ago +1

      It's great to be back 😊
      Thank you very much for sharing!
      Very appreciated 🙏
      Thanks for watching 😊

  • @tobesoria
    @tobesoria 1 year ago

    Excellent mini training💪🤗. Thanks
    On another note, we really miss you on the "live" streams

  • @antonysovy5887
    @antonysovy5887 1 year ago +1

    as you always say, excellent!!

  • @eduardomunoz2764
    @eduardomunoz2764 1 year ago

    Masterful as always, greetings dear teacher Ale.

  • @zahoorsarbandi2982
    @zahoorsarbandi2982 1 year ago

    Very interesting and full of knowledge. Thanks

  • @tb9359
    @tb9359 1 year ago

    Great job! I enjoy your videos and learn a huge amount of tips. Thank you so much!!

  • @Ratnakumarwrites
    @Ratnakumarwrites 1 year ago

    Great, ma'am... I am learning from you... Love your videos

  • @SamehRSameh
    @SamehRSameh 1 year ago

    Excellent 👌👍👍

  • @asheeshpahwa7508
    @asheeshpahwa7508 1 year ago

    Thanks 😊
    Keep bringing these videos.
    I would request you to please make videos on the Excel BI challenges that come up every day on LinkedIn
    Thanks in advance

  • @etienneleparisien1391
    @etienneleparisien1391 9 months ago

    Hello Alejandra, thanks very much for this very interesting video.
    Do you have a video explaining how, in Power Query, to split a table into N tables according to the N distinct values in a column?
    Thanks very much in advance for your help.

    • @alejandrahorvath
      @alejandrahorvath  9 months ago +1

      To split a table into N tables according to the N distinct values in a column, you could select the column containing the values you want to split by, right-click and select Group By, then choose All Rows.
      The code would be something like this, if the table is in the current workbook:
      let
          Source = Excel.CurrentWorkbook(){[Name="TableName"]}[Content],
          GroupedRows = Table.Group(Source, {"ColumnNameYouWantToSplitBy"}, {{"NewColumnName", each _}})
      in
          GroupedRows
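
      As a self-contained sketch of the Table.Group approach above (the sample table, column names, and group value are made up for illustration), each distinct value ends up with its own nested table that can then be drilled into:

      ```m
      let
          // Invented sample table with a grouping column
          Source = #table({"Region", "Amount"}, {{"North", 10}, {"South", 20}, {"North", 30}}),
          // One row per distinct Region; "Data" holds that region's full sub-table
          Grouped = Table.Group(Source, {"Region"}, {{"Data", each _}}),
          // Drill into one sub-table by its group value
          North = Grouped{[Region = "North"]}[Data]
      in
          North
      ```

      Here `North` would be a two-row table holding only the "North" rows; each nested table in the "Data" column can be drilled into the same way.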

    • @etienneleparisien1391
      @etienneleparisien1391 9 months ago

      @@alejandrahorvath Thank you so much. I will try it at work. Jó estét és jó vasárnap (Hungarian for "good evening and good Sunday").

    • @alejandrahorvath
      @alejandrahorvath  9 months ago +1

      You're very welcome! Hope that this will solve what you were looking for. Have a great rest of the evening and a good Sunday as well 😊

    • @etienneleparisien1391
      @etienneleparisien1391 9 months ago +1

      @@alejandrahorvath Thanks to you. I could do it in VBA and SQL, but I think the solution is smarter and faster to implement in Power Query.
      After that, I just need to dynamically change/replace the file source in the query formula via VBA.

  • @boissierepascal5755
    @boissierepascal5755 1 year ago

    Awesome!

  • @DinoAMAntunes
    @DinoAMAntunes 1 year ago

    Hi Alejandra. Very good. When should I choose between Table or Page? I always have this doubt. Thanks in advance

    • @alejandrahorvath
      @alejandrahorvath  1 year ago +1

      Hi Dino,
      This is a very good question.
      In my experience the tables have worked better (for the most part); it all depends on the source.
      You will need to evaluate the source and decide which is better.
      Thanks for your question 😊

    • @DinoAMAntunes
      @DinoAMAntunes 1 year ago

      @@alejandrahorvath Many thanks

    • @alejandrahorvath
      @alejandrahorvath  1 year ago

      You're most welcome 😊

  • @osielg9889
    @osielg9889 1 year ago

    Hello, excellent explanation. One question: do you think it is more efficient with a lot of data? The result you produced has only a few steps in the editor; if I did it using the normal options from the user interface, there would surely be many more. Thank you very much, I hope you are doing great 😁, sending you a big hug 🤗

    • @alejandrahorvath
      @alejandrahorvath  1 year ago

      Hi, it all depends on the data source, the functions you use, and the order in which you apply the steps.
      Fewer steps doesn't mean more efficiency; what matters is what you do, and the order in which you do it, in each step (or in each function).
      Thanks for your comment. Regards 😊