How To Automate Power BI Snapshots With Power Automate: A Step-by-Step Guide | NextGen BI Guru

  • Published: 2 Dec 2024

Comments • 28

  • @MKJainStatistics
    @MKJainStatistics 4 months ago

    Hi, excellent crisp presentation. I create Power BI reports but so far wasn't using Power Automate; one user's needs forced me to look into it. I just saw your video and replicated a very similar flow. It worked fine.
    There are a few challenges, if you can help: 1. It skips rows if any column value is blank; how can I still include those rows? 2. It transfers rows in batches of 100, and between batches it takes time; can we avoid that? 3. For just 2,400 rows it took 4 hours. Is that too long? 4. Where are the settings to change the row count limits?
    By the way, one place to check is when you copy the DAX query: in the query itself, 501 was written as the TOPN limit, which I have already altered. All help appreciated.
    Thanks.

    • @NextGenBIGuru
      @NextGenBIGuru 4 months ago

      Hello 👋🏼. Glad you found it useful.
      For 1 - yes, blank rows will be skipped. The workaround is to replace nulls with 0 (or any other value) in the query editor.
      For 2-4 - unfortunately Power Automate takes a long time to finish, especially as the number of rows grows. The workaround is to duplicate the flow several times (say 3 in your case), filter the query in each copy on one of the fields, and let them run simultaneously. Not perfect, but it worked for me in a similar situation. Hope this helps.
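      The per-flow filtering described in the reply above might look like this in each duplicated flow's DAX query (the 'Sales' table and Sales[Region] column are placeholder names for illustration):

      ```dax
      // Flow 1 of 3: query only the rows for one slice of the data.
      // The other duplicated flows filter on the remaining values
      // of the same field, so the three flows can run simultaneously.
      EVALUATE
      FILTER (
          'Sales',
          Sales[Region] IN { "North", "East" }
      )
      ```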

  • @qaistabib8079
    @qaistabib8079 11 months ago

    Excellent tutorial as usual! I was looking for an automated way to archive/store data without having a dedicated SQL server!

    • @NextGenBIGuru
      @NextGenBIGuru 11 months ago

      Thank you Qais. Glad you found it useful.

  • @gabrieltosi9719
    @gabrieltosi9719 10 months ago

    When I try the “Run a query against the dataset” action it returns “BadRequest”. What could I be doing wrong? Is it because I am using DirectQuery data?

    • @NextGenBIGuru
      @NextGenBIGuru 10 months ago

      Hi there. You are correct, this is the reason. This action requires listing the rows, which can be achieved with imported queries only.

  • @Viego4300
    @Viego4300 13 days ago

    This won't work with a live connection or DirectQuery mode, only import?
    Thank you

    • @NextGenBIGuru
      @NextGenBIGuru 11 days ago

      Hello 👋, it should work whether you are using import or DirectQuery

  • @divyanair6147
    @divyanair6147 6 months ago

    When running the query against a dataset, the rows with null columns are excluded from the results. Is there any way to include them?

    • @NextGenBIGuru
      @NextGenBIGuru 6 months ago

      Hello 👋🏼, null values are treated as no data. To overcome this, use the Replace Values option in the query editor. Just replace null with any value, like 0 or “-”
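      If you would rather handle the nulls in the DAX query itself instead of a query editor step, a COALESCE wrapper can substitute a default value (the 'Projects' table and its columns are placeholder names):

      ```dax
      // Substitute 0 for blank Cost values at query time so the rows
      // are not dropped when the flow iterates over the results.
      EVALUATE
      SELECTCOLUMNS (
          'Projects',
          "Project", Projects[Project],
          "Cost", COALESCE ( Projects[Cost], 0 )
      )
      ```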

  • @tamyauming8966
    @tamyauming8966 11 months ago

    Great tutorial~
    But is the snapshot data table limited to 501 rows?

    • @NextGenBIGuru
      @NextGenBIGuru 11 months ago

      Hello Tam. Thank you first of all.
      For the maximum number of rows of “Run a query against a dataset”, the documentation says either 100k rows or 1M values, whichever comes first.
      So you should be OK with 501 rows.

  • @hrishitathe1947
    @hrishitathe1947 9 months ago

    Can we store the data in an Excel spreadsheet?

    • @NextGenBIGuru
      @NextGenBIGuru 9 months ago

      Hello there, yes you can save to CSV format. In Power Automate, you first need to add a “Create CSV table” block, then a block that saves the output CSV to a file

    • @alvinmarkmiclat8680
      @alvinmarkmiclat8680 5 months ago

      @NextGenBIGuru Do you happen to have a video showing these steps?

    • @NextGenBIGuru
      @NextGenBIGuru 5 months ago

      Hello 👋🏼, apologies, I don’t have a specific video showing how to do it for an Excel sheet

  • @tulasiramb3864
    @tulasiramb3864 9 months ago

    Can we do snapshots without Power Automate?

    • @NextGenBIGuru
      @NextGenBIGuru 9 months ago

      Hello there. You can do them manually by creating table visuals and exporting them, if that’s what you mean. Power Automate will automate the process for you

  • @maheshakula6544
    @maheshakula6544 3 months ago

    Hi Hamzeh,
    I tried to follow the same steps as you explained in this video. I am using OneDrive for data storage as you used Google Sheets. When I try to add a row to the table using item('apply to each')['table[project_name]'], its colour is not turning to yellow, it stays purple. Could you please help in this regard. Thanks.

    • @NextGenBIGuru
      @NextGenBIGuru 3 months ago

      Hello 👋🏼. I assume you are appending to an Excel or CSV file. Based on your comment above, and assuming all the other steps are correct, your code should look like this: items('apply_to_each')['table[project_name]']. Make sure 'apply to each' is the exact name of the block you are using, use items (not item) for the function, and replace spaces with underscores.

    • @maheshakula6544
      @maheshakula6544 3 months ago

      @NextGenBIGuru Hi Hamzeh,
      I am able to take a snapshot of the data after making the changes you recommended, but it is appending only 501 rows and I have to append 21616 rows. Could you please let me know how I can do this.

    • @NextGenBIGuru
      @NextGenBIGuru 3 months ago

      Hello 👋🏼, your DAX query probably has a TOPN function. Inside this function, increase the number of rows to a big number.
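      The TOPN change described above might look like this (the 'Projects' table and order-by column are placeholder names; 25000 is just an example cap larger than the table):

      ```dax
      // Raise the row cap from the auto-generated 501 so the query
      // returns the full table instead of the first 501 rows.
      EVALUATE
      TOPN (
          25000,
          'Projects',
          Projects[Project], ASC
      )
      ```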

    • @maheshakula6544
      @maheshakula6544 2 months ago

      @NextGenBIGuru The TOPN function has a maximum N_value of 5000. Could you please let me know how I can insert more than 5000 rows of a table.

    • @JulesRulesYen
      @JulesRulesYen 2 months ago

      @NextGenBIGuru Hi - I get “invalid expression” when using this code: items('apply_to_each')['table[project_name]']. I'm using an xlsx on SharePoint. Any ideas? Is the code slightly different for Excel vs. Google Sheets in some way? Thank you!

  • @Jameson1307
    @Jameson1307 1 month ago

    Any idea on how to get rid of the 5k row limit? My snapshot has over 5k rows and it will break:
    "Your 'Apply to each' action is encountering an error because it is trying to process more items than the maximum limit of 5000, with 5903 items being passed. This issue arises from the query results returning too many rows. To resolve this, could you consider filtering the query results to reduce the number of items returned, or perhaps implementing pagination to handle the excess items?"

    • @NextGenBIGuru
      @NextGenBIGuru 1 month ago

      Hello 👋🏼, you can do something like pagination by creating 2 workflows instead of 1, and filter your query on a field you introduce to your table (a calculated column) with values like “Page 1”, “Page 2”, …
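      The calculated column described above could be sketched like this (the 'Projects' table and Projects[ProjectID] column are placeholder names, and 5000 matches the Apply-to-each limit from the error message):

      ```dax
      // Assign each row to a page of at most 5000 rows, based on a dense
      // rank over a unique ID column. Each duplicated workflow then
      // filters its query with Projects[Page] = "Page 1", "Page 2", etc.
      Page =
      "Page "
          & ( INT ( ( RANKX ( 'Projects', Projects[ProjectID],, ASC, Dense ) - 1 ) / 5000 ) + 1 )
      ```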