Comments

  • @gregoryoliveira8358
    @gregoryoliveira8358 1 year ago +1

    Thanks for sharing this with us. I was thinking that the limit was the number of rows per page, but now I completely understand this feature. Thanks again!

  • @thnkngwlnts
    @thnkngwlnts 10 months ago

    This is exactly the solution I need, as I looked all over YouTube and couldn't find the right video that highlights all the components you included in this video. Thanks for doing that. My only thing is that I am trying to get the data from SharePoint (over 100k rows), not Dataverse, and then pipe the data to Cosmos. I get that the concept would be the same, just with a different data source. Thanks much.

  • @macward7495
    @macward7495 3 years ago +1

    Thank you Steve!! Pagination and Concurrency are saving me so much time :) Wonderful video.

    • @SteveWinward
      @SteveWinward 3 years ago

      Excellent. Glad you enjoyed it!

  • @jt9228
    @jt9228 4 months ago

    Just stumbled across this and subscribed to your channel. Great video on the topic; I have been searching for how to handle large datasets in Automate and this will do it for me! Thanks!!!

  • @johnfromireland7551
    @johnfromireland7551 3 years ago

    Very clear video. More please!

  • @kimsalas8197
    @kimsalas8197 9 months ago +5

    At timestamp 10:05 you mention you will show how you are parsing out to get the skip token value, but you don’t show it. For the action ‘update skip token variable’, what expression are you using for the value field? I’ve tried a few things, but I don’t get correct results.
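
    One possible expression for the value field of the 'update skip token variable' action, assuming the Dataverse list action is named 'List rows' and that $skiptoken is the last query parameter of the @odata.nextLink URL (both assumptions are illustrative, not confirmed by the video):

        last(split(outputs('List_rows')?['body/@odata.nextLink'], '$skiptoken='))

    If the token comes back URL-encoded, it may need to be wrapped in decodeUriComponent(...).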

  • @Alex-sq7ck
    @Alex-sq7ck 2 years ago +1

    Best video I've come across

    • @SteveWinward
      @SteveWinward 2 years ago

      Thank you. Glad you enjoyed it!

  • @hafidazer1634
    @hafidazer1634 3 years ago +1

    Great work! Thanks

    • @SteveWinward
      @SteveWinward 3 years ago

      Thank you, glad you liked it!

  • @linamariasilva2141
    @linamariasilva2141 15 days ago

    Huge thanks, this information was extremely useful to me. thx

  • @srik9814
    @srik9814 2 years ago

    Nice information

  • @seang2012
    @seang2012 1 year ago +2

    Would you create a similar video showing how to use Pagination with Flow pulling from an API/Web source?

  • @user-is7xl1bu4t
    @user-is7xl1bu4t 1 month ago

    Hello Steve, I am in the process of attempting to use Excel to create / update records in a SharePoint list. I currently have my flow functioning the way I want. However, I am stuck at that 5000 limit. My data will be upwards of 200 times that. I know this will move slowly but will work for what I need.
    How do I implement one of these methods into it? I understand that I will have to turn my flow into a solution. I'm just not sure what that will look like with my needs.
    Currently I am using a Scheduled Flow ---> Get files (properties only) ---> Apply to each -> Update file properties -> List rows present in a table -> Apply to each -> Get items -> Conditions -> Delay -> Update file properties
    The video I watched to do this is
    Reza Dorrani - Add & Update Excel Data to SharePoint List using Power Automate | Excel Import using flow
    My flow will be almost identical to his. The only difference is I did not need to do all the extra steps to select choices. My data is very clean and singular text.

  • @shashankk1682
    @shashankk1682 1 year ago

    Hi Steve. Great stuff. I wanted to know how this would work with SharePoint.
    I have a flow that creates/updates SharePoint items sourced from an Excel file on the SharePoint site. The data of course will be going beyond 5000 rows in Excel, and evidently at some point I have to worry about paging. Could you please create a video or help so that creating/updating SharePoint list items for newly added rows/existing rows (no matter the number of rows to be listed in Automate) is possible? P.S. an O365 user.

  • @vaibhavsingh594
    @vaibhavsingh594 2 years ago +2

    Hi Steve
    Very well explained.
    One question:
    In the do until loop, what value did you update the skip token variable with?
    Thanks in advance!

    • @SteveWinward
      @SteveWinward 2 years ago +1

      Sorry I'm not sure I fully understand your question. I parse the OData response to get the next page URL. If that value is not empty, meaning there is a URL present, that means you can request the next "batch" of data. Hopefully that helps answer your question.

    • @vaibhavsingh594
      @vaibhavsingh594 2 years ago

      @@SteveWinward Thanks Steve👍
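
      A minimal skeleton of the loop Steve describes above, as it might appear in the flow's code view; the 'List rows' body is omitted and all action and variable names are illustrative:

          "Do_until": {
              "type": "Until",
              "expression": "@empty(variables('NextLink'))",
              "limit": { "count": 100, "timeout": "PT1H" },
              "actions": {
                  "List_rows": {},
                  "Set_NextLink": {
                      "type": "SetVariable",
                      "inputs": {
                          "name": "NextLink",
                          "value": "@{coalesce(outputs('List_rows')?['body/@odata.nextLink'], '')}"
                      },
                      "runAfter": { "List_rows": [ "Succeeded" ] }
                  }
              }
          }

      Each pass requests the next batch using the stored link; the loop exits once a response no longer carries @odata.nextLink, because coalesce(...) then stores an empty string and empty(...) becomes true.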

  • @iainpritchard803
    @iainpritchard803 2 years ago +2

    Brilliant...just what I need! Thank you...😊
    ...I can't find the link to the documentation / example. Has that gone? Could you help me get to it somehow? Desperate to use it...😊

    • @SteveWinward
      @SteveWinward 2 years ago +2

      Here's the link on GitHub,
      github.com/SteveWinward/PowerApps/blob/master/WriteUps/PowerAutomatePagination.md

  • @dynamics365series2
    @dynamics365series2 1 year ago

    Can you provide the expression used in the initialize variable's value?

  • @cmichael7807
    @cmichael7807 8 months ago

    Concurrency actually defaults to 20, not 1, even with concurrency control off. To run truly sequentially (one at a time), you need to turn concurrency control ON and slide the slider down to 1.
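
    For reference, a sketch of what that sequential setting produces in the Apply to each action's code view once concurrency control is ON and the slider is at 1 (the rest of the action is omitted):

        "runtimeConfiguration": {
            "concurrency": {
                "repetitions": 1
            }
        }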

  • @user-vw7yb1sk5l
    @user-vw7yb1sk5l 2 months ago

    Should elaborate on skip token more

  • @madihasalman1
    @madihasalman1 1 year ago

    Hi
    Good video. But how can we do this if we are using the SQL connector to extract more than 5000 records?

  • @Brunos2Costa
    @Brunos2Costa 1 year ago

    Can I use pagination to get pages of 100 records each? And if so, how do I configure that?

  • @ThisIsFrederic
    @ThisIsFrederic 1 year ago

    Hi Steve,
    Thanks for your video!
    While your flow works on my data, there's still something I am not totally clear about: when discussing the "Do until" loop at 10m02s, you said "as long as the next URL property is not null", then we loop. Which makes sense for a "Do while" loop but not in this case since it's a "Do until". At least I would run the loop UNTIL the next link variable becomes null, meaning that the NextLink property does not exist anymore. Why does it work in that case?

    • @coolbeing163
      @coolbeing163 1 year ago

      Probably, if the NextLink property is not found, it evaluates to null
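
      Put differently: a 'Do until' runs its body first and then repeats until the condition becomes true, so the condition is the exit test, not the continue test. A sketch, assuming the next-page link is stored in a variable named NextLink (an illustrative name):

          "expression": "@empty(variables('NextLink'))"

      The loop keeps running as long as NextLink is non-empty ("as long as the next URL property is not null") and stops on the first check where it is empty; if the variable is set with coalesce(..., ''), a missing NextLink property simply becomes an empty string.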

  • @robertmerritt5245
    @robertmerritt5245 3 years ago +1

    Hi! This is an awesome video! Thanks so much! Does this only work with DataVerse data, or can I use it with SharePoint data?

    • @SteveWinward
      @SteveWinward 3 years ago

      It will work against any API that uses OData. SharePoint APIs support OData so it should work there too.

    • @robertmerritt5245
      @robertmerritt5245 3 years ago +1

      @@SteveWinward Awesome! Thanks again!

  • @aditya10039
    @aditya10039 2 years ago +2

    Hey Steve! I want to export data from Power BI to Excel using Power Automate, but it only updates 1000 rows per run. Can this method also be applied when the source file is from Power BI?

    • @SteveWinward
      @SteveWinward 2 years ago

      Great question. If the Power BI API lets you do paging then yes, you can use this same technique. I'm not aware off the top of my head whether the Power BI export API supports this, but the pattern in Power Automate supports it.

  • @brajgopal108
    @brajgopal108 2 years ago

    What is the length expression?

  • @babajidebantale
    @babajidebantale 3 years ago

    Hi, thanks for this content; however, I experienced some issues.
    1. Getting all data from the contact entity gave this error: "HTTP request failed as there is an error: 'Cannot write more bytes to the buffer than the configured maximum buffer size: 104857600. "
    2. I retrieved only some fields in the select columns, and the error in No. 1 above was resolved, but "NextLink" was empty.
    3. I also used fetchXML (which has a limit of 5k rows) and NextLink was still empty.
    Any help is appreciated. I have a 12k contact DB to work on.

    • @SteveWinward
      @SteveWinward 3 years ago

      If you are getting the maximum buffer size error, you probably need to create a flow like I do here (ruclips.net/video/5NtzcfmSGes/видео.html) which is the custom pagination example. If you go to the settings of the action and enable pagination, that step still has a max limit that you are running into. If you loop over the records 5,000 records at a time, I don't think you will hit the max buffer size error. I did not test my example with a fetchXML query specified.

  • @yppatel4223
    @yppatel4223 2 years ago +1

    Instead of Dataverse, will it work for a SharePoint list? Also, will it work with the seeded version of Power Automate?

    • @SteveWinward
      @SteveWinward 2 years ago

      I believe this same pattern will work with SharePoint. I haven't tested it myself, but my guess is it would. Also, if you are doing this against SharePoint only, you can use the seeded version of Power Automate.

  • @mohamedaffi3816
    @mohamedaffi3816 2 years ago

    What about SharePoint lists?

  • @krishnabairi1581
    @krishnabairi1581 3 years ago +1

    Thank you Steve. Will this pagination apply to Excel as a data source?

    • @SteveWinward
      @SteveWinward 3 years ago +1

      I believe it should work with the Excel connector. I have not tried that one myself, but I imagine the API has the same pagination capabilities.

    • @SteveWinward
      @SteveWinward 2 years ago

      @Ahmad Syakir Shahruddin For the Excel for Business connector. On the action, click on the 3 dots > Settings. Enable pagination and set the threshold to higher than 5,000. This will automatically do the paging for you and return a larger dataset size.
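
      In the action's code view, enabling that setting adds a pagination policy; a sketch with an assumed threshold of 100,000:

          "runtimeConfiguration": {
              "paginationPolicy": {
                  "minimumItemCount": 100000
              }
          }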

  • @ksowjanya4488
    @ksowjanya4488 1 year ago

    I want to import more than 1 lakh (100,000) records to Dataverse from a CSV file in SharePoint using a dataflow instead of Power Automate. Can you guide me on possible ways please?

  • @yilu3219
    @yilu3219 2 years ago

    Hi, but anything more than 5000 is for premium users, right? I have the basic plan, which has a 5000 pagination limit.

  • @tarunverma6352
    @tarunverma6352 2 years ago +1

    How do we import data from SharePoint or OneDrive into Dataverse? Or can we directly upload an Excel file from a canvas app, read the data, and insert it into Dataverse as a table? Please make a video on this.

    • @SteveWinward
      @SteveWinward 2 years ago

      Dataflows are how you can do what you want to do here.
      docs.microsoft.com/en-us/power-query/dataflows/overview-dataflows-across-power-platform-dynamics-365

  • @klemetsrudj
    @klemetsrudj 2 years ago

    I'm outputting 25 fields from an entity, using fetchXML, parsing the paging cookie and doing it very similarly... however, I'm trying to write these 25 fields over 10k rows (limit of 5k per run)... it's been running for 24 minutes already (ugh, so long). It seems like there should be a much, much faster way to get 10k rows into Excel. I.e., one should not even have to use Apply to each; instead, just get the rows and add them all to a table in one shot. Let me know if you have any ideas on how to do that. I guess I could maybe append to a CSV table over and over, and then write all the contents in one push???

    • @SteveWinward
      @SteveWinward 2 years ago

      Another idea is to use Azure Synapse and setup a data pipeline to convert Dataverse records to a csv file.

    • @klemetsrudj
      @klemetsrudj 2 years ago +3

      @@SteveWinward Actually, just using an array variable to loop over and append to, then putting that into a Create CSV table action, then a Create file action for SharePoint with its output as input took about 9 minutes. Much more acceptable for now.

    • @SteveWinward
      @SteveWinward 2 years ago +1

      @@klemetsrudj excellent. Thanks for sharing your solution.
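
      A sketch of the pattern described above: inside the paging loop, append each row to an array variable, then produce the CSV once at the end (action and variable names are illustrative, not from the commenter's flow):

          "Append_row": {
              "type": "AppendToArrayVariable",
              "inputs": {
                  "name": "Rows",
                  "value": "@items('Apply_to_each')"
              }
          },
          "Create_CSV_table": {
              "type": "Table",
              "inputs": {
                  "from": "@variables('Rows')",
                  "format": "CSV"
              }
          }

      The output of Create_CSV_table then feeds a single SharePoint 'Create file' action, replacing thousands of per-row Excel writes with one file write.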

  • @waltertiggeloven
    @waltertiggeloven 3 years ago

    No matter what I try, import keeps failing with a message saying "something went wrong try again later". I also tried removing the "Workflows" folder and created a new ZIP with the 3 JSONs and without that subfolder.
    But nothing works. Import of the zip is impossible. 🤷‍♂️ Any idea?

    • @SteveWinward
      @SteveWinward 3 years ago +1

      Are you following these steps to import a solution file?
      docs.microsoft.com/en-us/powerapps/maker/data-platform/import-update-export-solutions

    • @SteveWinward
      @SteveWinward 3 years ago +1

      I added some more detailed steps to get the sample installed
      github.com/SteveWinward/PowerApps/blob/master/WriteUps/PowerAutomatePagination.md

    • @waltertiggeloven
      @waltertiggeloven 3 years ago

      @@SteveWinward Well, many thanks for your quick response and additional info. I did not follow the correct procedure. However, even when I do, the import is not possible.
      Anyway, don't bother. Your video was helpful enough to get me on the right track. I had been struggling (for almost a week now) to get pagination solved for MS Planner (through MS Graph). There were several obstacles I had to overcome, and partly because of the limited information in the event of an error, it was very difficult to discover what and where exactly went wrong.
      After getting the permissions right (Admin Consent on Microsoft Graph ApiConnectors), pagination (handling NextLink) remained. I now have a working model. Many thanks for your video and support. You made one of the most complete and intelligible explanations.

    • @SteveWinward
      @SteveWinward 3 years ago

      @@waltertiggeloven I'm glad you found this video helpful!

  • @travissmithoh
    @travissmithoh 2 years ago +1

    Will this work for over 100,000 records?

    • @SteveWinward
      @SteveWinward 2 years ago +1

      Hey Travis. I haven't tested this with 100,000 records. My intent was to show how to use paging if you have datasets just over the 5,000 or 10,000 mark that would get truncated by default. If you have on the order of 100,000 items that you need to regularly work with, you may want to find another approach outside of Power Automate. Dataflows might be a better option, or even something in Azure to process the records.

    • @travissmithoh
      @travissmithoh 2 years ago

      @@SteveWinward Thanks! I will look into Power BI Dataflows.

  • @richardma2655
    @richardma2655 2 years ago

    How to deal with 15,000 records? Thanks

    • @PodOConaill
      @PodOConaill 4 months ago

      Did you find a solution to this?
      I have to do this for 2.5 million records

  • @9028129188
    @9028129188 1 month ago

    Hi Steve, thanks for sharing the useful information. I'd like to ask: under an Apply to each action I have an Export to File for Paginated Reports action, and it is taking quite some time; I pass a few parameter values from Dataverse and then call a Send email action. Kindly suggest how to optimize the flow's performance.