I am so happy this exists. I was doing this the long and wrong way around for nearly 25,000 items. What was months of data entry has been reduced to minutes.
Thanks for this video! I was struggling with rate limitations before seeing this.
Thank you so much for this valuable video!! You saved me hours of slaving away updating SharePoint lists.
Extremely valuable video. AI-powered editing can't be used for this flow yet, since "it was created using an older format".
I used your upsert flow. It was nice, and I got my expected result. Can we delete the items from the SP list which are not present in Excel? Please give some guidance.
Sure, you likely want to use the SharePoint Batch Full Sync download, where a section of that template pulls in all data from Excel & all data from SP before filtering to only the items still in SP but not in Excel.
I am presently attempting your flow but amending it to use a SQL datasource. Let me know of any pitfalls I should watch out for with this amendment; otherwise I'll report back on my progress!! (fingers crossed)
I’ve also personally used this to bring SQL data to SP, let me know if you get any questions
Inside this upsert flow, how can we add values to lookup columns?
Check this post & the one following it.
powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-Create-and-Upsert-SharePoin...
For lookup columns, I think you can only update which lookup item is referenced.
And you set it up similar to the Person column updating the LookupColumnNameId value.
Is it possible to batch update dataverse? Do you have any guides or videos on that?
I haven’t personally worked with batch actions for Dataverse at this point.
I think most would just use Dataflows which run larger data transform & loading jobs for Dataverse
ruclips.net/video/8IvHxRnwJ7Q/видео.htmlsi=l3J1M36c14ts8cIT
Alternatively Paul at TachyTelic has done a post on batch deleting Dataverse records
www.tachytelic.net/2021/11/power-automate-bulk-delete-dataverse/
This is exactly what I was looking for, and I have deployed the automation. I was hoping you could tell me how to fix one thing: I keep having to select the file and table in the List Rows Present in a Table section. I believe this is happening because each day I get a new data download via email as an attachment. I have another flow that captures the attachment and uploads it to a designated folder in OneDrive, where another flow renames the file to the exact same name and table. I have tried modifying the flow to add the create table function at the top, but I still can't get it to work. I always receive the error (Unable to process template language expressions for action 'Do_until_Upsert' at line '1' and column '593': 'The template language function 'length' expects its parameter to be an array or a string. The provided value is of type 'Null'.) Any suggestions? Thank you for the great videos; your processes are really helping me look at things in different ways.
Did you rename the List rows present in table action or something? It is likely getting a null value because the reference to that action in the length( ) expression isn't returning anything at all. If there were no values it would return 0 and be fine, so something else must be wrong: a null indicates the reference isn't returning anything whatsoever.
If you can please post screenshots of your flow to the download page thread: powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-Create-and-Upsert-SharePoint-Lists/td-p/1365410
or email me at takolota@gmail.com
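For anyone hitting the same message, here is what that error describes, sketched as a toy Python function (Python for illustration only, not Power Automate): length() is happy with an empty array, but fails on null, which is what a broken or renamed action reference returns.

```python
# Toy illustration of the flow error above: length() returns 0 for an
# empty array, but raises when the referenced output is null, e.g.
# because the action it points at was renamed and no longer resolves.

def length(value):
    """Mimics the template language length() function's type check."""
    if not isinstance(value, (list, str)):
        raise TypeError(
            "The template language function 'length' expects its "
            "parameter to be an array or a string."
        )
    return len(value)

length([])  # 0: an empty table is fine, the Do until loop just stops
```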
Can the trigger be changed to Scheduled?
Sure, the flow can use any trigger
Hey, great video, but what if I don't have a primary key?
@@yashgandhi6128 Then you can only create items, not update them
tachytelic.net/2021/06/power-automate-flow-batch-create-sharepoint-list-items/
Hi Tyler,
This is great! Do you know if this method would work with files and folders? Specifically copying them from one site collection library to another site collection library? Thank you!
This is specifically for SharePoint lists. I have yet to see any way to use this or a different api for SharePoint document library batch calls.
This is amazing, thank you so much for sharing this valuable content! :) I was really struggling with this for a long time. I was using the standard method that you have described, and it is so annoying how slow that is. I haven't measured it yet, but I think this will probably be 10 times faster.
However, I do have an issue with this method as well. Be aware that my use case is probably not that common. The table that I am testing has around 2000 rows, and it can go up to 10-15k, which is not too much. But the problem is that it has 92 columns, and when I tried your flow with all columns, I got this error:
The maximum number of bytes allowed to be read from the stream has been exceeded. After the last read operation, a total of 1063840 bytes has been read from the stream; however, a maximum of 1048576 bytes is allowed.
Do you have any idea how I could modify the flow to overcome this problem? I was thinking about splitting it into 3 parallel branches with around 30 columns each, but I believe there is a better way.
P.S. I tested with 36 columns and it worked perfectly well (59 seconds for 1407 rows). I think it would take a minimum of 30 minutes with the traditional method, which is unbelievable! Great job!
If the issue is you have too much data loading from the Excel table at one time, then the easiest adjustment would be to reduce the Top Count & the Excel batch count in the right side of the Do until condition to like 1000 & 995 respectively. You'll also need to change the 100000 in the Skip Count expression to 1000.
That way it would load & run with a much smaller set of data each time the Do until loop runs.
Then to adjust things so the batch loads better fit that set-up, you’d probably want to reduce the settings batch size to half the Excel Top Count, so 500 in this case.
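To make the arithmetic above concrete, here is a toy Python sketch (illustrative only, not the actual flow actions) of how a 1,407-row table pages through the Do until loop with a Top Count of 1000, and how each page then splits into batches of 500:

```python
# Illustrative sketch of the Do until paging described above: load the
# Excel table in pages of Top Count rows (advancing the Skip Count each
# loop), then split each loaded page into smaller batch-call payloads.

def pages(rows, top):
    """Yield successive pages of up to `top` rows, like Skip Count paging."""
    for skip in range(0, len(rows), top):
        yield rows[skip:skip + top]

def batches(page, batch_size):
    """Split one loaded page into batch-request payloads."""
    for i in range(0, len(page), batch_size):
        yield page[i:i + batch_size]

rows = list(range(1407))  # e.g. a 1,407-row Excel table
page_sizes = [len(p) for p in pages(rows, 1000)]
# page_sizes == [1000, 407]: the first page becomes two batches of 500,
# the second page becomes one batch of 407
```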
@@tylerkolota Thank you again! It works perfectly now, but I had to set the batch size in the settings to 250. It is still very fast: 1400 rows x 96 columns uploaded in just 1:04 min. To be honest, this is an unbelievable result to me, because with the traditional method this would surely take more than 45 minutes. :)
Hello, could anyone please give me hints on how to modify this to update a Dataverse table from an Excel source?
Hello,
I do have a post on Dataverse batch actions here: community.powerplatform.com/galleries/gallery-posts/?postid=1ee689f5-81e0-4fb0-9155-a4387fc1598c
Alternatively you may also be able to use Dataflows to connect Excel data to Dataverse: learn.microsoft.com/en-us/power-query/dataflows/sync-excel-cds-dataflow
Hello @tyler, kudos for the work. Can you please share the workflow zip file? I am not able to download it from the link mentioned in the description for the create and update flow.
I'm able to download the solution file from the link just fine on my laptop: drive.google.com/file/d/1UG3mj9Y-FCZ0ncCfUs3ucGZDdoiZJtgp/view?usp=sharing
Does this also work for deleting items?
If you only need to batch delete, then there is this blog post: www.tachytelic.net/2021/06/power-automate-delete-sharepoint-items/
If you need to sync deletions across datasources, then there is a “Full Sync” template provided on the download page: powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-Create-and-Upsert-SharePoint-Lists/td-p/1365410
Thank you very much @@tylerkolota
Hi, I have tried the same flow; there is no error in the flow, but somehow not all of the records are getting created. In the response I could see a 500 operation timed out error for a few records. Is there any way to overcome this error?
I’m not sure what the 500 error may or may not refer to.
Please go to the community thread if you want to share screenshots of your data & flow set up in actions like the GenerateSPData: powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-Create-and-Upsert-SharePoint-Lists/td-p/1365410
Mine does it too. It copied about 9,000 records, and then just failed.
I wonder if something is getting throttled on the SharePoint side after so many requests.
You could try turning off the concurrency on the update & create loops & see if that resolves it.
Facing same issue. Did you find a solution?
@@nopulo Try turning concurrency off on the loops & reducing the batch size.
How can we update a lookup column with batch?
In short, yes, but it is more complicated to set up than other columns.
Post with a fuller explanation here: powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-Create-and-Upsert-SharePoint-Lists/m-p/2377703/highlight/true#M2114
For anyone wondering, we can. If the name of the column is Lookup, for example, this is how we update it:
{"LookupId": item()?['ID']}
We just have to add Id at the end of the column name, and the value is the ID of the row.
@@abyal3kod201 Kind of.
Yes you need to add Id to the end of the column name. And if you already have the Lookup column Ids for the values you want in your updated data, then yes you could use that directly for the input value. Or if you have a single lookup record you want to use for all your main list records. Because it needs the Id value of the lookup record for the LookupId value in the GenerateSPData action.
But the real challenge is when you don’t have the lookup record Id, & you just have some other value from the lookup record. For instance if you had a contacts lookup list & your updated data only had the emails for each lookup record, but not the Id for each lookup record. Then you need to do something similar to what I did for the Person columns in V2.7 to create a reference-able JSON object indexed by that email key. That way you can do a type of VLookup to then get the correct record Id value for that given email address.
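A rough Python sketch of that indexed-object VLookup (the contacts list, column names, and email values here are all invented for illustration):

```python
# Hypothetical sketch of the "reference-able JSON object indexed by that
# email key" described above: index the contacts lookup list by email,
# then resolve each source row's email to the lookup record Id so it can
# be used as the LookupId value in the batch payload.

contacts = [
    {"ID": 4, "Email": "ana@example.com"},
    {"ID": 9, "Email": "ben@example.com"},
]
# Reference-able object indexed by the email key, like a VLookup table
by_email = {c["Email"]: c["ID"] for c in contacts}

source_rows = [{"Title": "Row 1", "Email": "ben@example.com"}]
payload = [
    {"Title": r["Title"], "LookupId": by_email[r["Email"]]}
    for r in source_rows
]
# payload == [{"Title": "Row 1", "LookupId": 9}]
```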
Hey @Tyler,
This means a lot; it helped me so much.
What if I want to just upload items from Excel to SharePoint?
I don't want to update or check if items already exist in SharePoint; I just want to create.
How can I modify this flow?
Thank you again. :)
Hello,
If you just want to batch create SharePoint items, then Paulie's original blog on Batch Create may be helpful: tachytelic.net/2021/06/power-automate-flow-batch-create-sharepoint-list-items/
Thanks for this video. I want to batch create list items, but the datasource here is a Power BI dataset. I used "Run a query against a dataset" and "Parse JSON", and then the subsequent actions for batch create are not working for some reason. Can you help with this issue?
What error(s) are you getting? How is it not working?
How can I use parallel branching for the same scenario to create data in different lists from different sheets? Please help me with this.
I created a parallel branch, but I am not able to create data in the list; the second branch keeps failing.
Please post screenshots of your set-up & error to powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-Create-and-Upsert-SharePoint-Lists/td-p/1365410
Or email them to me at takolota@gmail.com
This was really helpful, thank you! I need to update my list only where the ID from the list matches the ID from the Excel file. Could you or anyone advise how I can do this?
Hello,
Were you not able to specify your id / primary key column names in the template flow & have it batch update based on the matches?
@@tylerkolota Thanks for replying! I've managed to sort that issue out now. Basically, because it was an UPSERT and I just wanted it to be an UPDATE, I removed the create part. The latest challenge I've got is that I need to update based on the ID and also where another column in the SharePoint list equals, let's say, 'x'. This value isn't a lookup into the Excel, just a filter on the SharePoint list.
@@Rob46 I would think you could go to the "Do until Get destination list IDs + keys" loop, add the relevant SP column to the "SharePoint HTTP Get items" select query, then add a Filter array action between the "Set variable LastID" and "Select IDs + Keys" actions to filter the SP records to just those where the relevant column = x, & adjust the "Select IDs + Keys" From input to use the Filter array outputs.
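As a rough Python analog of that suggested Filter array step (the "Status" column and "x" value are made up for illustration), the idea is simply to narrow the SP records before building the IDs + keys index:

```python
# Illustrative sketch of the Filter array suggestion above: keep only
# the SharePoint records where the extra column matches the wanted
# value, then build the IDs + keys data from the filtered set.

sp_items = [
    {"ID": 1, "Key": "A-1", "Status": "x"},
    {"ID": 2, "Key": "A-2", "Status": "y"},
]
filtered = [i for i in sp_items if i["Status"] == "x"]
ids_and_keys = [{"ID": i["ID"], "Key": i["Key"]} for i in filtered]
# ids_and_keys == [{"ID": 1, "Key": "A-1"}]
```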
@@tylerkolota thanks I'll give that a go
@@tylerkolota got it to work! Thanks for your help 😃
This worked before; now it doesn't, and I'm getting an error in the 'HTTP Get backend listName' action.
Hello, what does the error say?
@@tylerkolota Sorry, got it fixed. I had obliviously changed the name of the source list. Still working great!
How did you get the ID for each item?
@@RommelGerardGalindo The destination list IDs? It pulls all destination list items, indexes them by the primary key column values, & then the GenerateSPData action is where the source dataset primary key values are used to lookup the ID of the destination list record where the source key value matches that destination item’s key value.
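The matching logic described above can be sketched in Python (illustrative only; "Title" stands in for whatever primary key column you configure):

```python
# Sketch of the ID matching described above: destination list items are
# indexed by their primary key value, then each source row looks up the
# matching destination ID. A miss (None) means a create instead of an
# update.

destination = [{"ID": 11, "Title": "alpha"}, {"ID": 12, "Title": "beta"}]
index = {d["Title"]: d["ID"] for d in destination}  # primary key -> ID

source = [{"Title": "beta"}, {"Title": "gamma"}]
matched = [index.get(s["Title"]) for s in source]
# matched == [12, None]: "beta" updates item 12, "gamma" gets created
```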
I am trying to do a filter query using the join function so it will produce something like accountnumber eq 1 or accountnumber eq 2 and so on. Now I have the ID in the SharePoint list, but I can't add it to my array and can't make a batch out of it.
@@RommelGerardGalindo You’re trying to do what?
Are you trying to do a One to Many update where one source item may be used to update multiple destination items?
Sir, I used your method to bulk update. My flow ran successfully, but it updated only 14,216 records and skipped around 1,104. Is there a limit issue that would explain why it skipped them? I didn't get any error. Please help.
You may want to reduce your batch size & turn concurrency off on the Apply to each loops inside the Batch Update & Batch Create scopes
@@tylerkolota sir i have turned off the concurrency but it has increase my flow running time
@@ashishrawat8276 Yes, the issue with the skipped records is SharePoint may be overwhelmed with requests & not finishing some, so the flow needs to slow down & send the requests over more time. I can't do anything more about this trade-off.
Thanks