***LISTS WITH UNDERSCORES _ IN THE TITLE MAY CAUSE ERRORS WITH EARLIER VERSIONS OF BATCH ACTIONS.*** If you have a list with underscores in the name, then in TachyTelic.Net's batch create you will want to go to the __metadata line on the GenerateSPData select action and substitute the expression json(concat('{"type":"SP.Data.', replace(outputs('settings')['listName'], ' ', '_x0020_'), 'ListItem"}')) for json(concat('{"type":"SP.Data.', replace(replace(outputs('settings')['listName'], '_', '_x005f_'), ' ', '_x0020_'), 'ListItem"}'))
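For anyone wondering what the nested replace is doing: SharePoint builds its internal list item type name from the list's original name, with underscores encoded as _x005f_ and spaces as _x0020_. A rough Python sketch of that encoding (the helper name is mine, not part of the flow):

```python
def sp_list_item_type(list_name: str) -> str:
    """Build the __metadata type string SharePoint expects for a list,
    encoding underscores and spaces the way the nested replace() does."""
    # Underscores are encoded first, then spaces, so the underscores
    # inside the '_x0020_' tokens added afterwards are left alone,
    # matching the order of the nested replace() in the expression.
    encoded = list_name.replace("_", "_x005f_").replace(" ", "_x0020_")
    return '{"type":"SP.Data.' + encoded + 'ListItem"}'

print(sp_list_item_type("Team_Tasks"))  # {"type":"SP.Data.Team_x005f_TasksListItem"}
```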
The template is here: powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List-With-External-Data/td-p/1365410 It’s also linked in the description and in the top comments.
Also here is a video that includes a segment where I import a template from such a page. (Near the beginning of the video after the Excel Primary Key set-up) ruclips.net/video/HiEU34Ix5gA/видео.html
@@tylerkolota The downloads are all JSON files. Don't y'all have a flow template? I meant it's really hard to follow the video and type in all those parameters.
@@tylerkolota The flow ran successfully, but it does not generate any output. I only replaced the input table and output list based on what I need; everything else is imported from the template.
I’m building a flow that uses Power Automate Desktop UI flows to routinely pull a CSV file from a website, then send the data back to a regular Power Automate flow through a variable, then send it to a SharePoint list with these batch create & batch update actions so a coworker can automatically update their PBI reports with it. For a part of this flow I need to change the delimiter of the CSV file data so I can easily parse it all in the batch GenerateSPData Select actions. So I made this other flow to change the delimiter, even on large files, with only a handful of actions. powerusers.microsoft.com/t5/Power-Automate-Cookbook/Change-CSV-File-Data-Delimiter/m-p/1442954#M531
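For anyone curious, the core of a delimiter swap looks something like this Python sketch (the linked flow does it with Power Automate actions instead; this is just the idea). The csv module is used so fields that contain the old delimiter inside quotes survive the swap:

```python
import csv
import io

def change_delimiter(csv_text: str, old: str = ",", new: str = "|") -> str:
    """Re-emit CSV data with a new delimiter, respecting quoted fields."""
    reader = csv.reader(io.StringIO(csv_text), delimiter=old)
    out = io.StringIO()
    writer = csv.writer(out, delimiter=new, lineterminator="\n")
    writer.writerows(reader)
    return out.getvalue()

sample = 'id,name\n1,"Smith, Jane"\n'
print(change_delimiter(sample))
```

Note that the quoted field "Smith, Jane" keeps its embedded comma but no longer needs quoting once the delimiter is a pipe.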
If anyone is looking for another way to build the Vlookup type section that matches SharePoint IDs to the external data, Paulie just made this video on combining arrays: ruclips.net/video/QSF6dNkSKSA/видео.html
Hello, I keep getting this message, the flow runs but nothing happens. ( "--batchresponse_62ea2ddf-9322-4726-9ec0-8be3c698f2de Content-Type: application/http Content-Transfer-Encoding: binary
HTTP/1.1 400 Bad Request CONTENT-TYPE: application/json;odata=verbose;charset=utf-8
{\"error\":{\"code\":\"-1, Microsoft.SharePoint.Client.InvalidClientQueryException\",\"message\":{\"lang\":\"pt-BR\",\"value\":\"A type named 'SP.Data.lista_x005f_batchListItem' could not be resolved by the model. When a model is available, each type name must resolve to a valid type.\"}}} ). I'm from Brazil, so I guess this "pt-BR" has something to do with why nothing happens, but I don't know where to change the code to update the list. Can you help me?
You may want to post to the community thread where we can include screenshots of the issues & action expressions powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List-With-External-Data/td-p/1365410 It’s likely something is wrong with the data or with the primary key field on the piece that puts the ID with the new data.
Version 1.5 Is Now Available
powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List-With-External-Data/td-p/1365410
This version makes it easier to handle cases where the list name may have changed since its creation and moves a few of the primary key column matching inputs to the 'settings' compose action so users don't have to look through & edit the more complicated expressions to set up the flow.
This is incredible! I've got a list of about 6000 records that I need to update daily and although the volume isn't hundreds of thousands I could still use it as a test case to set this up.
Thank you for sharing this video and the flow templates too. I adapted it to a flow that adds about 20,000 records to a SharePoint list daily from a Power BI dataset.
Thank you, I’m glad it helped!
This will increase the speed. I also need to upload 9,000 records to SharePoint; it is taking a long time, approx. 3-4 hours. This will reduce the time.
@@analyticsadda6080 Please also visit the newer version of this set-up & the new instruction video here: powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-Create-and-Upsert-SharePoint-Lists/td-p/1365410
The ID expression used in GenerateSPData is throwing an error.
Could you please fix it, as it is stopping the batch data from updating into SharePoint.
I'm having the same problem as well. I know he mentioned the fix below, but I can't get mine to run properly.
***LISTS WITH UNDERSCORES _ IN THE TITLE MAY CAUSE ERRORS WITH EARLIER VERSIONS OF BATCH ACTIONS.***
If you have a list with underscores in the name, then in TachyTelic.Net's batch create you will want to go to the __metadata line on the GenerateSPData select action and substitute the expression
json(concat('{"type":"SP.Data.', replace(outputs('settings')['listName'], ' ', '_x0020_'), 'ListItem"}'))
for
json(concat('{"type":"SP.Data.', replace(replace(outputs('settings')['listName'], '_', '_x005f_'), ' ', '_x0020_'), 'ListItem"}'))
This is what I have on my GenerateSPData
outputs('Compose_Reformat_keys_to_single_JSON_object')?[item()?['Smartsht_x0020_ID']]?['ID']
Hello,
Good timing, I just got my new job & access to a SharePoint environment again.
I didn't see any new errors or anything different with the normal way I use this template flow & with the ID expression.
However...
1. Going through the more complicated expressions to change the SP & new datasource primary key column names was not very user friendly in this V1 template.
2. Some people found issues with the __metadata part of the GenerateSPData if their SP list name had changed since it was created.
To fix a lot of this, I just put out Version 1.5 of this template in the Power Automate Community thread powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List-With-External-Data/td-p/1365410
If you try the new template version and it is still giving you trouble on the ID expression, then please take some screenshots and describe the issue on the community thread so I can better diagnose the issue and help.
powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List-With-External-Data/td-p/1365410
It may be a problem with your primary key column name in Excel. There may be extra challenges if the Excel column has special characters like spaces, underscores, dashes, etc.
Thank you for the very helpful video! If I were changing the Excel source to a SharePoint Get items, how could I replace the skip count that's present in List rows present? So that the Do until loops through all the items in the Get items?
Hi Josh, good question.
You should be able to do something similar to the initial 2 Get items in the template. So set Order By on the Get items to ID, then at the end of each Do until Update loop, record the last ID number of that loop iteration's Get items. Then the Get items would also have a Filter Query where ID gt @variables('LastID')
That way it would continually pull the next 100,000 items after the last ID of the last 100,000 items & if any loop instance failed, it wouldn’t update the ID at the end of the loop yet, so the next loop instance would basically retry the failed one.
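That keyset-style paging can be sketched out like this (Python stand-ins for the Get items action and the Do until loop; the function names and the small page size are illustrative only):

```python
def get_items(items, last_id, page_size):
    """Stand-in for a SharePoint Get items action with
    Order By = ID and Filter Query = ID gt last_id."""
    page = [it for it in items if it["ID"] > last_id]
    page.sort(key=lambda it: it["ID"])
    return page[:page_size]

def update_all(items, page_size=3):
    """Do until loop: keep pulling pages keyed by the last seen ID
    until a page comes back short or empty."""
    last_id = 0
    seen = []
    while True:
        page = get_items(items, last_id, page_size)
        if not page:
            break
        seen.extend(page)
        # Only advance last_id after the batch for this page succeeds,
        # so a failed iteration is retried on the next pass.
        last_id = page[-1]["ID"]
        if len(page) < page_size:
            break
    return seen

rows = [{"ID": i} for i in range(1, 8)]
print(len(update_all(rows)))  # 7
```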
@@tylerkolota Wow thank you for responding so fast! I will try giving that a shot, thanks again!
Just coming back to report your suggestion worked! Thank you!
Great video, is there anyway we can update person column in SPO using batch?
Thanks!
I responded to your same question on the forums
I am having an error message
"message":{"lang":"en-US","value":"A 'PrimitiveValue' node with non-null value was found when trying to read the value of a navigation property; however, a 'StartArray' node, a 'StartObject' node, or a 'PrimitiveValue' node with null value was expected."
I think it's probably due to my choice/people column in SharePoint. Not sure if you have any suggestions, Tyler. Much appreciated!
Someone asked something similar on pages 4 & 5 of the Power Automate Community thread...
powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List-With-External-Data/td-p/1365410/page/4
If something like this can pull all the potential users, then it may be possible to create something similar to the item Id value lookup for the person Ids
pankajsurti.com/2020/10/12/how-to-update-people-field-using-send-an-http-request-to-sharepoint-in-power-automate/
"I tried fetching user ID via email and used it to update the column. So far it's functioning as expected. Thank you for the suggestion."
@@tylerkolota Thanks Tyler!
Can’t seem to get this working. Any insights you can share? Thanks, Dave
@@davidlattimore663 I’d recommend watching the video & using the template for the newer Upsert V2.7. Then if you run into issues reach out again & we can troubleshoot.
community.powerplatform.com/galleries/gallery-posts/?postid=dc95f047-afcc-4a31-bc84-962b240fe6fa
What is the Power Automate licence you are using?
I’m usually working with a Microsoft E3 license or higher.
Just subscribed Tyler.. This a good topic.
Thanks Acik!
I developed this when I needed to update 20,000+ SharePoint records a day to track my organization’s holiday food & toy distribution. Next I’ll automate the data collection for several KPI reports using something like this method and desktop flows.
I’m sure other people may need to use it for similar things.
If the batch call succeeds, but individual rows fail to load to SharePoint, then the flow will not throw an error. I personally change out the Results Compose actions with Append to variable actions and use an expression that returns the results text if it identifies an error. Then at the end of the flow I have a condition checking the length of that variable. If it has some text/length to it, then something failed and it sends me an email with a link to the flow run.
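A rough sketch of that failure check in Python (the marker strings are my guesses at what an error looks like in a SharePoint batch response; adjust them to the responses you actually see):

```python
def collect_failures(batch_responses):
    """Mimic the Append to variable step: keep only response texts that
    signal an error, so an empty result means a clean run."""
    failures = ""
    for text in batch_responses:
        if "HTTP/1.1 4" in text or "HTTP/1.1 5" in text or '"error"' in text:
            failures += text + "\n"
    return failures

ok = "HTTP/1.1 204 No Content"
bad = 'HTTP/1.1 400 Bad Request {"error":{"code":"-1"}}'
# Condition at the end of the flow: length(variable) > 0 means send alert.
print(len(collect_failures([ok, bad])) > 0)  # True
```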
I am updating a SharePoint list from a Collection in Power Apps. I have the Collection table in a Parse JSON in Power Automate. Do I replace that with the Excel List rows action, and how do I go through it?
You could outright put the JSON in the From field of the main Select action that maps the update data if you can manually parse the JSON for each field value with expressions like item()?['Column1']
@@reggieoverton121 is ID an integer or number type value?
That should be the part that references a string version of the ID or primary key in previous actions so it links the SP ID to the corresponding data.
Try putting a string expression on it like [string(item()?['ID'])]
@@tylerkolota ok, I updated the ID field in the GenerateSPData (selection) to the json ID field. I don't get any errors, but the results say "204 No Content" not sure what that means
@@tylerkolota just now seeing this... the ID is an integer. How should I put the string expression on it?
outputs('Compose_Reformat_keys_to_single_JSON_object')?[item()?['ID']]?['ID']
I updated the ID to the following and I do not get an error but I get the 204 No Content in the results.
outputs('Compose_Reformat_keys_to_single_JSON_object')?[string(item()?['ID'])]?['ID']
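For context on what that lookup expression is doing, here is the idea sketched in Python: the Compose step folds the SharePoint items into one object keyed by the (string) primary key, so each incoming row can find its SharePoint ID without a per-row filter. The OrderNo column is a hypothetical example:

```python
def build_key_lookup(sp_items, key_column):
    """One object keyed by the primary key, like the Compose reformat step.
    Keys are coerced to strings, which is why an integer key column
    needs the string() wrapper in the Select expression."""
    return {str(it[key_column]): {"ID": it["ID"]} for it in sp_items}

sp_items = [{"ID": 12, "OrderNo": 1001}, {"ID": 13, "OrderNo": 1002}]
lookup = build_key_lookup(sp_items, "OrderNo")
new_row = {"OrderNo": 1002}
# Equivalent of outputs('Compose_...')?[string(item()?['OrderNo'])]?['ID']
print(lookup[str(new_row["OrderNo"])]["ID"])  # 13
```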
This is great stuff. Thank you for sharing. I have about 25,000 records that need to be uploaded to a SP list, but only 16,000 came over. Would you know why it did not complete the upload?
Your source action definitely pulled all 25,000?
Did you try turning off concurrency on the Apply to Eaches?
@@tylerkolota I turned off concurrency and tried again. This time I had 1,932 records but only 1,707 were uploaded.
@@All73520 Okay, you can also try reducing the batch size & see if that improves it
@@tylerkolota Thank you so much for your suggestions. So I reduced batch size to 500. Tried twice - Copied 10,171 instead of 10,498; and Copied 14,262 instead of 15887. The Apply_to_each loop does not return any errors either.
@@All73520 I once had someone with a similar issue. They ended up just running things twice.
If you run things twice, does it fix those remaining rows, or do they still not update?
I am seeing this error
The power flow's logic app flow template was invalid. Unable to parse template language expression 'outputs('Compose_Reformat_keys_to_single_JSON_object')?[concat(item()[outputs('settings')['NewDatasourceKeyColumnName']], if(empty(outputs('settings')['NewDatasourceKeyColumnName2']), , item()[outputs('settings')['NewDatasourceKeyColumnName2']]))]?['ID']': expected token 'Identifier' and actual 'Comma'.
I re-downloaded the template to double check it wasn't me. For some reason the expression you have is missing two single quotes in...
if(empty(outputs('settings')['NewDatasourceKeyColumnName2']), , item()[outputs('settings')
It should actually be...
if(empty(outputs('settings')['NewDatasourceKeyColumnName2']), '', item()[outputs('settings')
But be careful if you try to copy expressions from somewhere like a RUclips comment. It often changes out some of the characters & causes errors, so you're better off just manually adding those two single quotes '
If anyone has larger datasets to maintain with something like this, the flow run time and flow action API calls may be really high. Your account may go over its daily action limit.
To do something like this quickly & efficiently for larger databases with thousands of updates a day, you can combine this flow with SharePoint batch create and batch update methods.
Batch Create: ruclips.net/video/2dV7fI4GUYU/видео.html
Batch Update: ruclips.net/video/l0NuYtXdcrQ/видео.html
You may just have to use Filter array actions instead to figure out what records in SQL are not already in SharePoint for the batch create and which are already in SharePoint for the batch update.
The filter condition for new records would be something like
string({SharePoint body}) does not contain concat('"sqlid":"', {SQL ID}, '"')
And the other Filter array condition would be the opposite
string({SharePoint body}) contains concat('"sqlid":"', {SQL ID}, '"')
And if you have several hundreds of thousands of records in SharePoint, you’ll have to create multiple Filter array actions, one for each SharePoint Get items with 100,000 records, and combine the Filter array outputs with Unions or Intersections.
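Sketched in Python, that split might look like the following (the sqlid column follows the example above; the containment marker assumes json.dumps-style spacing, so treat this as an illustration rather than a drop-in):

```python
import json

def split_for_upsert(sp_body, sql_rows):
    """Partition source rows: rows not yet in SharePoint go to batch
    create, rows already present go to batch update."""
    # string({SharePoint body}) once, then containment checks per row.
    body_text = json.dumps(sp_body)
    to_create, to_update = [], []
    for row in sql_rows:
        marker = '"sqlid": "%s"' % row["sqlid"]
        (to_update if marker in body_text else to_create).append(row)
    return to_create, to_update

sp_body = {"value": [{"ID": 1, "sqlid": "A1"}]}
sql_rows = [{"sqlid": "A1"}, {"sqlid": "B2"}]
create, update = split_for_upsert(sp_body, sql_rows)
print(len(create), len(update))  # 1 1
```

The string-containment test is blunt but fast; for keys that can be substrings of each other, delimiting the marker with the surrounding quote characters (as above) keeps the match exact.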
Hey Tyler, this video has been super helpful and I have been able to pull and sync lots of reports in SharePoint because of it! I do need some additional help; if you know where I could look or could provide any guidance, that would be appreciated. I'm trying to batch update list item permissions for a SharePoint list. I have found a tutorial where, when an item is created or modified, it breaks the permissions and applies new permissions such as edit or view, but it does it for every item and uses lots of actions. The reason I'm trying to do this is so that different individuals can see only what applies to them in the list, based on a criteria etc.
I may not 100% understand your use-case. But I have yet to see documentation or examples of batch API call options for permissions.
I’d first want to check if there is a way to better filter to the individual items you are trying to modify the permissions for.
Is it possible to use a Filter array action to get to just the items that need permissions edited?
Or is there a way to set things up to only share links to specific items with each person & edit those specific share links each time?
powerusers.microsoft.com/t5/Building-Flows/Create-Sharing-link-from-Sharepoint/td-p/675038
@@tylerkolota Thank you for the feedback. I have to track roles for multiple projects, but each project can't see the others' information for privacy reasons. The projects have names like "Fairview" or something like that. Since the roles are changed a lot and new ones are added, I was hoping to batch provide permissions based on the project and on a schedule. So all could see their projects but not others, and if a new PM is added to the project they could be added to the project "group" and be given permission to see their project items when the scheduled flow runs. Since it's a lot of roles and they are modified a lot, I didn't want to do the apply to each option, which is possible. It would be like the following but in a batch if possible:
ruclips.net/video/q2vRqHr16Lg/видео.html
Hi Tyler, I’m getting an error on the 2nd Get items action. It is saying the expression cannot be evaluated because property ID cannot be selected. Are you familiar with this error? Also, can you provide the whole expression just in case I missed something?
That sounds… odd.
Did the 1st Get items return any data?
Is your list blank?
Do you somehow not have an ID column in the SharePoint list?
Or maybe it is labeled Id instead of ID?
The expression is
last(outputs('Get_items_100000')?['body/value'])['ID']
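As a quick illustration only (Python, with made-up response data, not part of the flow itself), that expression just grabs the ID of the last record returned by a Get items call, which can then feed the next page's OData filter (e.g. ID gt lastID):

```python
# Sketch of what last(outputs('Get_items_100000')?['body/value'])['ID']
# computes: the ID of the last item in a Get items response body.
def last_item_id(get_items_body):
    return get_items_body["value"][-1]["ID"]

# Made-up response shape for illustration.
body = {"value": [{"ID": 1, "Title": "A"}, {"ID": 4999, "Title": "B"}]}
print(last_item_id(body))  # 4999
```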
@@tylerkolota I had 2 filters in the OData filter option preventing the Get items data from pulling. Looks like I've fixed it now. Thank you for the quick response! Additionally, your video shows the pagination set up with a threshold of 100,000. I receive an error that the threshold can only be 5,000. Has something changed functionality-wise in Power Automate since the video was made?
@@saherkhatani Looks like that may depend on your licensing.
powerusers.microsoft.com/t5/Building-Flows/Pagination-Error/td-p/835358
I didn’t know some licenses had this limitation.
Well, that's a bummer! My list has roughly 15,000 rows, so I'm doing 3 Get items actions of 5,000 each. I'm working on a flow that seems complex to me as a beginner, but you may have a better solution. I have 2 lists. List 1 is populated with MS Forms responses and List 2 has staff data (name, job title, dept., etc.). I've set up a flow to feed MS Forms responses to List 1, but after the item is created, I am trying to get staff data from List 2 (name, job title, dept.) to populate onto List 1 based on employee ID. Unfortunately, with the flow I currently have set up, it is doing an apply-to-each over all 5,000 items retrieved from List 2. This is causing the flow to be very lengthy and time-consuming. Do you have any solutions other than lookup columns?
Lookup columns won’t work in our scenario because we want to lock in the staff data values at the time the MS form was submitted.
Any guidance would be greatly appreciated!
@@saherkhatani Can you use the OData filter query on the SP action to at least reduce the 5k Apply to each array?
Both the batch create and batch update flows can be combined into something like what Daniel Christian is doing in this video, ruclips.net/video/k3WZ-lRiPIU/видео.html
But using these methods will be much faster and use fewer daily actions.
Instead of what Christian does with the row-by-row check on records in the SharePoint list, you may have to use Filter array actions to figure out what records in SQL are not already in SharePoint for the batch create and which are already in SharePoint for the batch update.
The filter condition for new records would be something like
string({SharePoint body}) does not contain concat('"sqlid":"', {SQL ID}, '"')
And the other Filter array condition would be the opposite
string({SharePoint body}) contains concat('"sqlid":"', {SQL ID}, '"')
And if you have several hundreds of thousands of records in SharePoint, you’ll have to create multiple Filter array actions, one for each SharePoint Get items with 100,000 records, and combine the Filter array outputs with Unions or Intersections.
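As an illustrative sketch only (Python, with made-up sample data, not the actual flow actions), the create/update split works like this:

```python
# Illustrative sketch of the two Filter array conditions above:
# records whose primary key ("sqlid") does not appear in the serialized
# SharePoint Get items body go to batch create; the rest go to batch update.
import json

# Made-up sample data standing in for the SharePoint items and SQL records.
sharepoint_items = [{"sqlid": "101", "ID": 1}, {"sqlid": "102", "ID": 2}]
sql_records = [{"sqlid": "102"}, {"sqlid": "103"}]

# Mimic string({SharePoint body}) by serializing the items to one string.
sp_body = json.dumps(sharepoint_items)

to_create = [r for r in sql_records if f'"sqlid": "{r["sqlid"]}"' not in sp_body]
to_update = [r for r in sql_records if f'"sqlid": "{r["sqlid"]}"' in sp_body]

print(to_create)  # [{'sqlid': '103'}]
print(to_update)  # [{'sqlid': '102'}]
```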
Hi, do you have a video on deleting items with the batch API?
www.tachytelic.net/2021/06/power-automate-delete-sharepoint-items/
***LISTS WITH UNDERSCORES _ IN THE TITLE MAY CAUSE ERRORS WITH EARLIER VERSIONS OF BATCH ACTIONS.***
If you have a list with underscores in the name, then in TachyTelic.Net's batch create you will want to go to the __metadata line on the GenerateSPData select action and substitute the expression
json(concat('{"type":"SP.Data.', replace(outputs('settings')['listName'], ' ', '_x0020_'), 'ListItem"}'))
for
json(concat('{"type":"SP.Data.', replace(replace(outputs('settings')['listName'], '_', '_x005f_'), ' ', '_x0020_'), 'ListItem"}'))
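For anyone curious what those expressions do, here is a rough Python sketch (an illustration only, not the flow; the real internal type name can involve further escapes and capitalization rules): SharePoint escapes underscores as _x005f_ and spaces as _x0020_ when building the SP.Data...ListItem type name, and underscores must be replaced first so the underscores introduced by _x0020_ don't get double-encoded.

```python
# Sketch of the list-name encoding in the __metadata expression:
# "_" -> "_x005f_" first, then " " -> "_x0020_", wrapped into the
# SP.Data...ListItem type name. "My_Batch List" is a made-up example.
def sp_type_name(list_name):
    encoded = list_name.replace("_", "_x005f_").replace(" ", "_x0020_")
    return f"SP.Data.{encoded}ListItem"

print(sp_type_name("My_Batch List"))  # SP.Data.My_x005f_Batch_x0020_ListListItem
```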
This statement is underrated! Please take note, guys!
Do you have a template for the batch step? Do we have to type all that in?
The template is here: powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List-With-External-Data/td-p/1365410
It’s also listed in the description.
And it’s listed in the top comments.
Also here is a video that includes a segment where I import a template from such a page.
(Near the beginning of the video after the Excel Primary Key set-up)
ruclips.net/video/HiEU34Ix5gA/видео.html
@@tylerkolota The downloads are all JSON files. Don't you have a flow template? I meant it's really hard to follow the video and type in all those parameters.
@@yilu3219 You can import the flow directly into your power automate environment. Did you check the video I suggested for seeing the import process?
@@tylerkolota The flow runs successfully, but it does not generate output. I only replaced the input table and output list based on what I need; everything else is imported from the template.
I’m building a flow that uses Power Automate Desktop UI flows to routinely pull a CSV file from a website, then send the data back to a regular Power Automate flow through a variable, then send it to a SharePoint list with these batch create & batch update actions so a coworker can automatically update their PBI reports with it.
For a part of this flow I need to change the delimiter of the CSV file data so I can easily parse it all in the batch GenerateSPData Select actions. So I made this other flow to change the delimiter, even on large files, with only a handful of actions.
powerusers.microsoft.com/t5/Power-Automate-Cookbook/Change-CSV-File-Data-Delimiter/m-p/1442954#M531
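If it helps to picture the delimiter change, here is a small standalone sketch in Python (not the linked flow, which uses Power Automate actions) that swaps a CSV delimiter while respecting quoted fields:

```python
# Standalone sketch: change a CSV file's delimiter while honoring
# quoted fields, using the csv module rather than a plain replace.
import csv
import io

def change_delimiter(text, old=",", new="|"):
    rows = csv.reader(io.StringIO(text), delimiter=old)
    out = io.StringIO()
    csv.writer(out, delimiter=new, lineterminator="\n").writerows(rows)
    return out.getvalue()

# The quoted "c,d" field survives the switch to "|" intact.
print(change_delimiter('a,b,"c,d"\n1,2,3\n'))
```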
If anyone is looking for another way to build the Vlookup type section that matches SharePoint IDs to the external data, Paulie just made this video on combining arrays:
ruclips.net/video/QSF6dNkSKSA/видео.html
Hello, I keep getting this message; the flow runs but nothing happens.
( "--batchresponse_62ea2ddf-9322-4726-9ec0-8be3c698f2de
Content-Type: application/http
Content-Transfer-Encoding: binary
HTTP/1.1 400 Bad Request
CONTENT-TYPE: application/json;odata=verbose;charset=utf-8
{\"error\":{\"code\":\"-1, Microsoft.SharePoint.Client.InvalidClientQueryException\",\"message\":{\"lang\":\"pt-BR\",\"value\":\"A type named 'SP.Data.lista_x005f_batchListItem' could not be resolved by the model. When a model is available, each type name must resolve to a valid type.\"}}}
).
I'm from Brazil, so I guess this "pt-BR" has something to do with why nothing happens, but I don't know where to change the code to update the list. Can you help me?
You may want to post to the community thread where we can include screenshots of the issues & action expressions
powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List-With-External-Data/td-p/1365410
It’s likely something is wrong with the data or with the primary key field on the piece that puts the ID with the new data.