This is great and really is a major time saver. Looks like some of your flows have moved on a bit since the video.
I'm trying to implement the Batch Upsert V5.5b (with the loop to split into smaller batches for updating the spreadsheet). I've had the 'standard' version running but now it talks about batching up using take(skip([InsertDataHere], mul(iterationIndexes('Do_until_Update'), variables('BatchSize'))), variables('BatchSize'))
I'm not familiar with this processing for a SharePoint 'Get items' action. Could you give any pointers on how best to do this?
Hello,
I mentioned a way to convert this to use SharePoint here: powerusers.microsoft.com/t5/Power-Automate-Cookbook/Excel-Batch-Create-Update-and-Upsert/m-p/2173379/highlight/true#M1576
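For reference, that take(skip(...)) expression just pages through the source array one batch per loop iteration. A minimal TypeScript sketch of the same logic (the function & variable names here are illustrative, not from the flow):

```typescript
// Equivalent of take(skip(rows, mul(iteration, batchSize)), batchSize):
// drop the rows handled by earlier Do until iterations, then keep one batch.
function getBatch<T>(rows: T[], iteration: number, batchSize: number): T[] {
  const remaining = rows.slice(iteration * batchSize); // skip(...)
  return remaining.slice(0, batchSize);                // take(...)
}

// With batchSize = 1000: iteration 0 -> rows 0-999, iteration 1 -> rows 1000-1999,
// and so on until the returned batch is empty & the Do until loop can stop.
console.log(getBatch([1, 2, 3, 4, 5], 1, 2)); // [3, 4]
```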
Perfect - thank you! @@tylerkolota
Hi! Exciting video. I think I have recently run into a problem with Excel actions and limits in PA, hope this will solve my issues... By any chance, do you know if there are any risks with using scripts when the solution needs to be moved between environments (and SharePoint site changes in every environment as well)?
I think the scripts are saved to the account used when creating or loading the script. So if the environment changes, the script should still be available. Even if it wasn’t, you could just make another copy for that environment.
Changes to the SharePoint site should only affect things if the Excel file(s) you are using are saved on that site. If those files move to a different SharePoint site, then you will need to update the SharePoint Site & File locations on the Excel actions in the flow.
@@tylerkolota thanks:)
Thank you. Seems like this is the solution to my issue. I'll give it a try soon. You've earned yourself a subscriber.
Thanks Riyaan!
Reach back out if you have any questions.
I am using Batch Create V5b and all the data is added, but it leaves the first two rows blank in the destination table. Also, if you could do another video with some more insight into how the flow works, that would be most helpful for beginners like me.
Thanks for this wonderful script
I can guess you are creating rows in an empty table then. The flow & script will have issues if the table doesn’t have any rows, so it checks if it is empty 1st, then adds some rows if it is.
I should probably add something to track when it needs to perform that step to add blank rows to the table & have it remove those rows by the end.
But I’m also a bit curious how you got the flow to run successfully if the table really was blank. There’s an earlier action in the flow to get the table headers & it usually fails if the table is truly blank.
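A hedged sketch of that empty-table guard (not the exact script from the template, just the idea):

```typescript
// If the target table has no data rows yet, add placeholder rows so the
// batch write has cells to land in. minRows & the approach are illustrative.
function ensureTableHasRows(table: ExcelScript.Table, minRows: number) {
  if (table.getRowCount() === 0) {
    for (let i = 0; i < minRows; i++) {
      table.addRow(); // appends one blank data row to the table
    }
  }
}
```

Placeholder rows like these would also explain blank rows being left behind at the top of the destination table.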
Found a workaround on how to create/update data in the new table that works for me :)
I need to update the destination file with the newest data from my data source. So I am using 2 files, one as a template, with a test row (source file), and the other file is my destination file. First, I delete all the data from the template file (except the test row) and create new records with the batch create script. Then I upsert data into the destination file and later delete records that are not in the template file (source file).
Hi, I've posted a question about the batch delete for when we don't have a PK! Thanks a lot for your answer.
Is it possible to do a batch create only without PK ?
Thanks in advance
If you want to later update anything in the table using any method, you will need a PK column, so it is advisable to have one.
Also there are a couple places in the Office Script that reference the PK column.
But if you really need to, I think you can pick a random column, even if it doesn’t qualify as a primary key, & input that column & its values as the primary key / PK in the flow. The script will then create rows with those column values as it would any other column.
The only difference is the column label in the GenerateData action would be PK instead of its actual column name in the Excel table. The create script ultimately adjusts that PK labeling to match the same structure as the other column labels.
So you could do that & remove the other Update actions as shown in the video for only batch Creates.
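To make that concrete, here is a hypothetical SelectGenerateData output where a regular column (an Email column, made up for this example) is re-labeled as the PK:

```typescript
// "PK" stands in for whichever column you picked; the create script
// writes these values back under the column's real Excel header.
const generateData = [
  { "PK": "alice@contoso.com", "Name": "Alice", "Department": "Finance" },
  { "PK": "bob@contoso.com",   "Name": "Bob",   "Department": "HR" },
];
```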
Can the source data be from Dataverse and the destination be Excel?
@@hedgar6989 Yes, you would need to replace the Excel List rows Sample source data with a Dataverse List rows action & update the From input of the SelectGenerateData action to the Dataverse action outputs
Hi, this might be a simple question, but if I want to upsert the source data into multiple Excel files, is it just a case of duplicating the last two steps for each file? Or is there a more efficient method?
Yes, if all the file tables have the same column names, then you should be able to copy the last two actions in the batch create scope & copy the last Run script update action from the batch update scope for each file.
And probably not necessary for your case but…
If I had many files to do this with, I’d check if I could use dynamic values for some of the Run script inputs, then create a JSON array in a compose action with the dynamic values like the file names & table names, so I could create an Apply to each loop on that JSON array of input values & loop through running it for each file.
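A sketch of what that Compose array might look like (the file & table names are hypothetical):

```typescript
// One entry per destination file; the Apply to each loop would feed these
// into the Run script inputs with expressions like
// items('Apply_to_each')?['fileName'] & items('Apply_to_each')?['tableName'].
const runScriptInputs = [
  { "fileName": "TeamA.xlsx", "tableName": "Table1" },
  { "fileName": "TeamB.xlsx", "tableName": "Table1" },
  { "fileName": "TeamC.xlsx", "tableName": "Table1" },
];
```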
Hello! The downloadable packages are missing on the referenced page. Is there another way to download the flow templates?
Microsoft is renovating their community forums. I'm not sure if the page they moved my stuff to even has file uploads and I'm unable to edit the posts.
I'll need to figure something else out, but in the meantime, email me at takolota@gmail.com with what templates you want the solution packages for.
Here is a google drive link to a zip folder with the solution package import for most of my projects: drive.google.com/file/d/1-hZmu2-QdEl7l95Y3NdZ1jn8sYdaK2kD/view?usp=sharing
@@tylerkolota Thank you very much! Appreciated!
Hi. Thank you so much for this video, it is very thorough and explains everything so well. I am looking to use this tool to perform a Sharepoint to Excel batch update. I saw your other video on Sharepoint batch creation but is the general workflow of this video possible to achieve using a Sharepoint folder as a source? I was hoping to trigger it when a new file is uploaded. Wondered if you had any thoughts on this possibility. Thank you so much!
Thanks Ayecob. What kind of SharePoint folder source file are you trying to auto-upload to an Excel workbook?
Is it Excel to Excel?
Because it should all be possible, especially for new Excel or CSV files. The flow would trigger, then you would check if the file name ends with .xlsx; if yes, then use a List rows present in table action on that folder & file name from the trigger. I think one limitation may be that any .xlsx files would need the same data table name across all files, like Table1.
@@tylerkolota Thanks for your response. Ideally yes, it would be in an Excel file. One question I have is, does the data in the source and/or destination have to be in a table, or can it just be raw data? In which case, would it be better to use CSV? Unfortunately, the names of the tables would not be consistent, like Table1, and this is what I am looking for a workaround to address.
@@ayecob558 So if you could use CSV files for source data and those CSV files have the same sets of columns, then you could combine a CSV parsing template with this video’s Excel template:
powerusers.microsoft.com/t5/Power-Automate-Cookbook/CSV-to-Dataset/td-p/1508191
That should get around needing any table references.
@@tylerkolota Awesome, I'll give that a try. Thanks!
Very good. Is it possible to update a SharePoint list with batch upsert based on Excel data?
I actually have an earlier video & thread on SharePoint batch update & other batch actions.
powerusers.microsoft.com/t5/Power-Automate-Cookbook/Batch-Update-SharePoint-List-With-External-Data/td-p/1365410
It’s not pre-set for batch upsert like this one, but I’ve definitely combined the SP batch create & batch update to do upserts.
For that I do have to use the Filter arrays to identify only the new records & only the records to update, much like what I used in the best standard method of this video.
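A minimal sketch of that Filter array split, assuming both sides share a PK column (the data & names are illustrative):

```typescript
// Sample inputs: items already in the SharePoint list & incoming source rows.
type Row = { PK: string; Title: string };
const listItems: Row[] = [{ PK: "1", Title: "Old" }];
const sourceRows: Row[] = [{ PK: "1", Title: "Updated" }, { PK: "2", Title: "New" }];

// Rows whose PK already exists go to batch update; the rest go to batch
// create — mirroring the two Filter array actions described above.
const existingKeys = new Set(listItems.map(item => item.PK));
const rowsToUpdate = sourceRows.filter(row => existingKeys.has(row.PK));
const rowsToCreate = sourceRows.filter(row => !existingKeys.has(row.PK));
```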
@@tylerkolota Thank you very much for your prompt response... Keep up your good unique content videos... 👍
Hi, I have JSON that contains the information I want to add to my Excel table in OneDrive. I tried replacing the 'List rows' sample source data with an HTTP connector that gets the JSON, and used Parse JSON to take only what is needed for the update. Is there any solution for this?
Hello Manoj, I’m responding over in your LinkedIn message requests as it may be easier to share screenshots there.
Can you explain where I can refresh connections in Excel Online?
Refresh the Excel Online connector connections in Power Automate?
Or refresh a Power Query table on an Excel Online workbook?
@@tylerkolota I have an Excel file in a shared folder at work (the master database), and I prepared an Excel Online sheet with a query inserted from that file in the company's shared folder. Is there any way to refresh that query automatically instead of opening the online Excel file and refreshing it manually?
@@Hemo2YoYo
If you want the query data to refresh upon a user opening the workbook, then there are ways to do that with VBA: office-watch.com/2022/excel-automatic-refresh-recalculation-tricks/
You can also try using Power Automate Desktop to open the files & refresh from there.
Or you can set up the data tables you need in your file & use this video & template to batch update the tables using standard Power Automate cloud flows
powerusers.microsoft.com/t5/Power-Automate-Cookbook/Excel-Batch-Create-Update-and-Upsert/td-p/1624706
Hi, I tried this exact method and it is not adding any rows to my Excel file; it is only adding 1 row with empty values.
Can you show me the data format in the 'Filter array Get records not found in table' action that we take as the input array to create the rows? Because I have it in the format
[ {col1row1 : val1, col2row1 : val2},
{col1row2 : val1, col2row2 : val2} ]
If you are running that after a batch update action, then it should be gathering all rows where their primary key values were returned in the output of the batch update script.
What was returned from the batch update script?
Or are you only trying to run a batch create?
You may want to go to the community forum where you can post screenshots of your flow run.
powerusers.microsoft.com/t5/Power-Automate-Cookbook/Excel-Batch-Create-Update-and-Upsert/td-p/1624706
@@tylerkolota I am only trying to create. And for creating a new row, the action is not asking for any primary key.
@@DTejashReddy789 Alright that should make things relatively simple then.
To only batch Create you can feed any JSON array into the CreateData input. It will then find any Excel columns that match the JSON array field key labels & create cell values at those columns on new rows.
So you shouldn’t need the Filter action at all.
You can remove the Run script Update Excel rows & remove the Filter array Get records not found in table and use the outputs of the SelectGenerateData action as the CreateData input, making sure the labels in the Select Map match your Excel column headers.
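In other words, the CreateData input is just a JSON array whose keys match the destination table's headers. A hypothetical example (the column names are made up):

```typescript
// Shape of the SelectGenerateData output fed straight into CreateData;
// each key must exactly match an Excel column header.
const createData = [
  { "Name": "Alice", "Department": "Finance", "StartDate": "2024-01-15" },
  { "Name": "Bob",   "Department": "HR",      "StartDate": "2024-02-01" },
];
```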
Hi, thank you so much for sharing these wonderful scripts to delete and update Excel tables with speed. I am trying to use this for the following use case:
I have 5 Excel files in a SharePoint document library with information about resources in 5 teams. The column names are the same in all source files, and each source file is approximately 100 rows. These files get updated as new resources come in and existing ones leave. I want to get a consolidated Excel table in a destination file that has all the current rows from the source files, with some filtering on the source columns.
Currently, I have tried using the delete script/flow followed by the Upsert V5 script and flow. It works well to get data from the 1st source file into the consolidated file, but when I run it again with the 2nd source file, it overwrites the rows I got from the 1st source file. Is there any way to tweak the script so that it does not overwrite, but instead appends the rows from the multiple source file tables into the 1 consolidated table in the destination file? I can share more details if required. Thank you so much for your help.
Hello Jagmeet,
That does sound strange as the Upsert flow does not delete anything. But if the items on different files have the same primary keys, then it will “overwrite” those with new updates.
You’ll probably want to change your tables to include something like a concatenation column that combines the primary key with the table name. Then use that as the primary key in the flow.
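For example (a hedged sketch with made-up names), the composite key could be built like this before upserting:

```typescript
// Prefix the original key with its source file/table name so rows from
// different files can never collide on PK in the consolidated table.
function compositeKey(sourceName: string, originalPk: string): string {
  return `${sourceName}|${originalPk}`;
}

console.log(compositeKey("TeamA.xlsx", "1042")); // "TeamA.xlsx|1042"
console.log(compositeKey("TeamB.xlsx", "1042")); // "TeamB.xlsx|1042"
```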
Thanks for the video, it is really nice. I am facing an issue getting Excel to save after refreshAllDataConnections(). Could you please help?
Hello, I have yet to use that function in Scripts as it originally was not functional for most cloud Excel files.
I saw MS recently got it working for datasets in the same workbook & maybe other workbooks on the same cloud. Is it working for your scenario? I would think it would save the refresh automatically in the cloud file if it was.
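For anyone testing this, a minimal Office Script sketch; refreshAllDataConnections is a real ExcelScript method, but the timing caveat in the comments is an assumption rather than documented behavior:

```typescript
function main(workbook: ExcelScript.Workbook) {
  // Ask Excel to refresh every data connection in the workbook.
  workbook.refreshAllDataConnections();
  // Assumption: the refresh is kicked off service-side, so the script (and a
  // flow calling it) can report success before the refreshed data has
  // actually been recalculated & saved back to the cloud file.
}
```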
@@tylerkolota I tried running a script to refresh all connections and it works locally. When I try to run it from Power Automate, I'm unsure whether it is actually running the script; my flow succeeds, but the data seems not to be refreshed, so I doubt whether the script's automatic saving works.
See the related Excel Batch Delete template video here: ruclips.net/video/LrzjH9dI0is/видео.html