Thanks for watching! If you found this tutorial helpful and plan to build this Microsoft Power Automate flow-please consider giving this video a 👍. It helps me gauge what type of tutorials you like! Thank you! 🤗
For anyone having issues with the fields not showing up after the Parse JSON step: change your schema to allow nulls, but do it using this method (repeat the "anyOf" pattern for every column property - "Append1[Dash]" is just an example property name):
{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "Append1[Dash]": {
        "anyOf": [
          { "type": "string" },
          { "type": "null" }
        ]
      }
    }
  }
}
You saved me with this comment. I had a blank row at the end of my CSV and couldn't find anything that would get the JSON to work. Tried all different filter arrays to no avail. Then I saw this comment... life saver. Thank you, thank you, thank you!
LOVE LOVE LOVE it... Concise, quick, and no fluff...Thank you so much!
Worked! Thank you!! Had to listen at 0.75 to follow haha
You're welcome. Thanks for the feedback. I'm still working on my pacing. I do tend to go a bit fast. I'll work on slowing it down.
Hi, can you make a video to explain the procedure if you want to update a list from a csv instead of just adding items?
Many thanks for the great content!
Just stumbled upon this channel and instantly subscribed.
Amazing! Thank you for subscribing!
This is so great, really covering something I need to implement. However, I'm running into an issue that I think I've tracked down to my column headers starting in Row 2. Is there a way to specify that that is where the column headers are and to ignore Row 1, or a 'simple' automated way to delete Row 1 before starting with the various Compose actions that you've outlined in this video?
This is amazing and almost exactly what I am looking for! Only my CSV file does not have headers. I am very new to using power automate but is there some step I should be skipping to avoid the headers entirely to achieve this same result? I’d really appreciate the tip! Thank you!
fantastic video! a little fast at times but that's just me! every step worked and I learned quite a lot! thank you very much!!
Thanks for watching and for the feedback! I'm working on the pacing.... happy you got something from it! 😊
I love the speed because I watch it once at fast speed, then after I change the speed to 0.75 and bring up my Power Automate. This allows me to download, and then have guidance after with Firefox Picture-in-Picture mode on top 😉
... I was actually rather impressed and assumed you made it this fast for a reason LOL!
I NEEDED THIS SO MUCH TY!!!
Hello. This tutorial is great! Really appreciate you sharing the information. I have to filter a column in my spreadsheet prior to import. What would you recommend as the best way to do this?
Thanks for watching. When using an Excel table in a flow you use the Filter Query in the action. However, if you have more than one condition that you need to filter on, you'll have to use a Filter Array action. You can refer to this YT tutorial: ruclips.net/video/HVAf1aTEQNg/видео.htmlsi=1YJ2hxWMJ9jY4uVa
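For reference, a multi-condition Filter Array filter might look like this in advanced mode (a sketch with hypothetical column names Status and Region, assuming the array items are objects with those properties):
@and(equals(item()?['Status'], 'Active'), equals(item()?['Region'], 'West'))
Only rows passing both tests make it through to the rest of the flow.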
@@acreativeopinion thanks for the reply! I'll give it a shot.
This was really helpful, thank you!
Hi there! A fantastic tutorial, concise and easy to follow.
I would have one question for my specific dataset, when running the automation, I receive an error at the Create Item action that says "The 'inputs.parameters' of workflow operation 'Create_item' of type 'OpenApiConnection' is not valid. Error details: Input parameter 'item/Project_x0020_Start' is required to be of type 'String/date'. The runtime value '"16-05-2023"' to be converted doesn't have the expected format 'String/date'."
What should I do to resolve this issue?
Thank you so much! It looks like your date needs to be formatted. Take a look at this short and adjust your date format: ruclips.net/user/shortsHmxKckDdWQo
LMK if that works!
@@acreativeopinion Unfortunately no, I am now receiving this error:
InvalidTemplate. Unable to process template language expressions in action 'Compose_-_SOS_Formatted' inputs at line '0' and column '0': 'The template language function 'int' was invoked with a parameter that is not valid. The value cannot be converted to the target type.'.
@@Emil904 I'm so sorry. I made a mistake. The video I directed you to was for converting Excel dates for SharePoint. In your case, you will need to use this expression: formatDateTime([insert your date here], 'yyyy-MM-dd'). Give that a try.
@@acreativeopinion It states that the expression is invalid.
Are you inserting your date into the [insert your date] part of the formula? Try creating a Compose action to store your date from the CSV file, and then insert the output of that Compose action into the [insert your date] portion of the expression. See if that works.
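Worth noting: '16-05-2023' is in dd-MM-yyyy order, which formatDateTime may not parse directly. A possible workaround (a sketch, assuming a Compose action named 'Compose - Raw Date' holds the dd-MM-yyyy string from the CSV) is to rearrange the parts with split:
concat(last(split(outputs('Compose_-_Raw_Date'), '-')), '-', split(outputs('Compose_-_Raw_Date'), '-')[1], '-', first(split(outputs('Compose_-_Raw_Date'), '-')))
This turns '16-05-2023' into '2023-05-16', which a SharePoint date column accepts without further formatting.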
Great video, as always.
I have a csv in my OneDrive that is connected to PowerBi for data visualisation. The csv contains the data from our CRM which doesn't directly talk to PowerBi.
So right now, I export the data from CRM, open the existing csv that is connected to PowerBi, and replace the values except the column headings.
After watching your video, I am wondering if I could download the csv, right click and automate the list to SharePoint and change the PowerBi dataset to SharePoint list.
My question is, does it have to be the exact file name (as each export will have a date/time stamp on it), and when we import, does it replace the SharePoint list or add to the existing list? Or a much larger question - is there a better way to handle my situation? :(
Thanks for watching! I don't have much experience with Power BI (yet) because I use a MacBook Pro as my main device and the Power BI web app can't connect to a SharePoint list. Anyway, the file name can be anything you want. At 6:15 in the video I go over how to change the manual trigger to a Selected File trigger.
When you import it will add to the SharePoint list. When you run your CRM export are you exporting out unique data each time or is there some overlap? If there is some overlap... you will need a way to check SharePoint to see if the data already exists. This could be done using a unique identifier if you have it in your CSV export. If you import a unique identifier then you could run a check before adding to the SP list.
Another trigger that you could use is the When a File is Created trigger. This trigger would start the flow upon file creation (so you wouldn't need to manually select the file and then have the automation run). You could drag and drop your file into the folder or if you've synced your folder-simply save the CRM export right into that folder (this would be my recommendation). If you take this approach I would also recommend adding a column in your Document Library to hold a timestamp of when the file was imported into SharePoint.
I hope this helps!
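To sketch the unique-identifier check mentioned above (action and column names are assumptions, not from the video): inside the Apply to each, query SharePoint for the current row's ID and only create the item when nothing comes back.
Get items - Filter Query (assuming a SharePoint column named UniqueID and a Compose holding the row's ID):
UniqueID eq '@{outputs('Compose_-_Row_ID')}'
Condition (run Create item only if this is true):
@empty(body('Get_items')?['value'])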
Great video! Very easy to follow. What do I do if some of my data contains commas? I am processing CSV files attached to received emails and I cannot manipulate the CSV file before it hits Power Automate. Thanks!
Thanks for watching! I wish I could help. I had the same issue too with a different set of data and ended up opening up the CSV in Excel and doing a find/replace. Not ideal.
If you do happen to find a solution-pls drop it down below!
@@acreativeopinion Good to know I'm not alone! I thought I had a solution by using a Dataflow (ruclips.net/video/8IvHxRnwJ7Q/видео.html) but it turns out that I am locked out of this feature. Back to the drawing board!
And I had another idea based on ruclips.net/video/9J6ThPWGaG0/видео.html about converting .CSV to .XLSX using Office Scripts, but the script returns an error that a header contains more than 255 characters (again, likely due to commas and line breaks in the data). So many roadblocks when dealing with complex .CSVs.
This was great info. How would you handle an additional comma in a field?
Thank you! The easiest way would be to open the file in Excel and run a find replace in the specific column and replace it with a | or ;.
BRILLIANT! Thank you so much!
This exactly what I needed.. Thank you
You're welcome! Glad it helped.
Great video ♥
Awesome, Thank you!
Hi there - this is amazing, thank you - BUT I NEED SOME HELP. 05:03 - I am unable to see all the fields, only Item and Body. I was having an issue in my schema where I was getting "Invalid type. Expected String but got Null." I added null into the schema and was able to pass that step - I am not sure if that is now causing this issue. Please help? Thanks so much again :)
Can you pls post a topic in the Microsoft Power Automate Community: powerusers.microsoft.com/t5/Microsoft-Power-Automate/ct-p/MPACommunity.
It's really hard to troubleshoot in the YT comments section. Thx!
The last column header and its value contain a "\r" (carriage return). When I tried to map that column to a SharePoint column (text box), it is not accepting the value and not populating it. Please help with how to fix that.
I fixed it by adding the following in Compose - Split Rows, instead of using the carriage return from the Compose action:
split(outputs('ComposeCSVContent'),decodeUriComponent('%0D%0A'))
@@salainayaganarunmozhi9558 Hi, Salai! Thank you for the tip. I have the exact same problem, but if I add your code instead of an 'enter' in the compose, and use it to split, I get the error: "Unable to process template language expressions in action 'Compose - Split Rows' inputs at line '0' and column '0': 'The template language function 'split' expects its second parameter to be of type string. The provided value is of type 'Array'."
Did you add any extra steps??
I'm having the same issue now, @acreativeopinion... I am not sure how this issue turned up. I didn't change anything in these areas, but the extra \r on my column is now breaking things!! ACK!!
@@erichwill4839
1. I used "Get File Content using path" to read the .csv file
2. Next, I used "Compose" and used -- string(outputs('Get_file_content_using_path')?['body'])
3. Next, I used "Compose" and used -- split(outputs('Compose'), decodeUriComponent('%0D%0A'))
These 3 steps removed the \r from my output.
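If you'd rather keep it to a single action, steps 2 and 3 can likely be collapsed into one Compose (a sketch that just combines the expressions above, not a separately tested variant):
split(string(outputs('Get_file_content_using_path')?['body']), decodeUriComponent('%0D%0A'))
decodeUriComponent('%0D%0A') produces the two-character \r\n sequence, so splitting on it strips the stray \r along with the line break.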
@@sethdorr2812 please try the steps mentioned in reply to Erich
Thank you, thank you. This was very helpful.
I also told the members of my company to subscribe to the channel.
Hi! Your video is great and helped me so much but I too seem to be stuck. When adding the data to SharePoint an error appears when parsing the json - expected string got null. Tried adding null as a type but didn’t work…
This is because your array probably contains a null row.
Before the Compose action that gets the column names, add a 'Filter array' action with the following logic: @equals(empty(item()),false)
This should remove any null rows from your array, which is what's causing the Parse JSON action to break.
Thank you! In your Parse JSON schema, please try to edit:
"type": "string"
to
"type": ["string", "null"]
You'll need to do this for each property to allow it to accept a null value and string value.
Let me know if that helps!
@@acreativeopinion This fixed it for me. I was able to use the REPLACE all option within the little search box inside Power Automate, so it replaced all of them at once. Ran into an error within the actual next step, which I am figuring out now! But this got me farther!! Woot, ty!!
@@acreativeopinion
Now if I could only figure out how to make it so it updates EXISTING list entries... Is this possible?! It would be LEGENDARY if so!!
What do you think @acreativeopinion
Does the name from the CSV header need to be the same as the SharePoint list column name?
Thanks for watching. No they don't need to match.
Great stuff.
This is really fantastic, thanks so much. I am having an issue at Compose - Split Rows: it's adding an empty row ("") at the end. This seems to be added to the csv file, and it causes the Parse JSON for Portfolio Slicer to fail validation with "Invalid type. Expected String but got Null." This is the code used for Compose - Split Rows: split(outputs('Compose_Csv_Content'),outputs('Compose_Line_Break')). For the line break I just created the blank row as per the information. I could do with something that removes the empty row before continuing.
Thank you! In the Parse JSON schema, please try to edit:
"type": "string"
to
"type": ["string", "null"]
You'll need to do this for each property to allow it to accept a null value and string value.
Let me know if that helps!
@@acreativeopinion I have updated that, but now Parse JSON only has Item and Body available as options, so I can't choose my columns. When I put it back to "type": "string" the columns appear again, but the data doesn't parse because of the null value.
@@debbieedwards7267 Hmm.. I just tried it on my Parse JSON action and I'm still able to see the columns. Did you replace each instance of "type": "string" with "type":["string", "null"]? Make sure null is in lower case and within double quotes.
Another thing you could try is editing the Parse JSON schema AFTER you've selected your columns.
@@acreativeopinion The "type":["string", "null"] worked for me. I had the same thing. Thank you for this guide, it is really good.
@@acreativeopinion this does work, in a way. At least now the data from the csv is saved in the SharePoint list. However, there is also an empty entry added to the SharePoint list, because we accept the empty null row that is, for whatever reason, added to the csv data.
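One way to stop that empty entry from being created (a sketch, not from the video): inside the Apply to each, wrap Create item in a Condition that skips blank rows. In the condition's advanced/expression mode:
@not(empty(trim(string(items('Apply_to_each')))))
Put Create item in the If yes branch; blank rows then fall through the If no branch without touching SharePoint. The Filter array approach mentioned in another comment above works too and avoids running a condition per row.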
Will this work for >5000 items?
@acreativeopinion: I really appreciate your flow and steps! I used some of the comments to fix a couple of spots in the flow, but I am receiving this error: "ActionFailed. An action failed. No dependent actions succeeded." I think this is because there is a blank row at the end of my dataset. I've added type: "null" to the Parse JSON schema, which seems to work; however, the flow still reports as failed and I'm not sure why... Any ideas or help would be most appreciated!
Thanks for watching! It's really hard to troubleshoot without seeing your flow. Which action is failing? ....
@@acreativeopinion same issue for me
Parse JSON is failing with a "schema validation failed" error. For the last entry I am getting null values, but I checked the csv and it doesn't contain null values.
Yes, it is the exact same for me! @acreativeopinion @gavinobiern5361 Does this have something to do with the line break compose action? Maybe there is an extra line break added after the last row in the csv which is where the null row is created.
What I've found out is that my raw input csv, when composing the csv to string, already has a line break at the end of the last row, which must be the reason for the extra row being added. However, I am not yet sure where this line break comes from...
Is there a way to skip (or stop at) the null row at the end of the csv file? Otherwise it'll add blank rows to the SharePoint list every time it runs. 👀
Thanks for watching! I'd have to look into that. Is there always a blank row at the end of your file?
Any feedback on this? I am having the same issue @@acreativeopinion
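If the blank row is always the last element of the split output, one option (a sketch, assuming the rows come from a Compose named 'Compose - Split Rows') is to drop the final element with take() before looping:
take(outputs('Compose_-_Split_Rows'), sub(length(outputs('Compose_-_Split_Rows')), 1))
Feed that expression into the Apply to each instead of the raw Compose output. If the trailing blank isn't guaranteed, a Filter array that removes empty items is the safer choice.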
@acreativeopinion - Thank you for the explanation. I am unable to see the import (folder name) when I try to automate the csv file. Currently testing this with Excel 2016; not sure if that is the challenge. Please could you assist?
Not sure which part of the tutorial you are referring to. Is it 7:11? If not, can you pls specify which part of the tutorial you are referring to? Thx.
@@acreativeopinion yes it is from 7:11
@acreativeopinion Following up with you on this. Do you have any idea on the issue?
@@Yinyemez What is the trigger you are using for your flow? In order to see an option to run the flow from your Document Library, you need to use the For a selected file trigger.
@acreativeopinion that is exactly the trigger I am using, but the flow name doesn't appear when I click on the Automate options in SharePoint.
Hi, how do I add a condition to only add data from the csv file that doesn't already exist?
Thanks for watching. You would need to cross-check your CSV data against what already exists in SharePoint. You can refer to the YT Tutorial I uploaded on the Filter Array action: ruclips.net/video/HVAf1aTEQNg/видео.html.
In this tutorial I cover how to cross-reference data in Excel with a SharePoint list. The concept will apply to data in a CSV file as well. You will need a unique identifier that exists in your CSV data as well as your SP data so that you can run a cross-reference check.
Hope this helps!
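A bulk variant of that cross-reference check (a sketch with assumed action and column names, not the exact flow from the linked tutorial): pull the existing identifiers out of SharePoint once, then filter the CSV rows down to the ones that aren't in that list.
1. Get items (SharePoint list)
2. Select - Existing IDs: From = body('Get_items')?['value'], Map = item()?['UniqueID']
3. Filter array - New Rows: From = your CSV rows array, condition:
@not(contains(body('Select_-_Existing_IDs'), first(split(item(), ','))))
Here UniqueID is a hypothetical column and first(split(item(), ',')) assumes the identifier sits in the first CSV column. Loop over the Filter array output instead of the full row array.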
Does this have a limit for importing data?
Thanks for watching! I can't actually confirm as I've only used this flow with a small set of data. However, there are a couple considerations that you should keep in mind.
Max flow duration: If you have a lot of data in your CSV file, it may take a long time to process.
Actions per flow run: Depending on how many actions you need to run per row in your CSV, you may hit your limit (based on your plan).
Hope this helps!
Excellent!!!
Sigh. I'm getting \" added to every field in the csv after the first Compose... and a replace isn't removing it. Any ideas? Google is not helping :)
Have you tried to open your csv file in Excel and save it as the 'Comma Separated Values' format? Some platforms export out CSV files in CSV UTF-8 format.
Try to open it in Excel and "save as" and see if that helps.
@@acreativeopinion this won't work since it's a csv file generated by an API, so converting defeats the automation... trying the Encodian csv parsing to see if it does the same thing :)
I got an error in Parse JSON: validation failed.
And if the csv file is too large, how do I get the content?
My Parse JSON is not returning column names. Only Item and Body are showing in the Apply to each step. I'm stuck. Kindly help.
Have you found any solution? I am having a similar challenge.
Thanks for watching! When you view the last flow run, what does the Column Names Compose action (2:01) output? Also, in your CSV file, are your column names located in row 1 of your file or is there a blank row?
how do you copy paste it? lol, my columns are all the same.
Thanks for watching! Can you clarify what you mean? Are you looking to copy and paste your CSV data into a SP List? If so, open your CSV in Excel and copy/paste it from Excel into SharePoint.
You can refer to this section of a different tutorial: ruclips.net/video/hQ5Tw3UL7ik/видео.html
Can't thank you enough, but I had to lower the speed :D
Thanks for watching. This is one of my earlier videos and I was trying to figure out the pacing. I think I've figured out a good pace-check out my more recent tutorials.
If the data is already imported and then I run the flow, it fails.
Are you able to clarify which part of the flow is failing?
Watched several dozen times. Still too fast.
you can adjust the play speed to 0.8
Hi! Do you think I can easily transform my CSV from
{
  "body": [
    "1010007, I WANT",
    "1010007,TO CONCAT",
    "1010007,INTO ONE ROW",
    "1099999,AND WHY NOT",
    "1099999,THIS ONE ALSO",
    ...
  ]
}
into
{
  "body": [
    "1010007, I WANT TO CONCAT INTO ONE ROW",
    "1099999, AND WHY NOT THIS ONE ALSO",
    ...
  ]
}
I can't figure out how to do that and this is the final step I need! Thanks!!
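A possible approach (a sketch with assumed action names; Power Automate has no built-in group-by, so this builds one): map each row to its key, dedupe the keys, then join the matching text per key. It assumes each row has exactly one comma.
1. Select - Row Keys: From = your rows array, Map = first(split(item(), ','))
2. Compose - Distinct Keys: union(body('Select_-_Row_Keys'), body('Select_-_Row_Keys'))
3. Apply to each over the Distinct Keys output:
a. Filter array - Matching Rows: From = your rows array, condition: @equals(first(split(item(), ',')), items('Apply_to_each'))
b. Select - Row Text: From = body('Filter_array_-_Matching_Rows'), Map = trim(last(split(item(), ',')))
c. Compose: concat(items('Apply_to_each'), ', ', join(body('Select_-_Row_Text'), ' '))
union() of an array with itself returns its distinct values, which is what makes step 2 work. Each step 3c output is one combined row, e.g. "1010007, I WANT TO CONCAT INTO ONE ROW".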