Quickly Import CSV Data to a SharePoint List with Power Automate

  • Published: 5 Dec 2024

Comments • 110

  • @acreativeopinion
    @acreativeopinion  1 year ago +6

    Thanks for watching! If you found this tutorial helpful and plan to build this Microsoft Power Automate flow-please consider giving this video a 👍. It helps me gauge what type of tutorials you like! Thank you! 🤗

  • @techoffice-ep4zc
    @techoffice-ep4zc 7 months ago +4

    For anyone having issues with the fields not showing up after the Parse JSON step: change your schema to allow nulls, but do it using this method:
    {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "Append1[Dash]": {
            "anyOf": [
              { "type": "string" },
              { "type": "null" }
            ]
          },

    • @erinann7833
      @erinann7833 6 months ago +1

      You saved everything with this comment. I had a blank row at the end of my CSV and couldn't find anything that would get the JSON to work. Tried all different filter arrays to no avail. Then I saw this comment... life saver, thank you thank you thank you
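
A minimal sketch of the complete Parse JSON schema the comment above describes, assuming two placeholder CSV columns named Column1 and Column2 (swap in your own header names). Each property uses anyOf so both string and null values pass validation:

{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "Column1": { "anyOf": [ { "type": "string" }, { "type": "null" } ] },
      "Column2": { "anyOf": [ { "type": "string" }, { "type": "null" } ] }
    }
  }
}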

  • @Jay-mh1xh
    @Jay-mh1xh 1 year ago +1

    LOVE LOVE LOVE it... Concise, quick, and no fluff...Thank you so much!

  • @trevorknorpp5335
    @trevorknorpp5335 1 year ago +5

    Worked! Thank you!! Had to listen at 0.75 to follow haha

    • @acreativeopinion
      @acreativeopinion  1 year ago

      You're welcome. Thanks for the feedback. I'm still working on my pacing. I do tend to go a bit fast. I'll work on slowing it down.

  • @jonaskoniger3337
    @jonaskoniger3337 1 year ago +2

    Hi, can you make a video explaining the procedure if you want to only update a list from a CSV, rather than just adding items?
    Many thanks for the great content!

  • @FRANKWHITE1996
    @FRANKWHITE1996 9 months ago

    Just stumbled upon this channel and instantly subscribed.

  • @JacobGustafson-d9k
    @JacobGustafson-d9k 4 months ago

    This is so great, really covering something I need to implement. However, I'm running into an issue that I think I've tracked down to my column headers starting in Row 2. Is there a way to specify that that is where the column headers are and to ignore Row 1, or a 'simple' automated way to delete Row 1 before starting with the various Compose actions that you've outlined in this video?

  • @codyj8567
    @codyj8567 11 months ago

    This is amazing and almost exactly what I am looking for! Only my CSV file does not have headers. I am very new to using power automate but is there some step I should be skipping to avoid the headers entirely to achieve this same result? I’d really appreciate the tip! Thank you!

  • @seanmcd123
    @seanmcd123 1 year ago +1

    fantastic video! a little fast at times but that's just me! every step worked and I learned quite a lot! thank you very much!!

    • @acreativeopinion
      @acreativeopinion  1 year ago

      Thanks for watching and for the feedback! I'm working on the pacing.... happy you got something from it! 😊

    • @confusedseth
      @confusedseth 1 year ago +1

      I love the speed because I watch it one time fast speed, then after I change the speed to .75 and bring up my Power Automate. This allows me to download, and then have guidance after with FireFox Picture in Picture mode on top 😉
      ... I was actually rather impressed and assumed you made it this fast for a reason LOL!
      I NEEDED THIS SO MUCH TY!!!

  • @dankirchner4460
    @dankirchner4460 1 year ago

    Hello. This tutorial is great! Really appreciate you sharing the information. I have to filter a column in my spreadsheet prior to import. What would you recommend as the best way to do this?

    • @acreativeopinion
      @acreativeopinion  1 year ago +1

      Thanks for watching. When using an Excel table in a flow, you can use the Filter Query field in the action. However, if you have more than one condition that you need to filter on, you'll have to use a Filter Array action. You can refer to this YT tutorial: ruclips.net/video/HVAf1aTEQNg/видео.htmlsi=1YJ2hxWMJ9jY4uVa

    • @dankirchner4460
      @dankirchner4460 1 year ago

      @@acreativeopinion thanks for the reply! I'll give it a shot.
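
A minimal sketch of the multi-condition Filter Array approach mentioned in the reply above, assuming hypothetical CSV columns named Status and Region. The parsed rows go into the action's "From" field and the expression below goes into its advanced mode:

@and(
  equals(item()?['Status'], 'Active'),
  equals(item()?['Region'], 'West')
)

Rows that fail either condition are dropped before the Apply to each loop creates SharePoint items.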

  • @VioletFlicker
    @VioletFlicker 1 year ago +1

    This was really helpful, thank you!

  • @Emil904
    @Emil904 1 year ago +1

    Hi there! A fantastic tutorial, concise and easy to follow.
    I would have one question for my specific dataset, when running the automation, I receive an error at the Create Item action that says "The 'inputs.parameters' of workflow operation 'Create_item' of type 'OpenApiConnection' is not valid. Error details: Input parameter 'item/Project_x0020_Start' is required to be of type 'String/date'. The runtime value '"16-05-2023"' to be converted doesn't have the expected format 'String/date'."
    What should I do to resolve this issue?

    • @acreativeopinion
      @acreativeopinion  1 year ago +1

      Thank you so much! It looks like your date needs to be formatted. Take a look at this short and adjust your date format: ruclips.net/user/shortsHmxKckDdWQo
      LMK if that works!

    • @Emil904
      @Emil904 1 year ago

      @@acreativeopinion Unfortunately no, I am now receiving this error:
      InvalidTemplate. Unable to process template language expressions in action 'Compose_-_SOS_Formatted' inputs at line '0' and column '0': 'The template language function 'int' was invoked with a parameter that is not valid. The value cannot be converted to the target type.'.

    • @acreativeopinion
      @acreativeopinion  1 year ago

      @@Emil904 I'm so sorry. I made a mistake. That video I directed you to was for converting Excel dates for SharePoint. In your case, you will need to use this expression: formatDateTime([insert your date here], 'yyyy-MM-dd'). Give that a try.

    • @Emil904
      @Emil904 1 year ago

      @@acreativeopinion It states that the expression is invalid.

    • @acreativeopinion
      @acreativeopinion  1 year ago

      Are you inserting your date into the [insert your date] part of the formula? Try creating a compose action to store your date from the CSV file, and then insert the output of that compose action into the [insert your date] portion of the expression. See if that works.
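
A minimal sketch of the Compose approach suggested above, assuming a hypothetical Compose action named "Compose - CSV Date" that holds the raw date from the CSV:

formatDateTime(outputs('Compose_-_CSV_Date'), 'yyyy-MM-dd')

Note that formatDateTime only accepts values it can parse as a date; a dd-MM-yyyy string such as "16-05-2023" may be rejected, in which case rearranging the parts directly avoids parsing altogether:

concat(last(split(outputs('Compose_-_CSV_Date'), '-')), '-', split(outputs('Compose_-_CSV_Date'), '-')[1], '-', first(split(outputs('Compose_-_CSV_Date'), '-')))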

  • @brophy1443
    @brophy1443 1 year ago

    Great video, as always.
    I have a csv in my OneDrive that is connected to PowerBi for data visualisation. The csv contains the data from our CRM which doesn't directly talk to PowerBi.
    So right now, I export the data from CRM, open the existing csv that is connected to PowerBi and replace the values except the column headings.
    After watching your video, I am wondering if I could download the csv, right click and automate the list to SharePoint and change the PowerBi dataset to SharePoint list.
    My question is, does it have to be the exact file name (as each export will have a date/time stamp on it) and when we import - does it replace the SharePoint list or add to the existing list? Or a much larger question - is there a better way to handle my situation? :(

    • @acreativeopinion
      @acreativeopinion  1 year ago +1

      Thanks for watching! I don't have much experience with Power BI (yet) because I use a Macbook Pro as my main device and the Power BI web app can't connect to a SharePoint list. Anyway, the file name can be anything you want. At 6:15 in the video I go over how to change the manual trigger to a Selected File Trigger.
      When you import it will add to the SharePoint list. When you run your CRM export are you exporting out unique data each time or is there some overlap? If there is some overlap... you will need a way to check SharePoint to see if the data already exists. This could be done using a unique identifier if you have it in your CSV export. If you import a unique identifier then you could run a check before adding to the SP list.
      Another trigger that you could use is the When a File is Created trigger. This trigger would start the flow upon file creation (so you wouldn't need to manually select the file and then have the automation run). You could drag and drop your file into the folder or if you've synced your folder-simply save the CRM export right into that folder (this would be my recommendation). If you take this approach I would also recommend adding a column in your Document Library to hold a timestamp of when the file was imported into SharePoint.
      I hope this helps!

  • @craig.galbraith
    @craig.galbraith 1 year ago

    Great video! Very easy to follow. What do I do if some of my data contains commas? I am processing CSV files attached to received emails and I cannot manipulate the CSV file before it hits Power Automate. Thanks!

    • @acreativeopinion
      @acreativeopinion  1 year ago +1

      Thanks for watching! I wish I could help. I had the same issue too with a different set of data and ended up opening up the CSV in Excel and doing a find/replace. Not ideal.
      If you do happen to find a solution-pls drop it down below!

    • @craig.galbraith
      @craig.galbraith 1 year ago

      @@acreativeopinion Good to know I'm not alone! I thought I had a solution by using a DataFlow (ruclips.net/video/8IvHxRnwJ7Q/видео.html) but it turns out that I am locked out of this feature. Back to the drawing board!

    • @craig.galbraith
      @craig.galbraith 1 year ago

      and I had another idea based on ruclips.net/video/9J6ThPWGaG0/видео.html about converting .CSV to .XLSX using Office Scripts, but the script returns an error that a header contains more than 255 characters (again, likely due to commas and line breaks in the data). So many roadblocks when dealing with complex .CSVs.

  • @damiennovelo6720
    @damiennovelo6720 1 year ago

    This was great info. How would you handle an additional comma in a field?

    • @acreativeopinion
      @acreativeopinion  1 year ago

      Thank you! The easiest way would be to open the file in Excel and run a find/replace on the specific column, replacing the comma with a | or ;.

  • @shamaiemaria4217
    @shamaiemaria4217 1 year ago

    BRILLIANT! Thank you so much!

  • @sundeepm814
    @sundeepm814 1 year ago

    This exactly what I needed.. Thank you

  • @jonaskarlsson477
    @jonaskarlsson477 7 months ago

    Great video ♥

  • @c.mdsalih
    @c.mdsalih 8 months ago

    Awesome, Thank you!

  • @pieteresterhuizen1551
    @pieteresterhuizen1551 1 year ago

    Hi There - This is amazing, Thank you - BUT I NEED SOME HELP. 05:03 - I am unable to see all the fields. Only Item and Body. I was having an issue in my Schema where I was getting "Invalid type. Expected String but got Null." I added null into the Schema and was able to pass that step - I am not sure if that is now causing this issue. Please help? Thanks so much again :)

    • @acreativeopinion
      @acreativeopinion  1 year ago

      Can you pls post a topic in the Microsoft Power Automate Community: powerusers.microsoft.com/t5/Microsoft-Power-Automate/ct-p/MPACommunity.
      It's really hard to troubleshoot in the YT comments section. Thx!

  • @salainayaganarunmozhi9558
    @salainayaganarunmozhi9558 1 year ago +2

    The last column header and value have a "\r" (carriage return). When I tried to map that column to a SharePoint column (text box), it is not accepting the value and not populating the value. Please help how to fix that.

    • @salainayaganarunmozhi9558
      @salainayaganarunmozhi9558 1 year ago +5

      I fixed it by using the following in Compose - Split Rows, instead of using the carriage return from Compose:
      split(outputs('ComposeCSVContent'),decodeUriComponent('%0D%0A'))

    • @erichwill4839
      @erichwill4839 1 year ago +1

      @@salainayaganarunmozhi9558 Hi, Salai! Thank you for the tip. I have the absolute same problem, but if I add your code instead of an 'enter' in the compose, and use it to split, i get the error: "Unable to process template language expressions in action 'Compose - Split Rows' inputs at line '0' and column '0': 'The template language function 'split' expects its second parameter to be of type string. The provided value is of type 'Array'."
      Did you add any extra steps??

    • @sethdorr2812
      @sethdorr2812 1 year ago

      I'm having the same issue now, @acreativeopinion ... I am not sure how this issue turned up. I didn't change anything in these areas, but the extra \r on my column is now breaking things!! ACK!!

    • @salainayaganarunmozhi9558
      @salainayaganarunmozhi9558 1 year ago +1

      @@erichwill4839
      1. I used "Get File Content using path" to read the .csv file
      2. Next, I used "Compose" and used -- string(outputs('Get_file_content_using_path')?['body'])
      3. Next, I used "Compose" and used -- split(outputs('Compose'), decodeUriComponent('%0D%0A'))
      The above 3 steps remove \r from my output.

    • @salainayaganarunmozhi9558
      @salainayaganarunmozhi9558 1 year ago +1

      @@sethdorr2812 please try the steps mentioned in reply to Erich
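
A minimal sketch of the two Compose actions described in the replies above, assuming the file is read with "Get file content using path" and the actions are named "Compose - CSV Content" and "Compose - Split Rows" (adjust the outputs() references to match your own action names):

Compose - CSV Content:  string(outputs('Get_file_content_using_path')?['body'])
Compose - Split Rows:   split(outputs('Compose_-_CSV_Content'), decodeUriComponent('%0D%0A'))

Splitting on %0D%0A (CRLF) strips the trailing \r in the same step, so no separate line-break Compose is needed; feeding in a Compose that itself contains a split(...) as the delimiter is what produces the "second parameter to be of type string" error quoted above.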

  • @negima.3
    @negima.3 1 year ago +1

    Thank you, thank you, this was very helpful.
    I've also told the members of my company to subscribe to the channel.

  • @gabrielapetrova2251
    @gabrielapetrova2251 1 year ago +1

    Hi! Your video is great and helped me so much but I too seem to be stuck. When adding the data to SharePoint an error appears when parsing the json - expected string got null. Tried adding null as a type but didn’t work…

    • @jdwaterman1
      @jdwaterman1 1 year ago +1

      This is because your array probably contains a null row.
      Before the Compose action that gets the column names, add a 'Filter array' action with the following logic: @equals(empty(item()),false)
      This should remove any null rows from your array, which is what is causing the Parse JSON action to break

    • @acreativeopinion
      @acreativeopinion  1 year ago +2

      Thank you! In your Parse JSON schema, please try to edit:
      "type": "string"
      to
      "type": ["string", "null"]
      You'll need to do this for each property to allow it to accept a null value and string value.
      Let me know if that helps!

    • @confusedseth
      @confusedseth 1 year ago

      @@acreativeopinion This fixed it for me, I was able to use the REPLACE all option too within the little search box inside Power automate so it replaced all of them at once.. ran into an error within the actual next step though I am figuring out now! But this got me farther!! Woot ty!!

    • @confusedseth
      @confusedseth 1 year ago

      @@acreativeopinion
      Now if I could only figure out how to make it so it updates EXISTING list entries... Is this possible?! It would be LEGENDARY if so!!
      What do you think @acreativeopinion
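
A minimal sketch of the Filter array fix suggested above, placed between the action that splits the CSV into rows and the Parse JSON action. The row array goes into the "From" field and the expression below goes into the filter's advanced mode:

@equals(empty(item()), false)

Completely empty rows are dropped, so Parse JSON no longer sees a trailing null entry and no blank item gets created in the SharePoint list.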

  • @jonaskarlsson477
    @jonaskarlsson477 7 months ago

    Does the name from the CSV header need to be the same as the SharePoint list name?

    • @acreativeopinion
      @acreativeopinion  6 months ago

      Thanks for watching. No they don't need to match.

  • @jessefavela3694
    @jessefavela3694 1 year ago

    Great stuff.

  • @debbieedwards7267
    @debbieedwards7267 1 year ago +1

    This is really fantastic. Thanks so much. I am having an issue at Compose Split Rows. It's adding an empty row ("") at the end. This seems to be added to the CSV file and it causes the Parse JSON for Portfolio Slicer to fail validation with "Invalid type. Expected String but got Null." This is the code used for Compose Split Rows: split(outputs('Compose_Csv_Content'),outputs('Compose_Line_Break')). For the line break I just created the blank row as per the information. I could do with something that removes the empty row before continuing.

    • @acreativeopinion
      @acreativeopinion  1 year ago +1

      Thank you! In the Parse JSON schema, please try to edit:
      "type": "string"
      to
      "type": ["string", "null"]
      You'll need to do this for each property to allow it to accept a null value and string value.
      Let me know if that helps!

    • @debbieedwards7267
      @debbieedwards7267 1 year ago

      @@acreativeopinion I have updated that, but now Parse JSON only has Item and Body available as options, so I can't choose my columns. When I put it back to "type": "string" the columns appear again, but the data doesn't parse because of the null value.

    • @acreativeopinion
      @acreativeopinion  1 year ago

      ​@@debbieedwards7267 Hmm.. I just tried it on my Parse JSON action and I'm still able to see the columns. Did you replace each instance of "type": "string" with "type":["string", "null"]? Make sure null is in lower case and within double quotes.
      Another thing you could try is editing the Parse JSON schema AFTER you've selected your columns.

    • @Michael.D-bx9ty
      @Michael.D-bx9ty 1 year ago

      @@acreativeopinion The "type":["string", "null"] worked for me. I had the same thing. Thank you for this guide, it is really good.

    • @hvithrvat2917
      @hvithrvat2917 1 year ago

      @@acreativeopinion this does work, in a way. At least now the data from the CSV is saved in the SharePoint list. However, now there is also an empty entry added to the SharePoint list, because we accept the empty null row that is, for whatever reason, added to the CSV data.

  • @louisdeguzman2447
    @louisdeguzman2447 9 months ago

    Will this work for >5000 items?

  • @gavinobrien5361
    @gavinobrien5361 1 year ago

    @acreativeopinion: I really appreciate your flow and steps! I used some of the comments to fix a couple of spots in the flow, but I am receiving this error: "ActionFailed. An action failed. No dependent actions succeeded." I think this is because there is a blank row at the end of my dataset. I've added type: "null" to the Parse JSON schema, which seems to work; however, the flow still reports as failed and I'm not sure why... Any ideas or help would be most appreciated.

    • @acreativeopinion
      @acreativeopinion  1 year ago

      Thanks for watching! It's really hard to troubleshoot without seeing your flow. Which action is failing? ....

    • @raveenamathkar1942
      @raveenamathkar1942 1 year ago

      @@acreativeopinion same issue for me

    • @raveenamathkar1942
      @raveenamathkar1942 1 year ago

      Parse JSON is failing with a schema validation error: for the last entry I am getting null values, but I checked the CSV and it doesn't contain a null value.

    • @hvithrvat2917
      @hvithrvat2917 1 year ago

      Yes, it is the exact same for me! @acreativeopinion @gavinobiern5361 Does this have something to do with the line break compose action? Maybe there is an extra line break added after the last row in the csv which is where the null row is created.

    • @hvithrvat2917
      @hvithrvat2917 1 year ago

      What I've found out is that my raw input CSV, when composing the CSV to string, already has a line break at the end of the last row, which must be the reason for the extra row being added. However, I am not yet sure where this line break comes from...

  • @ybarsabal
    @ybarsabal 1 year ago

    Is there a way to skip (or stop at) the null row at the end of the csv file? Otherwise it'll add blank rows to the SharePoint list every time it runs. 👀

    • @acreativeopinion
      @acreativeopinion  1 year ago

      Thanks for watching! I'd have to look into that. Is there always a blank row at the end of your file?

    • @RicardoFilho-fe5fs
      @RicardoFilho-fe5fs 1 year ago

      Any feedback on this? I am having the same issue@@acreativeopinion

  • @Yinyemez
    @Yinyemez 1 year ago

    @acreativeopinion - Thank you for the explanation. I am unable to see the import (folder name) when I try to automate the CSV file. Currently testing this with Excel 2016, not sure if that is the challenge. Please could you assist?

    • @acreativeopinion
      @acreativeopinion  1 year ago

      Not sure which part of the tutorial you are referring to. Is it 7:11? If not, can you pls specify which part of the tutorial you are referring to? Thx.

    • @Yinyemez
      @Yinyemez 1 year ago

      @@acreativeopinion yes it is from 7:11

    • @Yinyemez
      @Yinyemez 1 year ago

      @acreativeopinion Following up with you on this. Do you have any idea on the issue?

    • @acreativeopinion
      @acreativeopinion  1 year ago

      @@Yinyemez What is the trigger you are using for your flow? In order to see an option to run the flow from your Document Library, you need to use the For a selected file trigger.

    • @Yinyemez
      @Yinyemez 1 year ago

      @acreativeopinion that is exactly the trigger that I am using, but the flow name doesn't appear or show when I click on Automate options on SharePoint

  • @abcff8776
    @abcff8776 1 year ago

    Hi, how do I add a condition to only add data from the CSV file that doesn't already exist?

    • @acreativeopinion
      @acreativeopinion  1 year ago

      Thanks for watching. You would need to cross-check your CSV data against what already exists in SharePoint. You can refer to the YT Tutorial I uploaded on the Filter Array action: ruclips.net/video/HVAf1aTEQNg/видео.html.
      In this tutorial I cover how to cross-reference data in Excel with a SharePoint list. The concept will apply to data in a CSV file as well. You will need a unique identifier that exists in your CSV data as well as your SP data so that you can run a cross-reference check.
      Hope this helps!
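
A minimal sketch of the cross-reference check described above, assuming a hypothetical unique identifier column named EmployeeID that exists in both the CSV and the SharePoint list. Inside the Apply to each loop, before the Create item action:

Get items (SharePoint), Filter Query:  EmployeeID eq '<EmployeeID from the current CSV row>'
Condition (expression):                empty(outputs('Get_items')?['body/value'])

If the condition is true (no match found), run Create item; otherwise skip the row. The Filter Query must use the column's internal name, and one Get items call per CSV row can be slow for large files.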

  • @NéstorSantiagoAvendaño
    @NéstorSantiagoAvendaño 1 year ago

    Does this have a limit for importing data?

    • @acreativeopinion
      @acreativeopinion  1 year ago

      Thanks for watching! I can't actually confirm as I've only used this flow with a small set of data. However, there are a couple considerations that you should keep in mind.
      Max flow duration: If you have a lot of data in your CSV file, it may take a long time to process.
      Actions per flow run: Depending on how many actions you need to run per row in your CSV, you may hit your limit (based on your plan).
      Hope this helps!

  • @TutoPlaysAndroid
    @TutoPlaysAndroid 1 year ago

    Excellent!!!

  • @MrScar74
    @MrScar74 1 year ago

    Sigh. I'm getting \" added to every field in the CSV after the first Compose... and a replace isn't removing it. Any ideas? Google is not helping :)

    • @acreativeopinion
      @acreativeopinion  1 year ago

      Have you tried to open your csv file in Excel and save it as the 'Comma Separated Values' format? Some platforms export out CSV files in CSV UTF-8 format.
      Try to open it in Excel and "save as" and see if that helps.

    • @MrScar74
      @MrScar74 1 year ago

      @@acreativeopinion this won't work since it's a CSV file generated by an API, so converting defeats the automation... trying the Encodian CSV parsing to see if it does the same thing :)

  • @zausanawng3570
    @zausanawng3570 10 months ago +1

    I got an error in Parse JSON: validation failed.

  • @karengc5895
    @karengc5895 10 months ago

    And if the CSV file is too large, how do I get the content?

  • @krishnasai8605
    @krishnasai8605 1 year ago

    My Parse JSON is not returning column names. It is only showing Item and Body in the Apply to each step. I'm stuck. Kindly help.

    • @muonekemichael6260
      @muonekemichael6260 1 year ago

      Have you found any solution? I am having a similar challenge.

    • @acreativeopinion
      @acreativeopinion  1 year ago

      Thanks for watching! When you view the last flow run, what does the Column Names Compose action (2:01) output? Also, in your CSV file, are your column names located in row 1 of your file or is there a blank row?

  • @toddh3147
    @toddh3147 9 months ago

    how do you copy paste it? lol, my columns are all the same.

    • @acreativeopinion
      @acreativeopinion  9 months ago

      Thanks for watching! Can you clarify what you mean? Are you looking to copy and paste your CSV data into a SP List? If so, open your CSV in Excel and copy/paste it from Excel into SharePoint.
      You can refer to this section of a different tutorial: ruclips.net/video/hQ5Tw3UL7ik/видео.html

  • @tarekyeh6245
    @tarekyeh6245 10 months ago

    Can't thank you enough, but had to lower the speed :D

    • @acreativeopinion
      @acreativeopinion  10 months ago

      Thanks for watching. This is one of my earlier videos and I was trying to figure out the pacing. I think I've figured out a good pace-check out my more recent tutorials.

  • @MdHasnainr
    @MdHasnainr 1 year ago

    If data is already imported and I then run the flow, it fails.

    • @acreativeopinion
      @acreativeopinion  1 year ago

      Are you able to clarify which part of the flow is failing?

  • @jaycrump6341
    @jaycrump6341 7 months ago

    Watched several dozen times. Still too fast.

    • @montiavi
      @montiavi 7 months ago

      you can adjust the play speed to 0.8

  • @gregoirepesenti
    @gregoirepesenti 3 months ago

    Hi! Do you think I can easily transform my CSV from
    {
      "body": [
        "1010007, I WANT",
        "1010007,TO CONCAT",
        "1010007,INTO ONE ROW",
        "1099999,AND WHY NOT",
        "1099999,THIS ONE ALSO",
        ...
      ]
    }
    into
    {
      "body": [
        "1010007, I WANT TO CONCAT INTO ONE ROW",
        "1099999, AND WHY NOT THIS ONE ALSO",
        ...
      ]
    }
    I can't figure out how to do that and this is the final step I need! Thanks!!