Power Automate Tutorial: Detecting Duplicate Rows in Multiple Data Sources with Power Platform

  • Published: 18 Oct 2024

Comments • 69

  • @danielatiparu4460
    @danielatiparu4460 8 months ago +3

    I don't think I've ever watched such a clear and thorough explanation! It works perfectly!

    • @DamoBird365
      @DamoBird365  8 months ago

      Thank you Daniel, it's really appreciated when I get feedback like this. Make sure you have entered my competition for reaching 10,000 subscribers if you haven't already - details in the community tab of my channel homepage.

  • @joelbicoy
    @joelbicoy 5 months ago +1

    This technique has helped reduce processing time compared with using loops. Thank you Big Time!

  •  1 year ago +1

    Deduplication is a pretty common use case. That was my first PAD script: it copied to the clipboard a list of the unique items that were duplicated in the source file. Great demo as always. Such a time saver. Keep them coming Damien. Thanks for posting! #repurposeTriggered

  • @dangage8819
    @dangage8819 10 months ago

    This was Fantastic! Absolutely perfect. Still have some issues to work out, but this was a huge jump start for me. Thank you for all you do. I'm going to check out your other tutorials in the coming weeks.

  • @A-broken-clay-jar
    @A-broken-clay-jar 1 year ago +1

    I was learning how to remove records, uploaded from Excel to a List, that are no longer in Excel; they used JSON, but I didn't get it. I am excited to watch your approach. I have subscribed to your channel. Thank you!

    • @DamoBird365
      @DamoBird365  1 year ago

      I hope you learn something, you’ve been busy today 👍

  • @juanmanuelaldanatriana71
    @juanmanuelaldanatriana71 1 year ago +1

    Great Video! Thank you for sharing, this knowledge is gold! Greetings from Colombia.

    • @DamoBird365
      @DamoBird365  1 year ago

      Thanks Juan. Hope it gave you some ideas.

  • @Summane
    @Summane 4 months ago +1

    Thank you! I was able to create my flow with the same logic 💯

  • @chrisgoldthorpe9304
    @chrisgoldthorpe9304 1 year ago +2

    Brilliant video for something I’ve wanted to use at work! Thanks so much! If you need an idea for a future helpful video, how about migrating data from Excel to SharePoint lists? In particular date data: I’m always running into issues with date formats in Excel not being recognised when creating items in SharePoint lists! 😅

    • @DamoBird365
      @DamoBird365  1 year ago

      Interesting. There is the date setting on the Excel action. What problem are you encountering?

    • @chrisgoldthorpe9304
      @chrisgoldthorpe9304 1 year ago

      @@DamoBird365 Thanks for the reply! Yes, I’m aware of that in the advanced settings of the Excel action and I’ve tried switching it to ISO 8601, but I still encounter issues. The columns in the Excel table are formatted as dates. I’ve tried parseDateTime expressions, converting the dates to integers and adding them to 1899-12-31… I eventually get things to work, but it’s by a long, convoluted trial-and-error process, and I’d much rather just be able to map a date-formatted column in Excel straight into a date-formatted SharePoint list column, if I’m not being over-simplistic!

  • @kevinwarren1955
    @kevinwarren1955 1 month ago

    Hi Damien, this is brilliant and helped me solve an issue I am dealing with. Do you know if you can use this method and also return the ID of the original matching record? That way you could have a bit of traceability: for example, before the delete, you could export this to CSV and display the duplicate record alongside an additional column that contains the original record it is a duplicate of. That would be really handy for one last step of validation before doing the delete.

  • @sanilschithambaran7268
    @sanilschithambaran7268 8 months ago +1

    Hi, thank you so much for the nice presentation on finding duplicates. As many people have mentioned, your explanation of the topic was awesome. I have a query: I have a SharePoint List which holds data from invoices extracted using the AI feature in Power Automate. How do I find duplicate entries prior to saving the invoice data in this SharePoint List?

    • @DamoBird365
      @DamoBird365  8 months ago

      You might manage with just using union on itself to remove duplicates. This idea was born from an issue with the CoE duplicating data.
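
      For illustration, a minimal sketch of that idea, assuming the rows have already been shaped into an array by a Select action (the action name is illustrative, not from the video):

      union(body('Select'), body('Select'))

      union() removes duplicate entries when combining collections, so unioning the array with itself returns only the distinct rows, which could then be checked against the incoming invoice before it is saved to the list.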

  • @AnthonyLewis-f2c
    @AnthonyLewis-f2c 3 months ago

    Hi, Great content. Do you have any steps that would update a SharePoint List when duplicate records are found?

  • @patrickmoore6134
    @patrickmoore6134 3 months ago +1

    Hi Damien, not the first time you've come to the rescue for me. Do you have any suggestions if there are more than 100,000 records?

    • @DamoBird365
      @DamoBird365  2 months ago

      Probably dataflows? I’m not sure otherwise. Some volume!

  • @AnthonyLewis-f2c
    @AnthonyLewis-f2c 3 months ago

    Hi, Great Content. Do you have any steps that would allow me to update the metadata fields for those duplicate records?

  • @hotelranimahaljaipur9380
    @hotelranimahaljaipur9380 1 year ago +1

    Helpful video. Please also make a video on how to migrate on-premises data to the cloud using Power Automate.

    • @DamoBird365
      @DamoBird365  1 year ago

      Hi, thanks for your suggestion. Can you explain what onPrem data you want to migrate? I don’t have a video on the gateway tool yet.

  • @GMarshll
    @GMarshll 1 month ago +1

    I have a quick question regarding the current flow we’re working with. At the moment, it seems that the flow counts the duplicates within a column. However, instead of counting these duplicates, we need to sum the total of the column. Is this something that can be accommodated within our existing flow? If so, could you please provide guidance on how to implement this adjustment?
    Thank you in advance for your help. I look forward to your response.

    • @DamoBird365
      @DamoBird365  1 month ago +1

      Something like: Power Automate - Fast Data Aggregation - Group By, Sum, Count #powerautomate
      ruclips.net/video/z5MxbwURV68/видео.html

    • @GMarshll
      @GMarshll 1 month ago

      Love it!

  • @Carol_1998
    @Carol_1998 6 months ago

    Thank you for the video and for the help! It really helps me a lot. I would just like to know how to proceed with deleting the duplicate items in SharePoint. When using Delete Rows it gives errors. Could you help me?

  • @vjthanasekaran6752
    @vjthanasekaran6752 11 months ago

    Hi, great video.
    Is there a way to get all the duplicate records instead of ignoring the first (e.g. if there are 3 duplicate records, I need all 3)?

  • @seang2012
    @seang2012 3 months ago +1

    Question: how would you adapt this for a very large Dataverse table, say greater than 100,000 rows? Are there limits on those variables for holding all that data?

    • @DamoBird365
      @DamoBird365  3 months ago

      You might have to batch it. You could explore the chunk expression. I’ve not worked with volumes larger than this.
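
      A rough sketch of the chunk() idea, assuming the Dataverse rows are already in an array output named 'Compose_rows' (a hypothetical name; the batch size is only an example):

      chunk(outputs('Compose_rows'), 5000)

      chunk() splits one large array into an array of smaller arrays that an outer apply to each could then process batch by batch, though duplicates that straddle a batch boundary would need extra handling.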

  • @leehualeong3498
    @leehualeong3498 1 month ago

    Hello! 👋🏻 I’m new to Power Automate, and thanks for your detailed explanation… I am following every step and it runs successfully, it just doesn’t delete the duplicates… may I know if I need to insert a “Delete item” action after the compose item()?['id']?

    • @CourtneyRosenkoetter
      @CourtneyRosenkoetter 24 days ago

      I am new as well and I cannot figure out how to delete. I tried item()?['id'] and I could not get it to work.

  • @GraceMarshall-se4mi
    @GraceMarshall-se4mi 6 months ago +1

    Love this flow. One request: I would like to identify all of the duplicates as "true". Currently only the second row of each duplicate pair shows as "true" rather than both rows showing "true", so I need both the row above and the row below to be true. Is this possible?

    • @DamoBird365
      @DamoBird365  6 months ago +1

      Could you add another key value where the value below is equal to this value and then filter on true for both values?

    • @GMarshll
      @GMarshll 6 months ago

      I think that would work, would you be able to copy the code that you would recommend?
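
      As an illustration only (not confirmed code from the video), assuming the sorted rows sit in a Compose action and the comparison key is the 'concat' property from the first Select, a second flag looking at the row below could be added in the same Select 2 map:

      if(equals(outputs('Compose')?[item()]?['concat'], outputs('Compose')?[add(item(),1)]?['concat']), 'T', 'F')

      The ? on the index lookup is there so the last row returns null instead of erroring, and filtering on either flag being 'T' should then surface both the row above and the row below of each duplicate pair.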

  • @sleepyrick
    @sleepyrick 6 months ago

    How would you go about this when comparing only 2 columns? Is the 'sort' function necessary?

  • @caseyalderson2780
    @caseyalderson2780 1 year ago +1

    Thanks for the video. How do you tie back for deletion? I'm trying to use the Delete a row function for Excel and getting the following error when attempting to use the output of Compose 3 as my "Key Value" in the Delete a row inputs: "
    Flow save failed with code 'InvalidTemplate' and message 'The template validation failed: 'The action(s) 'Compose 3' referenced by 'inputs' in action 'Delete_a_row' are not defined in the template.'.'."

    • @DamoBird365
      @DamoBird365  1 year ago +1

      Should it be compose_3?

    • @caseyalderson2780
      @caseyalderson2780 1 year ago

      Yep🤦‍♂

    • @rikifrihcanal
      @rikifrihcanal 1 year ago

      Hi, what should I use for the key column? The concat used in “Select”, or something else? Thanks

    • @dangage8819
      @dangage8819 10 months ago

      @@rikifrihcanal the Key should be the ID of the row.
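
      A hedged illustration of that wiring for Excel (the column name is an assumption, not from the video), inside the apply to each over the filtered duplicates:

      Delete a row: Key Column = ID, Key Value = item()?['ID']

      Each duplicate is then removed by its own row identifier rather than by the concatenated duplicate key.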

  • @kamalakarreddyuppala3772
    @kamalakarreddyuppala3772 1 year ago +1

    Thanks for your video, it was very helpful for me.
    Can you cover how to add SQL data into SharePoint?

    • @DamoBird365
      @DamoBird365  1 year ago

      I can add it to a list but I don't have immediate plans for this - what is it you are looking to see / achieve via SQL? Azure SQL or SQL onPrem?

  • @alvi_universe
    @alvi_universe 1 year ago

    Thanks for the great video! This has helped me a lot. I have rebuilt it 1:1 and it works very well! However, it deletes the "most recent" entry and keeps the oldest. I need it the other way around: if there are duplicate records, it should delete the older ones and keep the newest record. Do you have any idea?

    • @DamoBird365
      @DamoBird365  1 year ago

      This should delete the oldest. Have you sorted the array?

    • @alvi_universe
      @alvi_universe 1 year ago

      @@DamoBird365 I have tried everything! I deleted the sort and put it back in! It always deletes the most recent entries!

    • @dangage8819
      @dangage8819 10 months ago

      @@alvi_universe Just based on my own testing of this today (after finding this wonderfulness), you may want to flip the T and F from the KeepMe to the DeleteME and see what happens with that. It shouldn't be an issue, but it's something I had to do while figuring out a couple of other things. I went off script, and flipping those around solved the issue for me. Not saying it will for you, but it's possible. Before implementing it, I'd take out the Delete action at the end and then just look through the other outputs and see if it worked.

    • @moathaladwan6594
      @moathaladwan6594 6 months ago

      @@alvi_universe Have you found a solution to this, or a reason why it may be taking the most recent and not the oldest record? I'm also looking to do that.

    • @moathaladwan6594
      @moathaladwan6594 6 months ago

      @@dangage8819 Do you happen to have another fix for this?

  • @junedahmed1469
    @junedahmed1469 10 months ago

    Hello, I have created a scheduled flow that runs monthly. I have two Excel files in SharePoint, and I created the flow so that it pulls data from one Excel file (let's say Excel 1) and pastes the data into the other Excel sheet (Excel 2). So my flow is: Recurrence > List rows in a table > Add a row into a table. For the very first month (January) it runs well; however, when my flow runs for February, it also pulls January's data again along with February's data. How can I stop getting the data that was already pulled by the previous run? Please help me out.

    • @DamoBird365
      @DamoBird365  10 months ago

      You would need to filter. Not something I have a video for exactly. But either filter from list rows (preferably) or separately by filter array.

    • @JunedAhmed-rs2hb
      @JunedAhmed-rs2hb 10 months ago

      @@DamoBird365 Thank you so much for your prompt reply. I tried to filter; unfortunately it did not work for me. Maybe I did it wrong. Is there any way you can help me with this, please? I am just stuck on it.
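
      A hedged sketch of the Filter array route (the 'Date' column name and the current-month logic are assumptions about the workbook, not from this thread): run Filter array over the List rows output with a condition such as

      @greaterOrEquals(item()?['Date'], startOfMonth(utcNow()))

      so each monthly run only copies rows dated in the current month rather than re-adding everything the previous run already pulled. This assumes the Excel dates come through in ISO 8601 format.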

  • @PremiumUnleadedOutdoors
    @PremiumUnleadedOutdoors 1 year ago

    What if you wanted it to search for duplicates across the whole list? If the data isn’t that clean, will looking just one row below reach far enough?

    • @DamoBird365
      @DamoBird365  1 year ago

      If you cannot sort your data like my demo, you could use xpath to search? I demonstrate some xpath techniques in the following video ruclips.net/video/EOmsT9KcWxc/видео.html.

  • @ruchiaggarwal8624
    @ruchiaggarwal8624 8 months ago +1

    The map expression in Select 2 is coming up as invalid and isn’t working for me. Please help.

    • @DamoBird365
      @DamoBird365  8 months ago

      You could post further details on the forum powerusers.microsoft.com/

  • @shomari169
    @shomari169 1 year ago +1

    Can you please explain the subtract bit @11:22 in layman's terms for me please? I do understand it, but at the same time I don't lol

    • @DamoBird365
      @DamoBird365  1 year ago +1

      Sure. ["object1","object2","object3"] …?[2] = object3, …?[1] = object2. Then if item() = 2, [item()] = [2] and [sub(item(),1)] = [1].
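
      Spelled out a little further, assuming the sorted array sits in a Compose action and Select 2 runs over the range of indices so that item() is the current index:

      outputs('Compose')?[item()]           the current row, e.g. index 2 = "object3"
      outputs('Compose')?[sub(item(),1)]    the row above, index 1 = "object2"

      Comparing the key of the current row with the key of the row directly above it is what flags the current row as a duplicate.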

    • @shomari169
      @shomari169 1 year ago +1

      @@DamoBird365 thank you!!

    • @AdiCristea
      @AdiCristea 1 year ago +1

      @@DamoBird365 But then how does it not error on the first item, considering there is no item[-1] to compare it with?

    • @DamoBird365
      @DamoBird365  1 year ago +1

      Good spot, ?[-1] returns null, remove the ? and you get an error.

    • @AdiCristea
      @AdiCristea 1 year ago +1

      @@DamoBird365 you're right, the ? makes all the difference, thank you for replying

  • @samlewis1541
    @samlewis1541 1 year ago

    Hello,
    Great video. I've followed each step but I'm getting an error on the 2nd Select. I'd really appreciate it if you could point me in the right direction.
    The execution of template action 'Select_2' failed: The evaluation of 'query' action 'where' expression '@addProperty(outputs('Compose')?[item()],'IsDuplicate', if(equals(outputs('Compose')?[item()]?['concat'], outputs('Compose')?[sub(item(),1)]?['concat']),'T','F'))' failed: 'The template language expression 'addProperty(outputs('Compose')?[item()],'IsDuplicate', if(equals(outputs('Compose')?[item()]?['concat'], outputs('Compose')?[sub(item(),1)]?['concat']),'T','F'))' cannot be evaluated because property '0' cannot be selected. Property selection is not supported on values of type 'String'.

    • @DamoBird365
      @DamoBird365  1 year ago

      I guess you’ve used an invalid array in the compose and it’s being treated as a string.

    • @samlewis1541
      @samlewis1541 1 year ago

      @@DamoBird365 Thanks for the reply, it's the same as yours in the video. Although when I type 'sort' in the expression field of the compose it doesn't appear as an option (like it does for you); the only option I get is 'SortBy'. I can still use the expression sort(body('Select'),'SortBy'), but interestingly it only pulls back one value rather than the 100+ in the first Select.

    • @DamoBird365
      @DamoBird365  1 year ago

      Sort is the correct expression, here are the docs learn.microsoft.com/en-us/azure/logic-apps/workflow-definition-language-functions-reference#sort. I can’t explain your experience and maybe it’s worth sharing a screenshot on the forum.
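
      For reference, a minimal sort sketch, assuming the first Select emits objects with a 'SortBy' property as described above:

      sort(body('Select'), 'SortBy')

      sort() returns the whole array reordered by that property and does not change the number of items, which suggests a single-value result comes from the input to the expression rather than the expression itself.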

    • @samlewis1541
      @samlewis1541 1 year ago

      @@DamoBird365 Thanks for your help. I deleted the compose step, re-added it, and that fixed the issue.