Web Scraping in Power Automate Desktop | Multiple Pages | Tutorial

  • Published: 19 Jun 2024
  • Do you want to scrape structured data from the internet with Power Automate Desktop (PAD)? In this tutorial you'll learn how to scrape data from multiple pages.
    ⭐️ The website to scrape from ⭐
    🔗 Scraping website: webscraper.io/test-sites/e-co...
    ⭐️ Skip through the video ⭐
    00:00 - Intro
    00:15 - The website to scrape from
    00:49 - Creation of an Excel file to keep track of all categories
    02:45 - Loop through the Excel file
    05:46 - How to scrape structured data with Power Automate Desktop?
    13:28 - How to scrape data from multiple pages with Power Automate Desktop?
    15:08 - How to change a selector in Power Automate Desktop?
    18:23 - How to change where Power Automate Desktop saves files?
    18:50 - Running the bot
    19:02 - Outro
    ⭐️ Follow me ⭐
    💼 LinkedIn: / thomas--janssen
    🧡 Instagram: TomsTechAcademy
    #powerautomate #powerautomatedesktop #roboticprocessautomation #webscraping

Comments • 38

  • @matthewsheehan7840
    @matthewsheehan7840 4 months ago +2

    Seen many videos on Web Scraping in Power Automate and yours is the best by far! Thank you this helped me tremendously! I look forward to watching more of your tutorials.

    • @TomsTechAcademy
      @TomsTechAcademy  3 months ago

      Thank you Matthew, I'm happy to hear my videos are of value to you and definitely planning to release more :D

  • @BenAyesu
    @BenAyesu 10 months ago +1

    Thank you. Appreciate this!!

    • @TomsTechAcademy
      @TomsTechAcademy  10 months ago +1

      Thank you for your comment Ben, happy automation! 🚀

  • @eladiohernandez6482
    @eladiohernandez6482 10 months ago +1

    Great video! Thanks so much.

    • @TomsTechAcademy
      @TomsTechAcademy  10 months ago

      Thank you for your comment Eladio! Glad my video was useful for you. Happy automation!

  • @geeksupportau
    @geeksupportau 5 months ago +1

    This is gold, thank you.

    • @TomsTechAcademy
      @TomsTechAcademy  5 months ago

      Thank you Eddie, happy to hear my videos are helpful for you 😀

  • @user-yk6bd5tz9x
    @user-yk6bd5tz9x 8 months ago +2

    OMG, I don't know how to thank you, Sir!
    You just helped me solve the first step for my final year project. Thanks a lot, keep making helpful videos. Thanks so much!

    • @TomsTechAcademy
      @TomsTechAcademy  8 months ago

      Glad to hear my video was helpful for you! Good luck with your final year project 🚀

  • @chrisford7351
    @chrisford7351 8 months ago +1

    Excellent!

    • @TomsTechAcademy
      @TomsTechAcademy  8 months ago

      Thank you for your comment Chris! Glad my video was useful for you 😀

  • @swarnpriyaswarn
    @swarnpriyaswarn 8 months ago +1

    Thank you a million zillion times.

    • @TomsTechAcademy
      @TomsTechAcademy  8 months ago

      Thank you 🙏 glad my video was useful for you 😀

  • @mariavlogs327
    @mariavlogs327 2 months ago +1

    Thank you for your tutorial, it works on paginated websites. Is there a workaround for "Load More" buttons?

  • @mochannel2482
    @mochannel2482 5 months ago +1

    Thank you

    • @TomsTechAcademy
      @TomsTechAcademy  5 months ago +1

      Thank you for your comment, happy automation!

  • @rahatrahman3070
    @rahatrahman3070 27 days ago

    Hey, please make a video on downloading files from web table hyperlinks where you have multiple pages.

  • @rockero160
    @rockero160 3 months ago +1

    Hi! Great vid! What would you recommend for a site with multiple pages? The site automatically refreshes every 5 or 6 minutes. How do I continue from the last one?

    • @TomsTechAcademy
      @TomsTechAcademy  3 months ago

      Hi, sometimes you can use the URL to navigate. The structure should look something like webpage.com/page/1; in that case you can increase the page number automatically.
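      The URL-increment trick from this reply can be sketched in a few lines of Python; the base URL pattern here is a placeholder, not a real site:

      ```python
      # Build paginated URLs by incrementing the page number in the path.
      # BASE is a hypothetical pattern; substitute the real site's URL structure.
      BASE = "https://webpage.com/page/{}"

      def paginated_urls(last_page):
          """Return the URL for every page from 1 up to last_page."""
          return [BASE.format(n) for n in range(1, last_page + 1)]

      urls = paginated_urls(3)
      print(urls[0])   # https://webpage.com/page/1
      print(urls[-1])  # https://webpage.com/page/3
      ```

      In PAD the same idea maps to a Loop action with a counter variable spliced into the "Go to web page" URL.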

  • @damla7556
    @damla7556 6 months ago

    Thank you for this great video. Can I extract information from different websites for the same task and collect it in a single Excel file? Thank you so much.

    • @TomsTechAcademy
      @TomsTechAcademy  6 months ago +1

      Hi, yes you can! For example, you can loop through a list of websites and scrape every one of them and store the output in the same Excel file :) good luck!
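      The loop-and-collect pattern described in this reply looks roughly like the sketch below; `scrape_site` is a hypothetical placeholder, and the output is written as CSV (which Excel opens) to keep the example self-contained:

      ```python
      import csv

      # Hypothetical scraper: a real implementation would fetch and parse each site.
      def scrape_site(url):
          return [{"site": url, "product": "example", "price": "0.00"}]

      def scrape_all(urls, out_path):
          """Loop through a list of websites and append every result
          to one CSV file that Excel can open."""
          with open(out_path, "w", newline="", encoding="utf-8") as f:
              writer = csv.DictWriter(f, fieldnames=["site", "product", "price"])
              writer.writeheader()
              for url in urls:              # one pass per website
                  for row in scrape_site(url):
                      writer.writerow(row)  # all sites land in the same file

      scrape_all(["https://site-a.example", "https://site-b.example"], "output.csv")
      ```

      In PAD the equivalent is a For Each over the list of URLs with a single "Write to Excel worksheet" action appending below the last used row.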

  • @stricklerville
    @stricklerville 8 months ago

    Fantastic video. I do have a question: what would be the best way in Power Automate to pull data from the individual pages? For example, you have the Laptop Packard 255 G2; when you click on it, it has more details. How can I automate scraping those details from the individual pages and put them into one row in the Excel spreadsheet?

    • @TomsTechAcademy
      @TomsTechAcademy  8 months ago +1

      The easiest way is to first use data scraping as shown in the video and download all individual rows. Then you loop through the Excel file with a For Each, visit every page, and extract the specific product data you can only find on the product page.
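      The two-phase approach from this reply can be outlined in plain Python; both helper functions are hypothetical stand-ins for the listing scrape and the per-page extraction:

      ```python
      # Phase 1 collects one row per product with its detail-page URL;
      # phase 2 visits each URL and merges the extra fields into the same row.

      def scrape_listing():
          # Placeholder for the bulk "data scraping" step over the listing pages.
          return [{"name": "Packard 255 G2", "url": "/product/1"}]

      def scrape_detail(url):
          # Placeholder for the per-page extraction step.
          return {"description": "15.6 inch laptop", "reviews": 2}

      def scrape_full_catalog():
          rows = scrape_listing()       # phase 1: all listing rows
          for row in rows:              # phase 2: enrich each row in place
              row.update(scrape_detail(row["url"]))
          return rows
      ```

      Because the detail fields are merged into the listing row, each product ends up as one combined record, matching the "one row per product" goal in the question.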

    • @flyingrock6381
      @flyingrock6381 7 months ago

      @TomsTechAcademy thanks! Would love to see you pull that together. Cheers!

  • @ReetardM8
    @ReetardM8 3 months ago

    great example, thank u! what is the solution for infinite scrolling rather than clicking on the web page numbers?

    • @TomsTechAcademy
      @TomsTechAcademy  10 days ago

      Hi, I just released a video about web scraping from websites with infinite scrolling: ruclips.net/video/lEnWgRfCN4o/видео.html
      Hope it's helpful for you 😊

  • @praveenasrinivas445
    @praveenasrinivas445 3 months ago

    Hi, I am trying to extract an HTML table from the links. Extract Data throws an error: "failed to extract data". Can you please help me resolve this?

  • @virtumind
    @virtumind 7 months ago

    Great tutorial, but I see that in Extract Data From Web Page, under Advanced, there is an option to select a paging element. Can we use that? How?

    • @TomsTechAcademy
      @TomsTechAcademy  7 months ago

      Hi, I explain how to extract individual fields from a webpage in this video at 28:20: ruclips.net/video/Y35ZJs16APQ/видео.html

  • @christiannebloem
    @christiannebloem 2 months ago

    Hi Thomas, thanks so much for the great video! I'm actually thinking one step further. Is it also possible to have the 'category' field repeat each time? So that your columns read: Category = laptop, Product name = A; Category = laptop, Product name = B; etc.? How can I specify that this same element is included with every selection? That seems easier when you have multiple categories that can change over time, with new ones added or names changing. Best regards, and keep it up!

    • @TomsTechAcademy
      @TomsTechAcademy  2 months ago

      Hi Christianne,
      Thanks for your question! Please join my Discord channel; there I can answer you more easily and share screenshots etc.: discord.gg/a4qUrRuZ

  • @vigneshvangala2235
    @vigneshvangala2235 10 months ago +1

    Please do a video on selecting multiple checkboxes on a web page using Power Automate Desktop.

    • @TomsTechAcademy
      @TomsTechAcademy  10 months ago

      Thank you for your comment, no such video planned short term, though I will take it into consideration :)

  • @Ramkrishna2400
    @Ramkrishna2400 3 months ago

    I have a problem extracting from the web. The data I want doesn't get copied into Excel in the right format.

    • @TomsTechAcademy
      @TomsTechAcademy  3 months ago

      Thank you for your comment, if you can be more specific I can try to help you 😀

  • @sayeu
    @sayeu 9 months ago

    It's so slow; it took 30+ minutes to scrape 500 pages.

    • @TomsTechAcademy
      @TomsTechAcademy  8 months ago

      Power Automate Desktop is an easy low-code tool that helps you scrape data without programming knowledge. If you're looking for a technology that is faster and more robust, I would advise Python + BeautifulSoup.
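      To illustrate the Python + BeautifulSoup route mentioned in this reply, here is a minimal sketch that parses an inline HTML snippet shaped like the webscraper.io test pages (the class names mirror that site; a real scraper would first download the page, e.g. with the requests library):

      ```python
      from bs4 import BeautifulSoup

      # Inline snippet standing in for a downloaded product-listing page.
      html = """
      <div class="thumbnail">
        <a class="title" href="/product/1">Packard 255 G2</a>
        <h4 class="price">$416.99</h4>
      </div>
      """

      soup = BeautifulSoup(html, "html.parser")
      # One dict per product card, pulling the title and price by CSS selector.
      products = [
          {"title": item.select_one("a.title").get_text(strip=True),
           "price": item.select_one("h4.price").get_text(strip=True)}
          for item in soup.select("div.thumbnail")
      ]
      print(products)  # [{'title': 'Packard 255 G2', 'price': '$416.99'}]
      ```

      Since this skips the browser entirely, a run over hundreds of pages is typically far faster than driving a UI, which is the trade-off the reply is pointing at.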