Web Scraping in Power Automate Desktop | Multiple Pages | Tutorial

  • Published: 6 Jan 2025

Comments • 51

  • @matthewsheehan7840
    @matthewsheehan7840 10 months ago +2

    I've seen many videos on web scraping in Power Automate, and yours is the best by far! Thank you, this helped me tremendously! I look forward to watching more of your tutorials.

    • @TomsTechAcademy
      @TomsTechAcademy  10 months ago

      Thank you Matthew, I'm happy to hear my videos are of value to you and definitely planning to release more :D

  • @mariavlogs327
    @mariavlogs327 9 months ago +1

    Thank you for your tutorial, it works on paginated websites. Is there a workaround for "Load More" buttons?

  • @bubblesoccerch5509
    @bubblesoccerch5509 2 months ago +1

    I don't know what I did exactly, but it works. Thank you very much!

    • @TomsTechAcademy
      @TomsTechAcademy  2 months ago +1

      Glad your automation works! Happy coding 😀

  • @MrsDeveloper-n8n
    @MrsDeveloper-n8n 1 year ago +2

    OMG, I don't know how to thank you, Sir!
    You just helped me solve the first step for my final year project... Thanks a lot... keep making helpful videos. Thanks so much!

    • @TomsTechAcademy
      @TomsTechAcademy  1 year ago

      Glad to hear my video was helpful for you! Good luck with your final year project 🚀

  • @muhammadbariakhdan2859
    @muhammadbariakhdan2859 5 months ago +1

    Thank you very much, sir. This video gives me an idea of how to scrape a website with click-through links. Appreciate it, man :)

    • @TomsTechAcademy
      @TomsTechAcademy  5 months ago

      Thank you for your comment, glad my video is helpful for you 😀

  • @geeksupportau
    @geeksupportau 11 months ago +1

    This is gold, thank you.

    • @TomsTechAcademy
      @TomsTechAcademy  11 months ago

      Thank you Eddie, happy to hear my videos are helpful for you 😀

  • @BenAyesu
    @BenAyesu 1 year ago +1

    Thank you. Appreciate this!!

    • @TomsTechAcademy
      @TomsTechAcademy  1 year ago +1

      Thank you for your comment Ben, happy automation! 🚀

  • @eladiohernandez6482
    @eladiohernandez6482 1 year ago +1

    Great video! Thanks so much.

    • @TomsTechAcademy
      @TomsTechAcademy  1 year ago

      Thank you for your comment Eladio! Glad my video was useful for you. Happy automation!

  • @praveenasrinivas445
    @praveenasrinivas445 9 months ago +1

    Hi, I am trying to extract an HTML table from the links. Extract Data throws an error, "failed to extract data". Can you please help me resolve this?

  • @swarnpriyaswarn
    @swarnpriyaswarn 1 year ago +1

    Thank you a million zillion times.

    • @TomsTechAcademy
      @TomsTechAcademy  1 year ago

      Thank you 🙏 glad my video was useful for you 😀

  • @solomidmid9266
    @solomidmid9266 2 months ago

    Great video! Is there a way to scrape data from all the subpages? For example, if you open Laptop -> then click on ProBook -> scrape the data, then go to the next item and do the same?

    • @webscrapingseniors
      @webscrapingseniors 2 months ago +1

      Yes, you can automate scraping through subpages! One way is to create a loop that clicks on each category (like Laptop -> ProBook), scrapes the data, and then moves to the next item. You can achieve this using Selenium to handle the clicks and BeautifulSoup to extract the data from each page. This approach allows you to navigate through multiple subpages and collect the information in a structured way. Let me know if you need help with the code!
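
      A minimal sketch of that subpage loop, assuming a hypothetical start URL, a Chrome driver, and placeholder selectors (`.category a`, `product-title`) that you would adapt to the real site. The reply suggests BeautifulSoup for parsing; this sketch swaps in the stdlib `html.parser` so the parsing helper runs without extra installs, but the structure is the same:

```python
# Sketch: Selenium navigates through each category/subpage, and a small
# stdlib parser pulls product titles out of each page's HTML.
from html.parser import HTMLParser

class ProductParser(HTMLParser):
    """Collect the text of every element with class 'product-title' (hypothetical)."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if ("class", "product-title") in attrs:   # hypothetical class name
            self.in_title = True

    def handle_data(self, data):
        if self.in_title and data.strip():
            self.titles.append(data.strip())
            self.in_title = False

def parse_titles(html):
    parser = ProductParser()
    parser.feed(html)
    return parser.titles

def scrape_subpages(start_url):
    # Selenium is imported here so the parsing helper stays usable without a browser.
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    driver = webdriver.Chrome()
    driver.get(start_url)
    # Collect subpage links first (e.g. Laptop -> ProBook), then visit each one.
    links = driver.find_elements(By.CSS_SELECTOR, ".category a")  # hypothetical selector
    urls = [a.get_attribute("href") for a in links]
    results = []
    for url in urls:
        driver.get(url)
        results.extend(parse_titles(driver.page_source))
    driver.quit()
    return results
```

      Collecting the URLs first and visiting each one directly avoids stale-element errors that occur when you click a link and then try to reuse elements found on the previous page.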

  • @chrisford7351
    @chrisford7351 1 year ago +1

    Excellent!

    • @TomsTechAcademy
      @TomsTechAcademy  1 year ago

      Thank you for your comment Chris! Glad my video was useful for you 😀

  • @asom1997
    @asom1997 6 months ago +1

    Thank you so much for your video! I tried the same technique, but the extraction stops and goes to the website's privacy page. I'm not sure if this is caused by the web scraper or the code. Could you please help me with this matter?

    • @TomsTechAcademy
      @TomsTechAcademy  6 months ago

      Hi, you can share your code / screenshots on my discord and I’m happy to help you; discord.gg/WHJWFNDXXX

  • @rockero160
    @rockero160 9 months ago +1

    Hi! Great vid! What would you recommend for a site with multiple pages? The site automatically refreshes every 5 or 6 minutes. How do I continue from the last page?

    • @TomsTechAcademy
      @TomsTechAcademy  9 months ago

      Hi, sometimes you can use the URL to navigate. The structure should look something like webpage.com/page/1 - in this case you can increase the value of 1 automatically.
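
      That URL trick can be sketched in Python; the webpage.com/page/N pattern is an assumption taken from the reply, so check your site's actual URL scheme. Rebuilding the remaining page URLs from a counter lets a run resume after the site refreshes:

```python
# Sketch: resume pagination by rebuilding page URLs from a counter.
# The /page/N pattern is an assumption - verify it against the real site.
def page_url(base, page):
    """URL of a single numbered page, e.g. webpage.com/page/3."""
    return f"{base}/page/{page}"

def remaining_urls(base, last_done, total):
    """Pages still to visit after a run stopped at page `last_done`."""
    return [page_url(base, p) for p in range(last_done + 1, total + 1)]
```

      Persist `last_done` somewhere durable (e.g. a text file or a cell in your output sheet) after each page, so the next run knows where to pick up.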

  • @damla7556
    @damla7556 1 year ago

    Thank you for this great video. Can I extract information from different websites for the same task and collect it in a single Excel file? Thank you so much!

    • @TomsTechAcademy
      @TomsTechAcademy  1 year ago +1

      Hi, yes you can! For example, you can loop through a list of websites and scrape every one of them and store the output in the same Excel file :) good luck!
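
      The same loop-and-append idea, sketched in Python: `fetch_rows` is a hypothetical stand-in for whatever per-site extraction you use, and the output here is a single CSV file, which Excel opens directly:

```python
import csv

def write_combined(path, sites, fetch_rows):
    """Scrape every site in one loop and append all rows to one file."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["site", "product", "price"])   # one shared header
        for site in sites:                              # one pass per website
            for product, price in fetch_rows(site):
                writer.writerow([site, product, price])
```

      Tagging each row with its source site keeps the combined sheet traceable once several websites feed the same file.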

  • @rahatrahman3070
    @rahatrahman3070 7 months ago

    Hey, please make a video on downloading files from web-table hyperlinks where you have multiple pages.

  • @virtumind
    @virtumind 1 year ago

    Great tutorial, but I see that in 'Extract data from web page', under Advanced, there is an option to select a paging element. Can we use that? How?

    • @TomsTechAcademy
      @TomsTechAcademy  1 year ago

      Hi, I explain how to extract individual fields from a webpage in this video at minute 28:20 : ruclips.net/video/Y35ZJs16APQ/видео.html

  • @stricklerville
    @stricklerville 1 year ago

    Fantastic video. I do have a question: what would be the best way in Power Automate to pull data from the individual pages? Say, in your example, you have the Laptop Packard 255 G2; when you click on it, it shows more details. How can I automate scraping those details from the individual pages and put them into one row in the Excel spreadsheet?

    • @TomsTechAcademy
      @TomsTechAcademy  1 year ago +1

      The easiest way is to first use data scraping as shown in the video and download all the individual rows. Then you loop through the Excel file with a For Each, visit every page, and extract the specific product data you can only find on the product page.

    • @flyingrock6381
      @flyingrock6381 1 year ago

      @TomsTechAcademy thanks! Would love to see you pull that together. Cheers!

  • @JamesOrtega-vr8zn
    @JamesOrtega-vr8zn 5 months ago

    I am trying to scrape my corporate site to Excel. My issue is that after the loop ends at page 279, it keeps copying the last page with an error. Can you help?

    • @webscrapingseniors
      @webscrapingseniors 2 months ago

      It sounds like your loop is stuck at the last page, causing it to keep copying the same data. One way to fix this is to add a condition that breaks the loop once you've reached the last page. You can check whether the 'Next' button is disabled or absent, or whether the current page number matches the total number of pages (279 in your case).
      Once the loop detects that it's at the last page, it should stop. I can do such tasks in Python, which is more versatile for handling conditions like this. Let me know if you need help.
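
      That stop condition can be sketched in Python as follows; `scrape_page` and `next_button_enabled` are hypothetical callables standing in for your own extraction step and 'Next'-button check:

```python
def scrape_all(scrape_page, next_button_enabled, total_pages=279):
    """Scrape pages until the known total or a disabled 'Next' button."""
    results = []
    page = 1
    while True:
        results.extend(scrape_page(page))
        # Break before advancing, so the last page is never copied twice.
        if page >= total_pages or not next_button_enabled():
            break
        page += 1
    return results
```

      Checking the exit condition after scraping but before advancing is what prevents the "keeps copying the last page" symptom described above.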

  • @trisha3178
    @trisha3178 4 months ago

    Thanks for this. Not winning with the next page, as it has anchor 2 as an ordinal, not text (16:22). @matthewsheehan??

  • @mochannel2482
    @mochannel2482 11 months ago +1

    Thank you

    • @TomsTechAcademy
      @TomsTechAcademy  11 months ago +1

      Thank you for your comment, happy automation!

  • @vigneshvangala2235
    @vigneshvangala2235 1 year ago +1

    Do a video on selecting multiple checkboxes on a web page using Power Automate Desktop.

    • @TomsTechAcademy
      @TomsTechAcademy  1 year ago

      Thank you for your comment. No such video is planned short-term, though I will take it into consideration :)

  • @Ramkrishna2400
    @Ramkrishna2400 9 months ago

    I have a problem extracting from the web. The data I want doesn't get copied into Excel in the right format.

    • @TomsTechAcademy
      @TomsTechAcademy  9 months ago

      Thank you for your comment, if you can be more specific I can try to help you 😀

  • @christiannebloem
    @christiannebloem 9 months ago

    Hi Thomas, thanks so much for the great video! I'm actually thinking one step further. Is it also possible to have the 'category' field repeat each time, so that your columns read: Category = laptop, Product name = A, Category = laptop, Product name = B, etc.? How can I specify that this same element is included with every selection? That seems easier when you have multiple categories that can change over time, with new ones being added or names changing. Cheers, and keep it up!

    • @TomsTechAcademy
      @TomsTechAcademy  9 months ago

      Hi Christianne,
      Thanks for your question! Please join my Discord channel; there I can answer you more easily and also share screenshots etc.: discord.gg/a4qUrRuZ

  • @tempz4u
    @tempz4u 5 months ago

    It was a good video, but could you make it in more detail, with a flow? It felt a bit haphazardly done, so it's a bit confusing for a noob.

  • @sayeu
    @sayeu 1 year ago

    It's so slow, took 30+ minutes to scrape 500 pages

    • @TomsTechAcademy
      @TomsTechAcademy  1 year ago

      Power Automate Desktop is an easy low-code tool that helps you scrape data without programming knowledge. If you're looking for a technology that is faster and more robust, I would advise Python + BeautifulSoup.

  • @john318john
    @john318john 3 months ago

    Thank you for your video. I have a similar project: copy receipt and transaction data from the Lowes website into an Excel spreadsheet daily, save the file in a SharePoint folder, and notify the office team when the file is transferred. Let me know if I can contact you via email. Thanks.