Seen many videos on Web Scraping in Power Automate and yours is the best by far! Thank you this helped me tremendously! I look forward to watching more of your tutorials.
Thank you Matthew, I'm happy to hear my videos are of value to you and definitely planning to release more :D
Thank you for your tutorial; it works on paginated websites. Is there a workaround for "Load More" buttons?
I don't know what I did exactly, but it works. Thank you very much!
Glad your automation works! Happy coding 😀
OMG, I don't know how to thank you Sir!
You just helped me solve the first step for my final year project.... Thanks a lot.... keep making helpful videos. Thanks so much
Glad to hear my video was helpful for you! Good luck with your final year project 🚀
Thank you very much, sir. This video gives me an idea of how to scrape a website with click links. Appreciate it, man :)
Thank you for your comment, glad my video is helpful for you 😀
This is gold, thank you.
Thank you Eddie, happy to hear my videos are helpful for you 😀
Thank you. Appreciate this!!
Thank you for your comment Ben, happy automation! 🚀
Great video! Thanks so much.
Thank you for your comment Eladio! Glad my video was useful for you. Happy automation!
Hi, I am trying to extract an HTML table from the links, but 'Extract data' throws an error: failed to extract data. Can you please help me resolve this?
Thank you a million zillion times.
Thank you 🙏 glad my video was useful for you 😀
Great video! Is there a way to scrape data from all the subpages? For example, if you open Laptop -> then click on ProBook -> scrape the data, then go to the next item and do the same?
Yes, you can automate scraping through subpages! One way is to create a loop that clicks on each category (like Laptop -> ProBook), scrapes the data, and then moves to the next item. You can achieve this using Selenium to handle the clicks and BeautifulSoup to extract the data from each page. This approach allows you to navigate through multiple subpages and collect the information in a structured way. Let me know if you need help with the code!
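A minimal sketch of that loop, assuming Selenium with Chrome; the URL and the .category-link / .product selectors are placeholders, not the demo site's real ones:

```python
from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com")  # placeholder for the shop URL

# Collect the category links up front, then visit them one by one;
# grabbing the hrefs first avoids stale elements after navigation
category_urls = [a.get_attribute("href")
                 for a in driver.find_elements(By.CSS_SELECTOR, ".category-link")]

rows = []
for url in category_urls:
    driver.get(url)  # e.g. Laptop -> ProBook
    soup = BeautifulSoup(driver.page_source, "html.parser")
    for product in soup.select(".product"):  # hypothetical selector
        rows.append(product.get_text(strip=True))

driver.quit()
print(rows)
```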
Excellent!
Thank you for your comment Chris! Glad my video was useful for you 😀
Thank you so much for your video! I tried the same technique, but the extraction stops and goes to the website's privacy page. Not sure if this is caused by the web scraper or the code. Could you please help me with this matter?
Hi, you can share your code / screenshots on my Discord and I'm happy to help you: discord.gg/WHJWFNDXXX
Hi! Great vid! What would you recommend for a website with multiple pages? The site automatically refreshes every 5 or 6 minutes. How do I continue from the last one?
Hi, sometimes you can use the URL to navigate. The structure should look something like webpage.com/page/1 - in this case you can increase the page number automatically.
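If you want to prototype that outside Power Automate, a short Python sketch of the same idea (webpage.com/page/1 is the placeholder pattern from the reply above):

```python
import requests
from bs4 import BeautifulSoup

for page in range(1, 11):  # pages 1..10; adjust to the real page count
    url = f"https://webpage.com/page/{page}"  # placeholder URL pattern
    resp = requests.get(url, timeout=10)
    if resp.status_code != 200:
        break  # stop once the next page no longer exists
    soup = BeautifulSoup(resp.text, "html.parser")
    print(page, soup.title.string if soup.title else "(no title)")
```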
Thank you for this great video. Can I extract information from different websites for the same task and collect it in a single Excel file? Thank you so much
Hi, yes you can! For example, you can loop through a list of websites, scrape each one, and store the output in the same Excel file :) Good luck!
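A minimal Python sketch of that loop, assuming pandas for the Excel output; the site URLs and the scraped field are placeholders:

```python
import pandas as pd
import requests
from bs4 import BeautifulSoup

sites = ["https://site-a.example", "https://site-b.example"]  # hypothetical list

records = []
for url in sites:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # The page title is just a stand-in for whatever fields you need
    records.append({"site": url,
                    "title": soup.title.string if soup.title else ""})

# Every site ends up as a row in the same Excel file
pd.DataFrame(records).to_excel("output.xlsx", index=False)
```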
Hey, please make a video on downloading files from web table hyperlinks where you have multiple pages.
Great tutorial, but I see that in 'Extract data from web page', under Advanced, there is an option to select a paging element. Can we use that? How?
Hi, I explain how to extract individual fields from a webpage in this video at minute 28:20: ruclips.net/video/Y35ZJs16APQ/видео.html
Fantastic video. I do have a question: what would be the best way in Power Automate to pull data from the individual pages? Say in your example you have the laptop Packard 255 G2; when you click on it, it shows more details. How can I automate scraping those details from the individual pages and put them into one row in the Excel spreadsheet?
The easiest way is to first use data scraping as shown in the video and download all individual rows. Then loop through the Excel file with a For Each, visit every product page, and extract the specific data you can only find there.
@TomsTechAcademy thanks! Would love to see you pull that together. Cheers!
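For anyone who wants to prototype that second pass in Python instead, a rough sketch, assuming the first pass produced a products.xlsx with a 'url' column and that the detail pages have a hypothetical .description element:

```python
import pandas as pd
import requests
from bs4 import BeautifulSoup

df = pd.read_excel("products.xlsx")  # output of the first scraping pass

details = []
for url in df["url"]:  # the "For Each" over the individual product pages
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    node = soup.select_one(".description")  # hypothetical selector
    details.append(node.get_text(strip=True) if node else "")

df["description"] = details  # extra details land in the same row per product
df.to_excel("products_enriched.xlsx", index=False)
```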
I am trying to scrape my corporate site to Excel. My issue is that after the loop ends at page 279, it keeps copying the last page and throws an error. Can you help?
It sounds like your loop is stuck at the last page, causing it to keep copying the same data. One way to fix this is to add a condition that breaks the loop once you've reached the last page: check whether the 'Next' button is disabled or absent, or whether the current page number matches the total number of pages (279 in your case). Once the loop detects that it's at the last page, it should stop. I can do such tasks in Python, which is more versatile for handling conditions like this. Let me know if you need help.
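The same break condition as a Python/Selenium sketch; locating the 'Next' button by link text is an assumption that depends on the site's markup:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.common.exceptions import NoSuchElementException

driver = webdriver.Chrome()
driver.get("https://example.com/page/1")  # placeholder URL

while True:
    # ... scrape the current page here ...
    try:
        next_btn = driver.find_element(By.LINK_TEXT, "Next")
    except NoSuchElementException:
        break  # 'Next' is absent: we reached the last page
    if not next_btn.is_enabled():
        break  # 'Next' is disabled: same conclusion
    next_btn.click()

driver.quit()
```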
Thanks for this. I'm not winning with the next page: the selector captures Anchor 2 as an ordinal rather than by text (16:22). @matthewsheehan, any ideas?
Thank you
Thank you for your comment, happy automation!
Do a video on selecting multiple checkboxes on a web page using Power Automate Desktop.
Thank you for your comment. No such video is planned short term, though I will take it into consideration :)
I have a problem extracting data from the web. The data I want doesn't get copied into Excel in the right format.
Thank you for your comment, if you can be more specific I can try to help you 😀
Hi Thomas, thanks so much for the great video! I'm actually thinking one step further. Is it also possible to have the 'category' field repeated every time? So that your columns read: Category = laptop, Product name = A, Category = laptop, Product name = B, etc.? How can I indicate that this same element should also be shown with every selection? That seems easier when you have multiple categories that can change over time, with new ones being added or names changing. Greetings, and keep it up!
Hi Christianne,
Thanks for your question! Please join my Discord channel, where I can answer you more easily and also share screenshots etc.: discord.gg/a4qUrRuZ
It was a good video, but could you make a more detailed one with a full flow? It felt a bit haphazardly done, so I thought it was a bit confusing for a noob.
It's so slow, took 30+ minutes to scrape 500 pages
Power Automate Desktop is an easy low-code tool that helps you scrape data without programming knowledge. If you're looking for a technology that is faster and more robust, I would advise Python + BeautifulSoup.
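For comparison, a minimal requests + BeautifulSoup example; example.com and the .price selector are placeholders for the real site:

```python
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com", timeout=10)  # placeholder URL
soup = BeautifulSoup(resp.text, "html.parser")

# Plain HTTP + parsing is much faster than driving a browser,
# since no page rendering or UI clicks are involved
for price in soup.select(".price"):  # hypothetical selector
    print(price.get_text(strip=True))
```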
Thank you for your video. I have a similar project: copy receipt and transaction data from the Lowe's website into an Excel spreadsheet daily, save the file inside a SharePoint folder, and notify the office team when the file is transferred. Let me know if I can contact you via email. Thanks