That is absolutely brilliant!!! I have spent the last two days trying to figure out how the do it in Python and within 8 minutes you showed me a much easier straight forward way.
:-) so pleased it was helpful, Robert!
@@MyOnlineTrainingHub Can I download an e-paper as a PDF without coding?
lol... literally me too. I got quite far until Python was reading Arabic webpages in hex, and then I threw my laptop out the window!
Wow, what an astonishing concept and how wonderfully well you've explained it.
I've just applied it in Excel PQ to call an API over and over with a number of variables including a date that changes for each iteration, returning JSON data that is then transformed and presented in a pivot table. Thank you Mynda, xxx
Thank you! So pleased this video was helpful.
Thank you so much for this video. Very practical for my Data Analyst journey. I followed the steps and didn't run into any errors.
So pleased you found it helpful!
I watched this video on Teachers' Day, and I believe you are one of the best teachers to help me with web scraping... 🤗
Wow, thank you!
It's like you have read my mind because I was looking to scrape data from web like this currently. Thanks for the tutorial it's really helpful.
My pleasure, Abdul!
A bit beyond me at this point Mynda, Power Query is on my "to learn" list. Well presented.
Thanks, Dave! Power Query is amazing...I'm confident you'll think so too :-)
To think I was doing this manually 🤦🏽♂️. Thank you, this is a huge time saver!
Great to hear, Geoffrey!
Thanks Mynda, there is no way that I would not like this video. It's awesome.
Thanks so much, Awesh! And thanks for sharing it on LinkedIn :-)
Wow...Easily used this tutorial to query printer settings from every Zebra printer on my LAN. Very helpful!
Awesome to hear, David!
I slightly adjusted this to scrape data from a folder full of PDF files. Excellent thanks!
Glad it helped!
Thank you!
It is hard to study in Korea because there are not many resources about Power Query.
Thanks to this, I integrated several POST APIs into a single query.
Pleased I could help!
I wasn't even aware that M/Power Query can be used to such extent. Thank you for the great insight!
Glad you enjoyed it, James!
Great video, ma'am!!! I was doing this before with Python, then saving to CSV, then importing into PBI. Now I can do it with PBI directly 👏👏👏
Wow, that's fantastic to hear :-)
This is an amazing way of working with web pages. I have seen people write lengthy macros and Python code for this.
Yes, Power Query is super easy to use. I wish more people knew of its powers ;-)
I knew this was possible, but ran into some errors while trying to do it on my own. Thank you very much for the great tutorial. Now to let Power Bi Spin!
Glad it helped!
spot on. I was only able to do the first page. This gives me the ability to do an entire site.
That's remarkable; this is like the limit of most people's Python learning, and most co-workers would consider them "dangerous" with those Python abilities. (In the most professional and excellent way, of course!)
:-) Glad you liked it, Mark!
Brilliantly framed and well communicated. Thank you again Mynda.
Thanks so much, Michael!
Wonderful job ! So clear and perfectly explained, thank you so much !
Glad it was helpful!
Awesome use of M for us tiptoeing into the M Script!
Glad you liked it!
Thanks a lot for this tutorial! I could get multiple API calls into a single query; best solution ever!
Glad it’ll be useful!
Clear Voice, Beautifully Explained Super-woman.
Thank you so much 🙂
Wonderful tutorial! That was exactly what I was looking for. I was duplicating data sources for each week to scrape some web data. Thanks a lot!
So pleased it helped, Fabio!
Wooww, thank you so much; it took me months to find this function. I will try it on a more complicated webpage. Thank you!
Glad you can make use of it! 😊
You are a legend! Helped me solve the greyed-out "Change data source" button.
Great to hear, Vincas!
This is simply awesome, now I have to practice this technique.
Enjoy!
I have never tried this, but I frequently convert data from a CSV file to an HTML DataTable. Thanks, Mynda.
Hope you can make use of it, Reda!
Thank you very much, madam. You made my work much easier.
I'm so glad!
Thank you very much. I'm from Turkey. Have a nice day.
Thank you! You too!
YOU ARE THE BEST!! Saved me so much work!
So pleased I could help 😊
Very helpful; a lot fewer complicated Excel formulas in my life now. A shame that challenge has gone, but I had to think a lot about my queries.
Don't be sad that the challenges have gone...there are plenty of new challenges awaiting; M code, DAX, dynamic array functions :-)
Well Done Mynda
All Your Videos Are Useful
Cheers, Shakira!
Brilliant! Many thanks, Mynda.
Cheers, Ian!
Excellent tutorial, super easy to follow. That’s brilliant 👍
Glad it was helpful! 🙏
Amazing how it is easy to scrape web pages. Thanks for this excellent tutorial.
Glad you like it, Marcel!
Wow you're amazing! Can't believe this information is free! Thank you so much!
You're most welcome, Carlton!
Great video thanks this makes web scraping a lot easier. Thank you.
Great to hear!
Wow, this is clever and exactly what I needed. My mind is blown !!
Awesome. Glad I could help 😊
Wow amazing what you can do! Thanks a lot mynda I’m always learning from you!
So pleased to hear that, Maria!
Great Video... Thanks for the efforts and sharing it. this will be very useful for many tasks...
Great to hear!
Great info, easy to understand. TYVM! I'd love to learn how to do all of this in Google Sheets. Power Query sounds cool!
Glad you liked it, Stephen! Sheets doesn't have Power Query.
Thanks a lot, that was really, really useful. You solved my very big problem. 🙏🙏🙏🙏🙏🙏
Awesome to hear! 😊
Hi Mynda! Great tutorial, just learnt something new so I can have more fun with Power BI. Thank you :)
Great to hear you found it useful, Darryl!
Exactly what I was looking for, thanks ! great video
Glad you found it helpful 😊
Marvelous! You make it so much easier, thanks a lot.
Thank you! Glad to hear that!
I really appreciate your tutorial! A money saver! Most data extraction tools are costly.
Glad it was helpful!
Mind blown. This is awesome. Thank you.
Glad you liked it 😊
Great video and so clear with the explanation! My researching will be much easier now!
So pleased it was helpful 😊
As always, an excellent tutorial
Glad you liked it!
Super cool video, thanks Mynda
Cheers, Chris!
Beautiful. It's solved my actual problem. Thx. :)
Great to hear!
Web scraping, oh I love it 😊
Great to hear :-)
Wow. Amazing video, thank you!
Thanks so much, Adam!
Thanks a lot, it was terrific. I'm from México.
Glad you enjoyed it, Arturo!
I still need to watch this video a few times. Our entire organization doesn't know this, I bet.
Glad it's helpful!
The website is no longer updated.
Nicely explained, loved it.
Thanks so much, Rakesh!
Hi! Your tutorial is very clear. However, what if the web page you are trying to access needs your credentials first? Do you know how I can go around that? Thank you!
Another great lesson. I have a website with unstructured data for many items. I need specific values for each item from the site. Please, how can I do this automatically and quickly? cftc .gov/dea/futures/deacmesf . htm
I only need the LONG and SHORT values for each code. Thanks.
Great to hear, Nazaar! The URL provided isn't right. Please post your question and sample Excel file on our forum where we can help you further: www.myonlinetraininghub.com/excel-forum
@@MyOnlineTrainingHub thanks for the quick reply. I just joined the forum. Your forum is clean and organized. Looking forward to learning more. Thanks.
Excellent work. Just a question: when I try to refresh it in the service, it isn't allowed; it says "This dataset includes a dynamic data source. Since dynamic data sources aren't refreshed in the Power BI service, this dataset won't be refreshed". Any workaround?
I don't know you, but I love you. Thanks!
Glad it was helpful!
Excellent as usual...
Thank you so much 😀
AMAZING!!! THANK YOU SO MUCH!!!
My pleasure 😊
Mynda-
@ 1:18 - Instead of the 13 HTML table options listed in your 'Navigator' dialog box, when I try to run the same Power BI query on my end, I am getting only 5 tables (Table 0 through to Table 5, and an additional Document table), and in these tables there is hardly any data to work with.
Please advise.
I get that now too. I guess the website has changed its layout. Table 3 is the one you want.
@@MyOnlineTrainingHub Thank you, Mynda.
Thank you. One more thing: my pages contain PDF files, and I need to add another column that contains a link to each PDF so it's accessible.
Thank you, Madam, for this useful info! But I need to know how I can scrape pages that have infinite scrolling using Power BI. Looking forward to your suggestion!
Hi Hari, if the pages are loading the data on the fly and the data isn't in an html table or visible on the page, then I'm not aware of a way to do that, sorry.
Many thanks for the video! What if I have two variable names? My URL includes both a year and a quarter. I created the two variable names but how do I invoke the function to take all quarters from every year?
Make a table containing the string made up of the quarter and year components and whatever other characters form that section of the URL, and feed that into a single variable.
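To sketch that idea outside Power Query (the base URL below is a hypothetical placeholder), the same "one combined variable" trick looks like this in Python:

```python
# Combine year and quarter into a single URL string per period.
# "https://example.com/report/..." is a made-up pattern for illustration.
years = [2021, 2022, 2023]
quarters = [1, 2, 3, 4]

urls = [
    f"https://example.com/report/{year}-Q{quarter}"
    for year in years
    for quarter in quarters
]
```

In Power Query you would keep those combined strings in a table column and feed that column into the function's single parameter.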
works great, thanks!
Great to hear!
Scraping with Power BI; hopefully it'll be fully enabled in Excel!
Fingers crossed, Doug!
Very useful! Thank you!
Awesome to hear!
Amazing, thank you 👌🏻👌🏻👌🏻
My pleasure, David!
This is AWESOME!!
Glad you found it helpful!
It's useful, thank you. I am looking for similar data scraper software. Would you mind showing me how to work with Power BI in the case of a different website, please?
This is top notch stuff thanks
Glad you enjoyed it :-)
The problem I have is that I want to capture the URLs and I keep getting "No CSS"
Power Query isn't designed to capture the URL, it's designed to scrape the data from the page.
Thanks a lot!! I was wondering: if the web page is updated, would the loaded data in Power BI update too (so basically, is it real time or not)?
Only DirectQuery datasets can refresh in real time; however, you can schedule refreshes at set intervals.
Amazing video and awesome ideas that I incorporated instantly! Quick question, how would you go about making each "page" into a separate query (each page a query on its own)?
Glad you liked it, Mohammed! To make each page a separate query, you'd have to create them one by one by pasting in the URL for each page, or copying the query and modifying the URL to point to a different page.
@@MyOnlineTrainingHub Thanks for your prompt reply!
Thanks so much for all your playlist video tutorials.. awesome... please add a tutorial on network or system monitoring with Excel.. many thanks.. :-)
My pleasure :-) I don't have any data I can use on network or system monitoring, sorry.
@@MyOnlineTrainingHub tysm for your feedback; no problem, many thanks..
Thank you soooo much! You changed my life this weekend. Been struggling with Excel's limitations for years, and lost countless hours of my life sometimes without even accomplishing my goal. I only discovered the existence of Power Query last night with your video, and you blew my mind. A brilliantly well presented and comprehensive video on it too! It got me partway through my current problem, but now I'm stuck again if you can help?
I've created Query1 to get multiple tables from each webpage, with 10 records each, including a record ID. But each record has a link to a details page with more info for that record. The record ID is used within the URL string to get those details. Can I create a single query that collects the list of records and uses the ID to also collect the details for each record, all in one go?
Also, with 30,000 records in total, it takes hours to refresh. However, as the historic records don't change and have a historic filing date, is there any way for future updates to only get the latest records (those with a filing date after the last date of the previous dataset), remove any duplicates, and append them to the list?
Finally, it would be great if a timestamp could be added in an additional column to denote the date when the query was run, so that I can easily see which data has been added and when. Is any of this possible with Power Query?
So pleased that my video was helpful! Please post your questions and sample Excel file on our forum where we can help you further: www.myonlinetraininghub.com/excel-forum
on point, awesome video
Glad you think so!
Great video! Extremely useful.
It works in my case, but only for the first 19 pages out of 89.
Starting from the 20th page I get a blank page without any data; however, I can see pages 20 to 89 via the browser.
I would appreciate it if you could show how many pages can be exported in your exact example.
Sounds like the web site is throttling the feed so you can't get the data. Not much you can do about this, other than try splitting the task into multiple queries and run them one at a time.
@@MyOnlineTrainingHub
Good News 1 ..... solving the throttle problem. When PQ and BI won't work, I use Instant Data Scraper. It's a free Chrome extension and works 95% of the time. It lets you set a time delay before going to the next page. I usually start at 12 seconds, then decrease the delay by 1 second every 100 pages or so, down to about 4 or 5 seconds. The most I've ever scraped is more than 40,000 pages on a website.
It scrapes only when the webpage is active, so if you navigate to a different webpage tab it pauses. To restart scraping, simply make that page active, i.e. displaying, and click Start Scraping. To prevent pausing, simply drag the webpage out to be standalone before starting IDS.
Good News 2 ..... it does something that PQ and BI don't do: it extracts embedded URLs. Say email addresses are embedded in people's names. PQ and BI will import the names (as plain text), but I've never figured out how to get them to extract the embedded email address. IDS does extract the embedded URL.
Bad News .... IDS doesn't connect to the website, so you can't "refresh" the query like you can with PQ and BI.
you are a genius👏
Thanks for your kind words, Djamel!
Can you get around captchas for more advanced stuff?
Not captchas, AFAIK.
Something I would say is important to mention: using Power Query in Excel gives a lot of problems, since it consumes a lot of RAM. I had a really bad experience with the PQ tool in Excel, even though I reduced the steps and the volume of imported data to the minimum I could. So that is something MS has got to improve, for sure.
Hi Rene, occasionally you will come up against limits like you describe, but I would say this is the exception rather than the norm. Sometimes there are settings and alternate approaches to your query's structure that can alleviate performance issues like you describe. Sometimes it's an issue with your PC's specifications e.g. not enough RAM, not 64-bit Excel etc.
Awesome! I managed to import a table for 1 page from a URL. It is a list of books; unfortunately, the number of books per web page varies. Is there a way to handle generating each page number in this case? As a backup, is there a method of exporting all pages to a CSV file and using Load & Transform to bring the CSV back into PBI or PQ?
Glad it was useful, Charles. In terms of figuring out the number of items on a page, I'm not sure there's any way to do that in advance of accessing the pages. Whether there's a way to export the pages to a csv file would be down to that website and whether it offers that as an option. It's not something Power Query can do.
Incredible overview, thank you so much! Is it possible to do this if you have a site with multiple pages that uses the same URL? I'm trying to scrape data from a public site with multiple pages, but all of them use the same URL - there are no unique identifiers (e.g. page numbers). Any assistance would be greatly appreciated.
Glad it was helpful. Unfortunately, if the site's URL doesn't change, then you can't scrape the data with Power Query.
@@MyOnlineTrainingHub Understood. Could you recommend any other options that may be helpful? Thank you in advance.
Only to say that if you know JavaScript (I don't) you can write some code to change the 'page' displayed so you can get the data.
Thanks a million.
My pleasure 😊
Hi! I have encountered a login page before the page that I need to scrape. Is there any way I can bypass the page or key in the credentials?
Thank you so much for this video. what if we have "read more" instead of page numbers ?
Power Query typically can't see 'read more' information unless it's already in the page HTML. If it's generated using JavaScript, then you can't scrape it.
Power Query can struggle with data generated by JavaScript after the initial page load. In such cases, consider using a web scraping tool like Selenium, which can handle JavaScript and interact with 'read more' buttons to load additional content. This way, you can extract all the necessary information from the page. Let me know if you need more guidance!
If we assume the bookstore extends the number of pages over time, how can I set it up so that the query will check all available pages? I cannot set it up that way, because when it checks URLs that don't exist yet, the scraping procedure stops. Is it possible to fix this somehow?
You can build in some error handling so that when it gets to a page that doesn't exist it doesn't break the code: ruclips.net/video/Iuo-iTp8aFc/видео.html
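The idea is to wrap each page fetch in an error guard so a missing page ends the loop instead of breaking the refresh. Here's a rough Python sketch of the same pattern (the fetch function is a hypothetical stand-in for the real page request; in Power Query this role is played by M's `try ... otherwise`):

```python
def scrape_until_missing(fetch_page, max_pages=1000):
    """Collect pages until one doesn't exist, instead of erroring out."""
    results = []
    for page in range(1, max_pages + 1):
        try:
            results.append(fetch_page(page))  # guarded fetch, like try ... otherwise
        except Exception:
            break  # first missing page: stop gracefully
    return results

# Demo with a fake site that only has 3 pages today:
def fake_fetch(page):
    if page > 3:
        raise LookupError("404: page not found")
    return f"books page {page}"
```

As the bookstore adds pages, the loop simply gets further before stopping, so no hard-coded page count is needed.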
Useful video. Thanks
Glad it was helpful, Ali!
Hi Mynda.. another great example and technique. Thanks for sharing it :)) Thumbs up!!
PS - Any idea when the Add Table Using Examples feature will come to Power Query in Excel in Microsoft 365?
Thanks, Wayne! No idea when Excel will get Add Table Using Examples :-( it has been available in Power BI for quite a while now, but that doesn't seem to mean anything.
Good afternoon, I followed your instructions; however, instead of producing the results from the subsequent URL pages, it just mirrored the results from the first page. Any ideas? Thanks, Steve
Hi Steve, sounds like the page number isn't being changed with each iteration. Hard to say more without seeing the file.
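A common cause, sketched here in Python with a hypothetical URL (the same logic applies to the M function): the page parameter is accepted but never actually used when building the URL, so every invocation fetches page 1.

```python
def build_url_buggy(page):
    # Bug: the parameter is ignored, so every "iteration" fetches page 1.
    return "https://example.com/list?page=1"

def build_url_fixed(page):
    # Fix: the parameter is actually interpolated into each page's URL.
    return f"https://example.com/list?page={page}"
```

Checking the generated URL for a couple of different page values is a quick way to confirm the parameter is really being applied.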
@@MyOnlineTrainingHub thanks for you help. All rectified now.
Can you do a video on how we can scrape data after logging into a portal with our credentials and then fetch the data?
I followed the instructions but my tables are returning with the same items from page one as opposed to the items on the second page. Any ideas what I may have done wrong?
Please post your question and sample Excel file on our forum where we can help you further: www.myonlinetraininghub.com/excel-forum
First comment. thanks for the video
Winner :-) hope you found it useful, Anwar.
The webpage I am trying to query frequently changes. When refreshing the table, is it possible to maintain the historical data while also pulling in the new information?
You would have to use VBA to automate taking a copy of the data before refreshing the query.
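The merge step after that copy is the easy part: append the fresh pull to the stored history, then drop duplicates. A minimal sketch in Python terms (the row structure here is made up for illustration; the copy itself would be the VBA part):

```python
def merge_history(history, fresh):
    """Append freshly pulled rows to the stored history, dropping duplicates."""
    seen = set()
    merged = []
    for row in history + fresh:  # history first, so the earlier copy wins
        key = tuple(sorted(row.items()))
        if key not in seen:
            seen.add(key)
            merged.append(row)
    return merged
```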
This is great - can you show how to do this with multiple parameters though - cant find anything understandable on the web!!
Please post your question and sample Excel file on our forum where we can help you further: www.myonlinetraininghub.com/excel-forum
@@MyOnlineTrainingHub thanks - will do :)
I love you. You're a genius...
Glad you enjoyed it, Robert!
@@MyOnlineTrainingHub I spent a couple of weeks a year ago trying to do just this. Code, code, code was all I got!
I love Excel. I could give up my day job if I could and just do Excel....
I need to scrape data from a map page that shows thousands of map-pins that each lead to the contact data that I need. Do you have a video already showing that? Any suggestions? TYVM!!
No examples of that. Unless the map data is stored in a table in the web page HTML then you won't be able to scrape it with Excel. You could try Power BI to get data by example: www.myonlinetraininghub.com/power-query-get-data-from-web-by-example
Great tutorial! Helped me a lot. But do you have any idea, why "Add Table Using Examples" won't work and throws this message: "This Stencil app is disabled for this browser"?
Never heard of that before, Peter. It sounds like you're trying to use Power Query online because there's reference to a browser.