Scrape Data from Multiple Web Pages with Power Query

  • Published: 1 Dec 2024

Comments •

  • @robertbartlett3757
    @robertbartlett3757 3 years ago +24

    That is absolutely brilliant!!! I have spent the last two days trying to figure out how to do it in Python, and within 8 minutes you showed me a much easier, more straightforward way.

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  3 years ago +1

      :-) so pleased it was helpful, Robert!

    • @abhinandanaams2613
      @abhinandanaams2613 2 years ago

      @@MyOnlineTrainingHub can I download an epaper as a PDF without coding?

    • @JayPatel-hc8dq
      @JayPatel-hc8dq 1 year ago +1

      lol... literally me too... I got quite far until Python was reading Arabic webpages in hex, and then I threw my laptop out the window!

  • @fentian
    @fentian 28 days ago

    Wow, what an astonishing concept and how wonderfully well you've explained it.
    I've just applied it in Excel PQ to call an API over and over with a number of variables including a date that changes for each iteration, returning JSON data that is then transformed and presented in a pivot table. Thank you Mynda, xxx
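
The pattern described in the comment above — invoking one parameterised query per value and appending the results — can be sketched in Power Query M roughly as follows. The URL, the date range, and the assumption that the API returns a JSON list of records are all illustrative placeholders, not details from the comment:

```powerquery
// Hypothetical sketch: call an API once per date and combine the JSON results.
// "api.example.com" and the response shape are placeholders.
let
    // One date per iteration to feed into the function
    Dates = List.Dates(#date(2024, 1, 1), 7, #duration(1, 0, 0, 0)),

    // A function taking one date and returning that day's records as a table
    GetDay = (d as date) as table =>
        let
            Url     = "https://api.example.com/data?date=" & Date.ToText(d, "yyyy-MM-dd"),
            Source  = Json.Document(Web.Contents(Url)),  // assumed: a list of records
            AsTable = Table.FromRecords(Source)
        in
            AsTable,

    // Invoke the function for every date and append the results
    Combined = Table.Combine(List.Transform(Dates, GetDay))
in
    Combined
```

The same shape works whether the variable part of the URL is a date, a page number, or any other token.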

  • @obinnaduru3815
    @obinnaduru3815 3 years ago +5

    Thank you so much for this video. Very practical for my Data Analyst journey. I followed the steps and didn't run into any errors.

  • @prameelar1753
    @prameelar1753 3 years ago +2

    I watched this video on this Teachers' Day, and I believe you are one of the best teachers who could help me with web scraping... 🤗

  • @abdulhaseeb8027
    @abdulhaseeb8027 4 years ago +3

    It's like you have read my mind, because I was currently looking to scrape data from the web like this. Thanks for the tutorial, it's really helpful.

  • @davegoodo3603
    @davegoodo3603 4 years ago +3

    A bit beyond me at this point Mynda, Power Query is on my "to learn" list. Well presented.

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  4 years ago +1

      Thanks, Dave! Power Query is amazing...I'm confident you'll think so too :-)

  • @geoffreyzziwambazza7862
    @geoffreyzziwambazza7862 2 years ago +1

    To think I was doing this manually 🤦🏽‍♂️. Thank you, this is a huge time saver!

  • @awesh1986
    @awesh1986 4 years ago +1

    Thanks Mynda, there is no way that I would not like this video. It's awesome.

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  4 years ago

      Thanks so much, Awesh! And thanks for sharing it on LinkedIn :-)

  • @davidstevens4064
    @davidstevens4064 2 years ago

    Wow...Easily used this tutorial to query printer settings from every Zebra printer on my LAN. Very helpful!

  • @malaniebanney1634
    @malaniebanney1634 1 year ago

    I slightly adjusted this to scrape data from a folder full of PDF files. Excellent, thanks!

  • @Secret구이구이
    @Secret구이구이 4 years ago

    Thank you!
    It is hard to study this in Korea because there is not much material about Power Query.
    Thanks to this, I integrated several POST APIs into a single query.

  • @jamessawyer8565
    @jamessawyer8565 4 years ago +8

    I wasn't even aware that M/Power Query can be used to such extent. Thank you for the great insight!

  • @biswajeetswaro7831
    @biswajeetswaro7831 4 years ago +1

    Great video, ma'am!!! Before, I was doing this with Python, then saving to CSV, then importing to PBI. Now I can do it with PBI directly 👏👏👏

  • @awesh1986
    @awesh1986 4 years ago

    This is an amazing way of working with web pages. I have seen people write lengthy macros and Python code for this.

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  4 years ago

      Yes, Power Query is super easy to use. I wish more people knew of its powers ;-)

  • @sushicatsan
    @sushicatsan 3 years ago +1

    I knew this was possible, but ran into some errors while trying to do it on my own. Thank you very much for the great tutorial. Now to let Power BI spin!

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  3 years ago

      Glad it helped!

    • @omenaokoro4693
      @omenaokoro4693 1 year ago

      spot on. I was only able to do the first page. This gives me the ability to do an entire site.

  • @markhooper279
    @markhooper279 4 years ago

    That's remarkable; this is like the limit of most people's Python learning, and most co-workers would consider them "dangerous" with those Python abilities (in the most professional and excellent way, of course!).

  • @MichaelHendersonMHC
    @MichaelHendersonMHC 4 years ago +1

    Brilliantly framed and well communicated. Thank you again Mynda.

  • @khalidessaadi8915
    @khalidessaadi8915 24 days ago

    Wonderful job! So clear and perfectly explained, thank you so much!

  • @StephanOnisick
    @StephanOnisick 1 year ago

    Awesome use of M for us tiptoeing into the M Script!

  • @naotoaguilarmorita7079
    @naotoaguilarmorita7079 3 years ago

    Thanks a lot for this tutorial! I could get multiple API calls into a single query, best solution ever!

  • @prashantmanshrestha
    @prashantmanshrestha 3 years ago

    Clear Voice, Beautifully Explained Super-woman.

  • @fabio.s.barbosa
    @fabio.s.barbosa 3 years ago

    Wonderful tutorial! That was exactly what I was looking for. I was duplicating data sources for each week to scrape some web data. Thanks a lot!

  • @victorgabrielcamargo6384
    @victorgabrielcamargo6384 8 months ago

    Wooww, thank you so much; it took me months to find this function. I will try it on a more complicated webpage. Thank you!

  • @vincasvosylius6045
    @vincasvosylius6045 4 years ago

    You are a legend! This helped me solve the greyed-out "change data source" button.

  • @MichaelBrown-lw9kz
    @MichaelBrown-lw9kz 1 year ago

    This is simply awesome, now I have to practice this technique.

  • @merbouni
    @merbouni 4 years ago

    I have never tried this, but I frequently convert data from CSV files to HTML DataTables. Thanks Mynda.

  • @deepakd-w5h
    @deepakd-w5h 1 month ago

    Merci beaucoup, madame. You made my work much easier.

  • @CEYLAN64
    @CEYLAN64 4 years ago +2

    Thank you very much. I'm from Turkey. Have a nice day.

  • @michalvydrzel
    @michalvydrzel 9 months ago

    YOU ARE THE BEST!! Saved me so much work!

  • @julianstarkey9301
    @julianstarkey9301 4 years ago

    Very helpful; a lot fewer complicated Excel formulas in my life now. A shame that challenge has gone, but I had to think a lot about my queries.

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  4 years ago

      Don't be sad that the challenges have gone...there are plenty of new challenges awaiting; M code, DAX, dynamic array functions :-)

  • @shakiraasfoor7599
    @shakiraasfoor7599 4 years ago

    Well Done Mynda
    All Your Videos Are Useful

  • @iankr
    @iankr 7 months ago

    Brilliant! Many thanks, Mynda.

  • @naveedkhowaja4089
    @naveedkhowaja4089 1 year ago

    Excellent tutorial, super easy to follow. That’s brilliant 👍

  • @machadolopes
    @machadolopes 2 years ago

    Amazing how easy it is to scrape web pages. Thanks for this excellent tutorial.

  • @carltonquine9277
    @carltonquine9277 4 years ago

    Wow you're amazing! Can't believe this information is free! Thank you so much!

  • @powerb_i
    @powerb_i 2 years ago

    Great video, thanks; this makes web scraping a lot easier. Thank you.

  • @stephencross4978
    @stephencross4978 1 year ago

    Wow, this is clever and exactly what I needed. My mind is blown!!

  • @mariaalcala5159
    @mariaalcala5159 3 years ago

    Wow, amazing what you can do! Thanks a lot, Mynda, I'm always learning from you!

  • @ramakumarguntamadugu1299
    @ramakumarguntamadugu1299 2 years ago

    Great video... Thanks for the effort and for sharing it. This will be very useful for many tasks...

  • @StephenMattison66
    @StephenMattison66 3 years ago

    Great info, easy to understand. TYVM! I'd love to learn how to do all of this in Google Sheets. Power Query sounds cool!

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  3 years ago

      Glad you liked it, Stephen! Sheets doesn't have Power Query.

  • @peimanhosseini37
    @peimanhosseini37 1 year ago

    Thanks a lot, that was really, really useful. You solved my very big problem. 🙏🙏🙏🙏🙏🙏

  • @darrylmorgan
    @darrylmorgan 4 years ago

    Hi Mynda! Great tutorial, just learnt something new so I can have more fun with Power BI. Thank you :)

  • @Chriiichriii
    @Chriiichriii 3 years ago

    Exactly what I was looking for, thanks! Great video.

  • @AnonymousHunYaar
    @AnonymousHunYaar 2 years ago

    Marvelous! You make it so much easier, thanks a lot.

  • @valentecg8518
    @valentecg8518 1 month ago

    I really appreciate your tutorial! What a money saver! Most data extraction tools are costly.

  • @wrandyrice5447
    @wrandyrice5447 3 years ago

    Mind blown. This is awesome. Thank you.

  • @jamesflieder8164
    @jamesflieder8164 3 years ago

    Great video and so clear with the explanation! My researching will be much easier now!

  • @NadeemShafiqueButt
    @NadeemShafiqueButt 1 year ago

    As always, an excellent tutorial

  • @chrism9037
    @chrism9037 4 years ago

    Super cool video, thanks Mynda

  • @ssomtom
    @ssomtom 2 years ago

    Beautiful. It's solved my actual problem. Thx. :)

  • @Ismail-Yahya
    @Ismail-Yahya 4 years ago +2

    Web scraping, oh I love it 😊

  • @adamsteele44
    @adamsteele44 2 years ago

    Wow. Amazing video, thank you!

  • @arturodimas6988
    @arturodimas6988 4 years ago

    Thanks a lot, it was terrific. I'm from México.

  • @shrikantbadge3978
    @shrikantbadge3978 1 year ago

    I still need to watch this video a few times. I bet our entire organization doesn't know this.

  • @bryandadiz5677
    @bryandadiz5677 2 years ago +1

    The website is no longer updated.

  • @rakkesh85
    @rakkesh85 4 years ago

    Nicely explained, loved it.

  • @03mariadelmar
    @03mariadelmar 2 years ago

    Hi! Your tutorial is very clear. However, what if the web page you are trying to access needs your credentials first? Do you know how I can go around that? Thank you!

  • @nazaarshadir
    @nazaarshadir 3 years ago +1

    Another great lesson. I have a website with unstructured data for many items. I need specific values for each item from the site. Please, how may I do it automatically and quickly? cftc .gov/dea/futures/deacmesf . htm
    I only need the LONG and SHORT values for each code. Thanks.

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  3 years ago +1

      Great to hear, Nazaar! The URL provided isn't right. Please post your question and sample Excel file on our forum where we can help you further: www.myonlinetraininghub.com/excel-forum

    • @nazaarshadir
      @nazaarshadir 3 years ago +1

      @@MyOnlineTrainingHub thanks for the quick reply. I just joined the forum. Your forum is clean and organized. Looking forward to learning more. Thanks.

  • @jbjs5820
    @jbjs5820 2 years ago

    Excellent work. Just a question: when I try to refresh it in the service it doesn't allow it, indicating "This dataset includes a dynamic data source. Since dynamic data sources aren't refreshed in the Power BI service, this dataset won't be refreshed". Any workaround?

  • @gest4mp
    @gest4mp 1 year ago

    I don't know you, but I love you. Thanks!

  • @hamidsh4789
    @hamidsh4789 4 years ago

    Excellent as usual...

  • @gaia5141
    @gaia5141 2 years ago

    AMAZING!!! THANK YOU SO MUCH!!!

  • @Kingleer69
    @Kingleer69 2 years ago

    Mynda-
    @ 1:18 - Instead of the 13 HTML table options listed in your ‘Navigator’ dialog box, when I try to run the same Power BI query on my end I am getting only 5 tables (Table 0 through to Table 5, plus an additional Document table), and in these tables there is hardly any data to work with.
    Please advise.

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  2 years ago

      I get that now too. I guess the website has changed its layout. Table 3 is the one you want.

    • @Kingleer69
      @Kingleer69 2 years ago

      @@MyOnlineTrainingHub Thank you, Mynda.

  • @eslamfahmy87
    @eslamfahmy87 9 months ago

    Thank you. One more thing: my pages contain PDF files, and I need to add another column which contains that PDF, accessible by a link.

  • @harigokul4450
    @harigokul4450 4 years ago

    Thank you, Madam, for this useful info! But I need to know how I can scrape pages which have infinite scrolling using Power BI. Looking forward to your suggestion!

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  4 years ago

      Hi Hari, if the pages are loading the data on the fly and the data isn't in an html table or visible on the page, then I'm not aware of a way to do that, sorry.

  • @iliyatsekov6044
    @iliyatsekov6044 2 years ago

    Many thanks for the video! What if I have two variable names? My URL includes both a year and a quarter. I created the two variable names but how do I invoke the function to take all quarters from every year?

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  2 years ago +1

      Make a table containing the string made up of the quarter and year components and whatever other characters form that section of the URL, and feed that into a single variable.
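
The reply above — combining two URL components into one string and feeding it to a single-parameter function — might look like this in M. The URL pattern, year and quarter values, and `Web.Page(...){0}[Data]` table index are all illustrative assumptions:

```powerquery
// Hypothetical sketch: build one "period" string per year/quarter pair
// and feed it into a single-parameter page-fetching function.
let
    Years    = {2021, 2022, 2023},
    Quarters = {"Q1", "Q2", "Q3", "Q4"},

    // Cross-join years and quarters into one string per combination, e.g. "2021-Q1"
    Periods  = List.Combine(
                   List.Transform(Years,
                       (y) => List.Transform(Quarters,
                           (q) => Number.ToText(y) & "-" & q))),

    // One variable carries the whole combined section of the URL
    GetPage  = (period as text) as table =>
        Web.Page(Web.Contents("https://example.com/report/" & period)){0}[Data],

    Combined = Table.Combine(List.Transform(Periods, GetPage))
in
    Combined
```

The same trick generalises to any number of URL components: pre-build the combined strings in a table or list, and the function itself stays single-parameter.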

  • @stitch6410
    @stitch6410 1 month ago

    Works great, thanks!

  • @DougHExcel
    @DougHExcel 4 years ago

    Scraping with PowerBI, hopefully it'll be fully enabled in Excel!

  • @msoffice6037
    @msoffice6037 4 years ago

    Very useful! Thank you!

  • @MrDavitonio
    @MrDavitonio 4 years ago +1

    Amazing, thank you 👌🏻👌🏻👌🏻

  • @gabapritam
    @gabapritam 1 year ago

    This is AWESOME!!

  • @仁です
    @仁です 1 year ago

    It's useful. Thank you. I am looking for similar data scraper software. Would you mind showing me how to work with Power BI in the case of different websites, please?

  • @briandennehy6380
    @briandennehy6380 4 years ago

    This is top notch stuff thanks

  • @StephanOnisick
    @StephanOnisick 1 year ago

    The problem I have is that I want to capture the URLs and I keep getting "No CSS"

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  1 year ago

      Power Query isn't designed to capture the URL, it's designed to scrape the data from the page.

  • @ritvikbolugudde8688
    @ritvikbolugudde8688 2 years ago

    Thanks a lot!! I was wondering, if the web page is updated, would the loaded data in Power BI update too (so basically, whether it's real time or not)?

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  2 years ago

      Only direct query datasets can refresh real time, however, you can schedule refreshes at set intervals.

  • @m_shakes
    @m_shakes 3 years ago +1

    Amazing video and awesome ideas that I incorporated instantly! Quick question, how would you go about making each "page" into a separate query (each page a query on its own)?

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  3 years ago +2

      Glad you liked it, Mohammed! To make each page a separate query, you'd have to create them one by one by pasting in the URL for each page, or copying the query and modifying the URL to point to a different page.

    • @m_shakes
      @m_shakes 3 years ago +1

      @@MyOnlineTrainingHub Thanks for your prompt reply!

  • @arisekobain6400
    @arisekobain6400 4 years ago

    Thanks so much for all your playlist video tutorials... awesome... please add a tutorial for network or system monitoring with Excel... many thanks :-)

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  4 years ago +1

      My pleasure :-) I don't have any data I can use on network or system monitoring, sorry.

    • @arisekobain6400
      @arisekobain6400 4 years ago

      @@MyOnlineTrainingHub tysm for your feedback; no problem, many thanks..

  • @bali501
    @bali501 2 years ago

    Thank you soooo much! You changed my life this weekend. I've been struggling with Excel's limitations for years, and lost countless hours of my life, sometimes without even accomplishing my goal. I only discovered the existence of Power Query last night with your video, and you blew my mind. A brilliantly well presented and comprehensive video on it too! It got me partway through my current problem, but now I'm stuck again, if you can help?
    I've created Query1, which gets multiple tables from each webpage, with 10 records each, and includes a record ID. But each record has a link to a details page with more info for that record. The record ID is used within the URL string to get those details. Can I create a single query that collects the list of records and uses the ID to also collect the details for each record, all in one go?
    Also, with 30,000 records in total, it takes hours to refresh. However, as the historic records don't change, and have a historic date of filing, is there any way for future updates to only get and append the latest records (with a filing date after the last date of the previous dataset), whilst removing any duplicates?
    Finally, it would be great if a timestamp could be added in an additional column to denote the date when that query was run, so that I can easily see which data has been added and when. Is any of this possible with Power Query?

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  2 years ago +2

      So pleased that my video was helpful! Please post your questions and sample Excel file on our forum where we can help you further: www.myonlinetraininghub.com/excel-forum
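
For the first part of the question above — using each record's ID to fetch its details page in the same query — a common Power Query pattern is a custom column that invokes a fetch function per row. A rough sketch; the `RecordList` query name, URL pattern, and expanded column names are all made-up placeholders:

```powerquery
// Hypothetical sketch: for each record ID in a list table, fetch that
// record's details page and expand it alongside the original columns.
let
    // Assumes an existing query "RecordList" with a "RecordID" text column
    Source     = RecordList,

    GetDetails = (id as text) as table =>
        Web.Page(Web.Contents("https://example.com/details/" & id)){0}[Data],

    // One details table per row; errors become null so one bad ID
    // doesn't break the whole refresh
    WithDetail = Table.AddColumn(Source, "Details",
                     each try GetDetails([RecordID]) otherwise null),

    // Placeholder field names - replace with the real detail columns
    Expanded   = Table.ExpandTableColumn(WithDetail, "Details",
                     {"FilingDate", "Status"})
in
    Expanded
```

Note this still re-fetches every details page on refresh; true incremental append, as asked about later in the comment, is not something a plain query like this provides.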

  • @s1ngularityxd64
    @s1ngularityxd64 3 years ago

    on point, awesome video

  • @ДенисДементьев-т3о
    @ДенисДементьев-т3о 2 years ago

    Great video! Extremely useful.
    It works in my case, but only for the first 19 pages out of 89.
    Starting from the 20th page I get a blank page without any data; however, I can see pages 20 to 89 via the browser.
    I would appreciate it if you could show how many pages can be exported in your exact example.

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  2 years ago

      Sounds like the web site is throttling the feed so you can't get the data. Not much you can do about this, other than try splitting the task into multiple queries and running them one at a time.

    • @gzfraud
      @gzfraud 1 year ago

      @@MyOnlineTrainingHub
      Good News 1..... Solving the throttle problem. When PQ and BI won't work, I use Instant Data Scraper. It's a free Chrome extension and works 95% of the time. It lets you set a time delay before going to the next page. I usually start at 12 seconds, then decrease the delay by 1 second every 100 pages or so, down to about 4 or 5 seconds. The most I've ever done is scrape more than 40,000 pages on a website.
      It scrapes only when the webpage is active, so if you navigate to a different webpage tab it pauses. To restart scraping, simply make that page active, i.e. displaying, and click Start Scraping. To prevent pausing, simply drag the webpage out to be stand-alone before starting IDS.
      Good News 2..... it does something that PQ and BI don't do: it extracts embedded URLs. Say email addresses are embedded in people's names. PQ and BI will import the names (as plain text), but I've never figured out how to get them to extract the embedded email address. IDS does extract the embedded URL.
      Bad News.... IDS doesn't connect to the website, so you can't "refresh" the query like you can with PQ and BI.

  • @djamelboulila1278
    @djamelboulila1278 4 years ago

    you are a genius👏

  • @austinbright-j3o
    @austinbright-j3o 3 months ago

    Can you get around captchas for more advanced stuff?

  • @reng7777
    @reng7777 4 years ago

    Something that I would say is important to mention is that using Power Query in Excel gives a lot of problems, since it consumes a lot of RAM. I had a really bad experience with the PQ tool in Excel, even though I reduced the steps and the volume of imported data to the minimum I could, so that is something that MS has got to improve for sure.

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  4 years ago

      Hi Rene, occasionally you will come up against limits like you describe, but I would say this is the exception rather than the norm. Sometimes there are settings and alternate approaches to your query's structure that can alleviate performance issues like you describe. Sometimes it's an issue with your PC's specifications e.g. not enough RAM, not 64-bit Excel etc.

  • @charlesmcdermott282
    @charlesmcdermott282 4 years ago

    Awesome! I managed to import a table for one page from a URL. It is a list of books; unfortunately, the number of books per web page varies. Is there a way to handle generating each page number in this case? As a backup, is there a method of exporting all pages to a CSV file and using Load & Transform to bring the CSV back into PBI or PQ?

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  4 years ago

      Glad it was useful, Charles. In terms of figuring out the number of items on a page, I'm not sure there's any way to do that in advance of accessing the pages. Whether there's a way to export the pages to a csv file would be down to that website and whether it offers that as an option. It's not something Power Query can do.

  • @eo4922
    @eo4922 2 years ago

    Incredible overview, thank you so much! Is it possible to do this if you have a site with multiple pages that uses the same URL? I'm trying to scrape data from a public site with multiple pages, but all of them use the same URL - there are no unique identifiers (e.g. page numbers). Any assistance would be greatly appreciated.

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  2 years ago +1

      Glad it was helpful. Unfortunately, if the site's URL doesn't change, then you can't scrape the data with Power Query.

    • @eo4922
      @eo4922 2 years ago

      @@MyOnlineTrainingHub Understood. Could you recommend any other options that may be helpful? Thank you in advance.

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  2 years ago

      Only to say that if you know JavaScript (I don't) you can write some code to change the 'page' displayed so you can get the data.

  • @johng5295
    @johng5295 1 year ago

    Thanks a million.

  • @WeKnowIt100
    @WeKnowIt100 3 years ago

    Hi! I have encountered a login page before the page that I need to scrape. Is there any way I can bypass the page or key in the credentials?

  • @youse3
    @youse3 2 months ago

    Thank you so much for this video. What if we have "read more" instead of page numbers?

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  2 months ago +1

      Power Query typically can't see 'read more' information unless it's already in the page HTML. If it's generated using JavaScript, then you can't scrape it.

    • @webscrapingseniors
      @webscrapingseniors 1 month ago

      Power Query can struggle with data generated by JavaScript after the initial page load. In such cases, consider using a web scraping tool like Selenium, which can handle JavaScript and interact with 'read more' buttons to load additional content. This way, you can extract all the necessary information from the page. Let me know if you need more guidance!

  • @marosbrezovsky751
    @marosbrezovsky751 2 years ago

    Assuming the bookstore extends the number of pages over time, how can I set it up so that the query will check all available pages? I can't set it up that way, because when it checks URLs which don't exist yet, it stops the scraping procedure. Is it possible to fix this somehow?

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  2 years ago

      You can build in some error handling so that when it gets to a page that doesn't exist it doesn't break the code: ruclips.net/video/Iuo-iTp8aFc/видео.html
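
The error handling suggested in the reply above usually takes the form of wrapping the page fetch in `try ... otherwise` and discarding the pages that fail, so the query survives page numbers that don't exist yet. A minimal sketch; the URL pattern and the upper page bound are placeholders:

```powerquery
// Hypothetical sketch: request more pages than currently exist and
// discard the ones that fail, so newly added pages are picked up later.
let
    Pages    = {1..200},  // deliberately larger than the current page count

    GetPage  = (p as number) =>
        try Web.Page(Web.Contents(
                "https://example.com/catalogue/page-" & Number.ToText(p) & ".html")){0}[Data]
        otherwise null,   // a missing page yields null instead of an error

    Results  = List.Transform(Pages, GetPage),
    NonEmpty = List.RemoveNulls(Results),
    Combined = Table.Combine(NonEmpty)
in
    Combined
```

The trade-off is that every refresh still issues requests for the non-existent pages, so keep the upper bound reasonable.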

  • @shoaibrehman9988
    @shoaibrehman9988 4 years ago

    Useful video. Thanks

  • @wayneedmondson1065
    @wayneedmondson1065 4 years ago +1

    Hi Mynda.. another great example and technique. Thanks for sharing it :)) Thumbs up!!
    PS - Any idea when the Add Table Using Examples feature will come to Power Query in Excel in Microsoft 365?

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  4 years ago +1

      Thanks, Wayne! No idea when Excel will get Add Table Using Examples :-( it has been available in Power BI for quite a while now, but that doesn't seem to mean anything.

  • @stevewilson1544
    @stevewilson1544 2 years ago

    Good afternoon. I followed your instructions; however, instead of producing the results from the subsequent URL pages, it just mirrored the results from the first page. Any ideas? Thanks, Steve

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  2 years ago

      Hi Steve, sounds like the page number isn't being changed with each iteration. Hard to say more without seeing the file.

    • @stevewilson1544
      @stevewilson1544 2 years ago +1

      @@MyOnlineTrainingHub thanks for your help. All rectified now.

  • @sadinenim5360
    @sadinenim5360 2 years ago

    Can you do a video on how we can scrape data after logging into a portal with our credentials and then fetch the data?

  • @zTurtle13
    @zTurtle13 2 years ago

    I followed the instructions but my tables are returning with the same items from page one as opposed to the items on the second page. Any ideas what I may have done wrong?

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  2 years ago

      Please post your question and sample Excel file on our forum where we can help you further: www.myonlinetraininghub.com/excel-forum

  • @parvez301
    @parvez301 4 years ago +1

    First comment. Thanks for the video!

  • @calleranchero3212
    @calleranchero3212 3 years ago

    The webpage I am trying to query frequently changes. When refreshing the table, is it possible to maintain the historical data while also pulling in the new information?

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  3 years ago

      You would have to use VBA to automate taking a copy of the data before refreshing the query.

  • @peterh7842
    @peterh7842 2 years ago

    This is great! Can you show how to do this with multiple parameters though? I can't find anything understandable on the web!!

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  2 years ago

      Please post your question and sample Excel file on our forum where we can help you further: www.myonlinetraininghub.com/excel-forum

    • @peterh7842
      @peterh7842 2 years ago

      @@MyOnlineTrainingHub thanks - will do :)

  • @robertcooper3759
    @robertcooper3759 4 years ago

    I love you. You're a genius...

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  4 years ago +1

      Glad you enjoyed it, Robert!

    • @robertcooper3759
      @robertcooper3759 4 years ago

      @@MyOnlineTrainingHub I spent a couple of weeks a year ago trying to do just this. Code, code, code was all I got!
      I love Excel. I could give up my day job if I could and just do Excel....

  • @StephenMattison66
    @StephenMattison66 3 years ago

    I need to scrape data from a map page that shows thousands of map-pins that each lead to the contact data that I need. Do you have a video already showing that? Any suggestions? TYVM!!

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  3 years ago

      No examples of that. Unless the map data is stored in a table in the web page HTML then you won't be able to scrape it with Excel. You could try Power BI to get data by example: www.myonlinetraininghub.com/power-query-get-data-from-web-by-example

  • @petermcallister908
    @petermcallister908 2 years ago

    Great tutorial! It helped me a lot. But do you have any idea why "Add Table Using Examples" won't work and throws this message: "This Stencil app is disabled for this browser"?

    • @MyOnlineTrainingHub
      @MyOnlineTrainingHub  2 years ago

      Never heard of that before, Peter. It sounds like you're trying to use Power Query online because there's reference to a browser.