The fastest website ever?

  • Published: Feb 1, 2025

Comments •

  • @aayushnarayanofficial
    @aayushnarayanofficial 2 months ago +691

    Great example to show that choosing the right technology will not automatically make the website fast. You have to write good code.

    • @Z4KIUS
      @Z4KIUS 2 months ago +28

      good code in bad tech is often faster than bad code in good tech

    • @JohnSmith-op7ls
      @JohnSmith-op7ls 2 months ago +6

      @@Z4KIUS Trivial performance gains like this rarely matter to begin with. Spend your time addressing issues that cost real money or adding features that make it.
      Chasing tiny page load speeds is just mindless busywork.

    • @Z4KIUS
      @Z4KIUS 2 months ago +7

      @@JohnSmith-op7ls good feature beats minuscule speed improvements, but big speed regressions at some point beat any features

    • @JohnSmith-op7ls
      @JohnSmith-op7ls 2 months ago

      @ But this isn’t about addressing relevant performance issues, it’s about pointlessly squeezing out a bit more, in a contrived demo, just for the sake of it.

    • @ulrich-tonmoy
      @ulrich-tonmoy 2 months ago

      why not make these features part of the framework instead

  • @alexmortensen6901
    @alexmortensen6901 2 months ago +597

    Interestingly enough, the edge McMaster has is not that their website is so insanely fast, it's that everything you order will be delivered to your company in a couple of hours. So if you think the page loading is fast, check out their delivery, lol

    • @drooplug
      @drooplug 2 months ago +16

      Better than amazon!

    • @thewhitefalcon8539
      @thewhitefalcon8539 2 months ago +50

      Their other edge is that they have every conceivable product. They are all around a premium quality service with high prices to match. When you need something specific, fast, to exact specifications and perfect every time, you use this company. When price matters more, you try your luck on Ali.

    • @drooplug
      @drooplug 2 months ago

      @thewhitefalcon8539 I'll say they have a massive selection, but I often do not find what I am looking for there.

    • @Ginto_O
      @Ginto_O 2 months ago +2

      couple hours delivery is quite slow for Russia. The delivery here is usually 15 to 30 minutes

    • @allenklingsporn6993
      @allenklingsporn6993 2 months ago +15

      ​@@Ginto_O You clearly don't live in a rural area of Russia. McMaster delivers ANYWHERE in the Continental US that fast.

  • @elephunk6898
    @elephunk6898 2 months ago +328

    Worked at McMaster for a few years. This kind of glosses over how we're also able to perfectly sort, filter, and serve up data on over half a million different part numbers. There's a looooot of stuff going on in the backend for this

    • @t3dotgg
      @t3dotgg  2 months ago +74

      It’s very very impressive stuff, especially for how long it’s existed and worked. I wish more of that info was public so I could have talked in depth about it 🙃

  • @drooplug
    @drooplug 2 months ago +314

    I like how theo thinks McMaster's competitive edge is their website and not that they knock on your door with your parts 3 minutes after you complete the order. 😄

    • @tsunami870
      @tsunami870 2 months ago +22

      I live right next to a warehouse so for me it's more like 1 minute 😂

  • @DanielCouper-vf5zh
    @DanielCouper-vf5zh 2 months ago +36

    I've used this as my go-to pat response to "can you give me an example of good web design/UX/UI" in interviews for years, it's great that it's getting attention now 🎉

  • @mbainrot
    @mbainrot 2 months ago +218

    The craziest shit with McMaster-Carr though... is it's even fast for Australians. And we can't even buy shit from them without shenanigans

    • @bugged1212
      @bugged1212 2 months ago

      No one cares about Australia, it's irrelevant to world affairs. Shoo.

    • @Coppertine_
      @Coppertine_ 2 months ago

      4-6.. second.. load times on every page..

    • @kaldogorath
      @kaldogorath 1 month ago

      @@Coppertine_ That's fast for Oz isn't it?

    • @Coppertine_
      @Coppertine_ 1 month ago

      @@kaldogorath not really.. compared to the next version, it's instant speeds

  • @rikschaaf
    @rikschaaf 2 months ago +21

    13:45 Prefetching is great! When I started experimenting with HTMX, I immediately turned that on there as well (it supports both on mouse down and on hover, depending on your preferences). Great to see that Next.js also supports it.

    • @gillesfrancois2278
      @gillesfrancois2278 2 months ago +1

      But is it really a good idea to download all the product pages when the user scrolls? Can you imagine the server cost of doing that on a production site with a lot of visitors?
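
The hover-prefetch pattern this thread is discussing can be sketched in a few lines of plain JavaScript. This is a minimal sketch, not McMaster's or NextFaster's actual code; the 65 ms delay and the helper names are my own assumptions:

```javascript
// Minimal sketch of prefetch-on-hover (hypothetical helper names).
// A Set dedupes so each URL is fetched at most once per session.
const prefetched = new Set();

function shouldPrefetch(url) {
  if (prefetched.has(url)) return false;
  prefetched.add(url);
  return true;
}

function attachHoverPrefetch(link, fetchFn = fetch) {
  let timer = null;
  link.addEventListener('mouseenter', () => {
    // A small delay (assumed 65 ms) filters out accidental mouse passes,
    // which also addresses the cost concern raised in the reply above.
    timer = setTimeout(() => {
      if (shouldPrefetch(link.href)) fetchFn(link.href);
    }, 65);
  });
  link.addEventListener('mouseleave', () => clearTimeout(timer));
}
```

The dedupe set is what keeps the server bill sane: repeated hovers over the same link cost nothing after the first fetch.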

  • @ChaseFreedomMusician
    @ChaseFreedomMusician 2 months ago +28

    So one of the things you seemed to miss was that this was a classic ASP.NET 4.5 website. So the tech for this is about 15 years old. All that JavaScript at 4:45 is auto-generated. The back page for this is much simpler.

  • @webengineeringhistory
    @webengineeringhistory 2 months ago +6

    This is an interesting intersection between web development and ux. The site has amazing ux and software engineering.

  • @allenklingsporn6993
    @allenklingsporn6993 2 months ago +28

    McMaster-Carr, shouldering the weight of America's industrial might since 1901.

  • @alexmortensen6901
    @alexmortensen6901 2 months ago +28

    As an engineer, McMaster is the greatest website known to man

  • @SidTheITGuy
    @SidTheITGuy 2 months ago +5

    The depths that you go to is honestly, unreal. I can only imagine what it takes to put these videos out. Kudos to you, my man!

    • @t3dotgg
      @t3dotgg  2 months ago +7

      @mohitkumar-jv2bx as I mentioned in my reply above, the only one "conveniently missing" points here is you.
      1) All of these examples use real databases. The DB for NextFaster has millions of entries. It's a fair comparison.
      2) I've seen the bill for this project. Hosting costs are under $20 a month despite the absolutely insane amounts of traffic.

    • @AshesWake-sf7uw
      @AshesWake-sf7uw 2 months ago

      @@t3dotgg Just $20 for that many requests 💀. I mean I get it, most of these are just favicon/really small requests which don't take a lot of bandwidth, but the amount of requests a single user generates on this site is just absurd. So, that low price is indeed shocking.

  • @MikkoRantalainen
    @MikkoRantalainen 2 months ago +64

    The real magic: accept 2h delay for every change and you can cache *everything* for 2h.

    • @PraiseYeezus
      @PraiseYeezus 2 months ago +3

      the realer magic: 300ms delay for every change and caching things only after they're requested

    • @xanderplayz3446
      @xanderplayz3446 2 months ago

      @@PraiseYeezus realest magic: cache everything in the browser IndexedDB, and store a hash, so when the hash sent from the server to the client is different, the client downloads everything over again

    • @xiaoluwang7367
      @xiaoluwang7367 2 months ago

      @@PraiseYeezus is this actually how McMaster works??

    • @PraiseYeezus
      @PraiseYeezus 2 months ago

      @@xiaoluwang7367 no that's how Vercel's infra works
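
The hash-checking cache idea from this thread can be sketched roughly like this. It is my own sketch, not Vercel's or McMaster's implementation; a Map stands in for IndexedDB, and the ETag header is assumed to serve as the content hash:

```javascript
// Sketch: cache page payloads keyed by URL, alongside a content hash.
// When the server's hash differs, the cached copy is stale.
const cache = new Map(); // stand-in for IndexedDB

function isFresh(entry, serverHash) {
  return entry !== undefined && entry.hash === serverHash;
}

async function loadWithHashCheck(url, fetchFn = fetch) {
  const head = await fetchFn(url, { method: 'HEAD' });
  const serverHash = head.headers.get('etag'); // ETag assumed as the hash
  const entry = cache.get(url);
  if (isFresh(entry, serverHash)) return entry.body; // cache hit, no download
  const res = await fetchFn(url); // hash changed: download everything again
  const body = await res.text();
  cache.set(url, { hash: serverHash, body });
  return body;
}
```

Only the cheap HEAD request hits the network on a warm cache; the full payload is re-downloaded exclusively when the hash moves.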

  • @NebulaCoding
    @NebulaCoding 26 days ago +1

    As someone who makes games and web apps, I think mouse down should be the default for games and mouse up should be the default for web apps, but both have uses in each case. The biggest feature of mouse down is responsiveness, paramount for game actions, and mouse up's biggest thing is that you can cancel (most) actions by dragging before letting go, which has saved me personally from misclicking more times than I can count. If you're making a menu in a game, or you need drag or selection options, you might consider mouse up. If you want an action on a web app to be responsive above all else you should consider mouse down. Neither one is really a catch-all for everything.
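
The "cancel by dragging away" behavior of mouse up described above boils down to a small guard like this (a hypothetical helper, not any framework's API):

```javascript
// Sketch: a mouse-up click only "counts" if the button is released over
// the same element it was pressed on, so dragging away before release
// cancels the action.
function makeClickGuard() {
  let pressedOn = null;
  return {
    onMouseDown(target) { pressedOn = target; },
    onMouseUp(target) {
      const fire = pressedOn !== null && pressedOn === target;
      pressedOn = null; // reset for the next press
      return fire; // true => run the action
    },
  };
}
```

A mouse-down handler, by contrast, would fire the action inside onMouseDown immediately, trading this escape hatch for responsiveness.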

  • @spageen
    @spageen 2 months ago +96

    McMaster-Carr Speedrun (100%, glitchless)

    • @hqcart1
      @hqcart1 2 months ago +5

      jquery baby, oh yaah, yo heard me right

  • @henriquematias1986
    @henriquematias1986 2 months ago +18

    We all did things like this back in the 90's/00's and it worked like a charm, no frameworks, no jQuery

    • @PraiseYeezus
      @PraiseYeezus 2 months ago +7

      which site did you build that performs this well?

    • @henriquematias1986
      @henriquematias1986 2 months ago

      @@PraiseYeezus Brazilian Channel 5 ( Rede Globo ) covering the Olympics in 2004 is a good example; we had super tight requirements with the size of the CSS and imagery.
      Basically, back in the day, at the end of 90's beginning of 00, you had to make websites that performed well because broadband wasn't so well spread, especially in South America.
      So it was expected that designers would know how to compress images and videos to the maximum amount of compression possible.
      Often, internet banners had incredibly low limits in size, so everyone back in the day would squeeze as many KB as possible of every single file.
      Nowadays, a lot of "designers" and "developers" will put images online without even trying to compress or make them the correct size before slapping them online.

    • @henriquematias1986
      @henriquematias1986 2 months ago

      @@PraiseYeezus for some reason my comment keeps being deleted, so I will rewrite it briefly. I wrote the main Brazilian website (by our main TV channel, which was "the official channel") covering the Athens Olympics in the early 00's, and many other websites. At the time broadband wasn't so popular, and everyone on the team, designers, developers, and project managers, was well aware of file sizes and compression types; most projects had strict rules for file sizes and page load times. The XMLHttpRequest API was standard, and so was having different if conditions for different popular browsers; jQuery was not there yet.

    • @randomuser66438
      @randomuser66438 2 months ago +1

      No jQuery before HTML5 and ES6 sounds like an awfully bad decision

  • @JenuelGanawed
    @JenuelGanawed 2 months ago +2

    this is really good to implement in an ecommerce website... it makes shopping online really really fast.

  •  2 months ago +4

    Man... I absolutely love your honesty when doing ads! Seriously!

  • @dgoenka1
    @dgoenka1 2 months ago +4

    I noticed a small but significant tweak that probably helps a lot: B&W images... they probably get a lot of savings from the compression, on top of the fact that the images here are all small... the result: the browser is done loading and rendering the images quicker

  • @theexploderofworlds3855
    @theexploderofworlds3855 2 months ago +1

    Used to work in a machine shop and would pick up hardware from one of their warehouses regularly. Great customer service and hardware, great company.

  • @m4lwElrohir
    @m4lwElrohir 2 months ago +26

    except for image flickering, pretty smooth UX

  • @lev1ato
    @lev1ato 2 months ago +14

    I have learned a lot from this video, more videos like this would be awesome

  • @eddie_dane
    @eddie_dane 2 months ago +14

    25:32 Good sir, everything here is magical if I think back to the days of vanilla and jQuery, but I get your point.

  • @nemopeti
    @nemopeti 2 months ago +25

    What about server/CDN and network costs for this amount of prefetching? How does it work on mobile clients, where there is no hover event?

    • @shirkit
      @shirkit 2 months ago +21

      There's no free lunch. The original project does the same. You can choose not to preload the images if you're worried about that, only the HTML content.
      I'm gonna tell you, for my company the price of traffic is easily covered by the improved user experience.
      Also, on mobile you can track the viewport and prefetch when an item has been visible for a certain amount of time, or use some other metric; you'd need to research your particular use case, or not prefetch images and only prefetch the HTML for everything.
      Trade-offs are always there.

    • @ibnurasikh
      @ibnurasikh 2 months ago +17

      It's an eCommerce site, so network and bandwidth costs are very very low compared to the revenue generated from sales. However, load speed is crucial. I've seen a 30% drop in CTR/visitors when my website's page load time is slow.

    • @chri-k
      @chri-k 1 month ago

      @@ibnurasikh why tf are people so impatient

    • @ibnurasikh
      @ibnurasikh 1 month ago

      ​@@chri-k We may not fully understand why, but it’s a fact: an average discrepancy of 30% is common in tracking data. Personally, I’ve experienced decreases as high as 50%. A discrepancy of 4-10% is generally acceptable because some users may accidentally click the link without genuine intent.
      Now, imagine 10,000 users click your link, but analytics only registers 7,000 page views. That’s a huge gap! Every click has a cost. For example, if your cost-per-click (CPC) is $0.50, losing 3,000 clicks means losing $1,500. What a mess!
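
The mobile suggestion earlier in this thread (prefetch once an item has been visible in the viewport for a certain amount of time) could be sketched like this. The dwell threshold and helper names are assumptions; in a browser you would drive it from an IntersectionObserver:

```javascript
// Sketch: prefetch only after an item has been continuously visible
// for a minimum dwell time (assumed 300 ms).
function makeDwellTracker(minDwellMs = 300) {
  const visibleSince = new Map(); // id -> timestamp it became visible

  return {
    // In the browser, call this from an IntersectionObserver callback:
    //   tracker.onVisibilityChange(e.target.id, e.isIntersecting, performance.now())
    onVisibilityChange(id, isVisible, now) {
      if (isVisible) visibleSince.set(id, now);
      else visibleSince.delete(id); // scrolled away: reset the clock
    },
    shouldPrefetch(id, now) {
      const since = visibleSince.get(id);
      return since !== undefined && now - since >= minDwellMs;
    },
  };
}
```

The dwell requirement plays the same role as the hover delay on desktop: items that merely flash past during a fast scroll never trigger a request.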

  • @filipturczynowicz-suszycki7728
    @filipturczynowicz-suszycki7728 2 months ago +1

    Great breakdown Theo!!

  • @juanenriquesegebre8873
    @juanenriquesegebre8873 2 months ago +7

    Love how at 3:57 he goes out of topic to compliment how pretty the nuts on this website are.

  • @jepqmw
    @jepqmw 2 months ago +7

    the same thing is implemented in SoundCloud: when you hover on a track it loads the buffer, and once you click, the loaded buffer starts playing

  • @AndreiLiubinski
    @AndreiLiubinski 2 months ago +4

    3:57 >>that's pretty nuts
    Yes, those are pretty nuts. And bolts.

  • @hunter2473
    @hunter2473 2 months ago +8

    The images on McMaster are actually sprites

    • @tom_marsden
      @tom_marsden 2 months ago +3

      Great point. With sprites you are fetching far fewer images and then just using offsets.
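
The sprite-offset technique mentioned here reduces to simple arithmetic (a generic sketch with an assumed fixed cell size, not McMaster's actual sheet layout):

```javascript
// Sketch: map an icon index to a CSS background-position inside a
// sprite sheet laid out in fixed-size cells, row by row.
function spriteOffset(index, cellWidth, cellHeight, columns) {
  const col = index % columns;
  const row = Math.floor(index / columns);
  // Negative offsets shift the sheet so only the wanted cell shows.
  return `-${col * cellWidth}px -${row * cellHeight}px`;
}

// Usage: one downloaded image serves every icon.
// el.style.backgroundImage = 'url(sprites.png)';
// el.style.backgroundPosition = spriteOffset(5, 40, 40, 4);
```

One HTTP request for the whole sheet replaces dozens of tiny image requests, which is where the win comes from.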

  • @lever1209
    @lever1209 1 month ago

    I'm typically a JavaScript hater coming from a backend background, but THIS is what JS was meant to do, and people need to see and respect this more

  • @maazmunir9213
    @maazmunir9213 2 months ago

    This was a good video, learnt a lot, thanks!

  • @GuiChaguri
    @GuiChaguri 2 months ago +3

    I wonder how this project would perform on a self-hosted environment. We all know Vercel does a bunch of special optimizations for Next hosted in their cloud. I'm guessing it will still run pretty fast, but some of these optimizations will not work out of the box or not work at all

  • @brileecart
    @brileecart 2 months ago +10

    As a purchase manager that orders from McMaster CONSTANTLY, it's wild to me every time their website gets talked about. Worlds colliding or something lol

  • @KvapuJanjalia
    @KvapuJanjalia 2 months ago +6

    Looks like they are using good ol' ASP.NET Web Parts technology, which is considered dead nowadays.

  • @radiozradioz2419
    @radiozradioz2419 2 months ago +22

    Can Theo just appreciate a good website without dunking on it and shilling NextJS? He doesn't need to be so defensive all the time.

  • @BenoitStPierre
    @BenoitStPierre 2 months ago +2

    I'm really interested to hear why you're coming around to mousedown for interactions. I'm still in the mouseup camp but I haven't dug into it much and would love to hear what the arguments are! Future video?

  • @aymenbachiri-yh2hd
    @aymenbachiri-yh2hd 2 months ago +2

    This is awesome

  • @m-ok-6379
    @m-ok-6379 2 months ago +29

    95% of React (or any other JS framework) developers can't build a website as fast as the McMaster site.

    • @nicholasmaniccia1005
      @nicholasmaniccia1005 2 months ago +3

      I love how his take is "this loads a ton of JS"... it's like, sure dude, but the JS that runs doesn't block the thread. He also says it loads more JS than half the things he builds... and I would be surprised to find out that half the things he builds are as complex as this site.
      I wanna make it through this video but this guy is just not making the points he thinks he is so far.

    • @_sjoe
      @_sjoe 1 month ago +2

      @@nicholasmaniccia1005 you’re repeating exactly what he said at 5:53.
      You missed the entire point. He’s saying that you CAN have lots of JS and still be fast, which is a counterpoint to the common idea that “tons of JS makes things slow.” His point is that McMaster is fast because the JS serves a purpose, and he is directly refuting the claim that their site isn’t heavy on JS.
      I don’t agree with every Theo video, but this was a good one. He did a pretty good job of correcting misconceptions about speed on the web.
      You can use basically any library or framework and make it as fast or slow as you want. It’s about knowing the web platform and how to use the tools.

  • @gr33nDestiny
    @gr33nDestiny 2 months ago

    Thanks for this, it's awesome

  • @90vackoo
    @90vackoo 2 months ago

    Thanks for finally doing this

  • @emilemil1
    @emilemil1 2 months ago +8

    Tbh I don't think it feels that fast, especially for a page that is all text aside from some tiny black and white images. Getting a simple page like that to behave like the NextFaster example isn't that difficult, preloading, caching, and not doing full page reloads will get you most of the way there. The reason most websites are slower is because A. they're loading much more data, and B. their focus is on adding features and developing quickly, not trying to get page loads down to milliseconds.

  • @robwhitaker8534
    @robwhitaker8534 2 months ago +2

    Google's PageSpeed tool has nothing to do with site speed for the user, and everything to do with first page load. Optimizing for first page load and optimizing for general site speed are two different kettles of fish. Google has to assume the user is loading the site for the first time

  • @bluegamer4210
    @bluegamer4210 2 months ago +3

    These videos are always fun to watch but I'd really like it if you were to put chapters in a video.

  • @jpegxguy
    @jpegxguy 29 days ago

    The fast backend is at the core of it. The UI can be good, but I've seen bottlenecks from the backend taking seconds to respond.
    What's more impressive to me is how detailed the selection is. You can find exact minuscule parts and there are dimensions and specifications on everything. That's so cool. My country's local shops do not offer any detail, only generic descriptions. I need dimensions, man

  • @Coldsteak
    @Coldsteak 2 months ago

    2:34 thanks for validating me Theo

  • @Sammysapphira
    @Sammysapphira 2 months ago +2

    Prefetching is something I'm shocked isn't more common. It used to be on lots of websites but then disappeared.

    • @BenFenner
      @BenFenner 2 months ago

      Good. Stop sending me shit I didn't ask for.
      -- All users everywhere.

  • @danglad5546
    @danglad5546 2 months ago

    Super useful video!

  • @vahn_legaia
    @vahn_legaia 15 days ago

    The McMaster devs need to teach the rest of the web how to code and optimize.

  • @BorisBarroso
    @BorisBarroso 2 months ago +1

    SvelteKit can do that if you use the default behavior that loads a link on hover. Prefetching images is cool.

    • @d34d10ck
      @d34d10ck 2 months ago +1

      I don't think it's default behavior. You do have to explicitly set data-sveltekit-preload-data="hover" in either the anchor or body tag, don't you?

    • @BorisBarroso
      @BorisBarroso 2 months ago

      @ OK, newer versions of SvelteKit require this. I haven't generated a new project in some time. Anyway, it is dead simple to make content load on hover.

  • @HamdiRizal
    @HamdiRizal 2 months ago +2

    If your ceo/manager asks you to rank higher on Pagespeed Insights, show them this video.

  • @hqcart1
    @hqcart1 2 months ago +27

    First, the comparison between McMaster and NextFaster is not fair. McMaster actually queries the database on each product, while NextFaster downloads 10MB on the first page. This is not going to work if you have a bigger database.
    McMaster tech:
    1. jQuery
    2. Styled components
    This proves that the slowness problems all these newcomer frameworks set out to fix weren't there in the original tech; bad coding and added dependencies are what we don't need.

    • @t3dotgg
      @t3dotgg  2 months ago +5

      Did you watch the video? McMaster loads more data than NextFaster. Next also queries a database with literally millions of items in it.

    • @hqcart1
      @hqcart1 2 months ago

      @@t3dotgg Even if it does, it's not that simple. What kind of enhancement is on the database? Is it in memory? How big is it? Is it redundant? Knowing that NextFaster is all about speed, I am 100% sure they did some of the hacks to make it look that good, but in the real world, hello darkness my old friend...

    • @t3dotgg
      @t3dotgg  2 months ago +2

      @@hqcart1 Why don’t you take a look? It’s all open source and they’re very transparent about how it works.
      The database is Neon, which is a serverless ready Postgres provider. They provide most of what you’d hire a db admin for (backups, pooling, sharding etc)

    • @MrTonyFromEarth
      @MrTonyFromEarth 2 months ago +3

      People in the comments seriously overestimate how slow database queries are. In reality accessing a database is nothing compared to, say, network latency.

  • @ThePaisan
    @ThePaisan 2 months ago +2

    Wes Bos made a video on the same thing two weeks back, then Codedamn hopped on the same thing, and a dozen others.

  • @Crazyclay78YT
    @Crazyclay78YT 1 month ago

    I've been wondering why the website for our dispensary is so slow, maybe I can look at the code and see

  • @chrisalupului
    @chrisalupului 2 months ago +1

    Appreciate you Theo, thanks for the video! 😄👍
    Does fast mean more opportunities for vulnerabilities or less? Just curious your input on it.

    • @t3dotgg
      @t3dotgg  2 months ago +3

      Fast usually means simple. Simple usually means less surface area. Less surface area usually means less room for exploits. There's no hard rules here, but generally speaking, simpler = better

  • @Ng900-p4x
    @Ng900-p4x 2 months ago +1

    My marketing team needs to know when images were loaded for some reason. I need to set unoptimized on the Next.js Image tag, because when images are optimized by Next.js the URL has some params for getting the optimized image.
    Also, they ask why the image loading feels slow :(

    • @nullvoid3545
      @nullvoid3545 2 months ago +2

      If you assume no malicious actors, then maybe the clients could keep track of page loads and dump them to the server in batches later on?

  • @m12652
    @m12652 2 months ago +11

    2:10 Isn't pre-loading just because someone hovers a bit wasteful? I'd want to see stats on pre-loads to clicks first.

    • @doc8527
      @doc8527 2 months ago +9

      Yes, it will be a waste if you don't click; that's the tradeoff of choosing prefetching. Your traffic and billing can skyrocket if you are not careful. They can afford the prefetch to provide a better UX for their clients.
      Hence, there are lots of times you don't want to prefetch.

    • @m12652
      @m12652 2 months ago +2

      @ A waste is a waste; to the environment it's a big deal. What's the carbon footprint of those trillions of unnecessary preloads combined, I wonder?

    • @tinnick
      @tinnick 2 months ago +5

      To be honest, hovering doesn't exist on mobile devices, which is where the concern about a wasteful network bill is mostly relevant, so I think it's a good trade-off for desktop devices.
      Yeah, yeah. Hover might technically exist on mobile too, but if you disable it there, the trade-off is only on desktop.

    • @tinnick
      @tinnick 2 months ago +1

      @@m12652
      Really 😅.
      Humans are quite wasteful too, if you're going to that length about environmental concerns.
      Should we remove all toilets in the world because it's inconvenient, every time someone takes a dump, to recycle it as manure?
      I don't think so, and I hope humanity is not heading that way.
      I think it would be in humanity's best interest not to sacrifice convenience, but to make up for the things we have been a little wasteful of by other means.

    • @m12652
      @m12652 2 months ago

      @ very educated and totally relevant... you must be one of those people that thinks israel is a country 🙄

  • @rajofearth
    @rajofearth 2 months ago +3

    I used Brave Browser's Leo AI
    0:00 - Introduction to the video and the McMaster website
    1:40 - Analyzing the network requests and performance of the McMaster website
    5:00 - Introducing the "Next Faster" website as a comparison
    7:00 - Analyzing the performance and optimizations of the Next Faster website
    12:00 - Diving into the code of the Next Faster website
    16:00 - Discussing the custom Link component and image prefetching
    20:00 - Comparing the performance of McMaster vs Next Faster with throttling
    23:00 - Discussion of potential improvements to Next.js to incorporate the optimizations used in Next Faster
    26:00 - Conclusion and final thoughts

  • @TheSmashir
    @TheSmashir 1 month ago

    not related, but try using violet shampoo! helped me with my decoloration haha

  • @JLarky
    @JLarky 2 months ago

    Some of those optimizations are already in Next (2 weeks later)

  • @RealOscarMay
    @RealOscarMay 2 months ago

    The website also looks pretty good

  • @shgysk8zer0
    @shgysk8zer0 2 months ago +3

    You just gave me an idea to promote some things I work on because... I write things that are both minimal and fast. I'm sure I could attain that speed, and with lower load size.

    • @shgysk8zer0
      @shgysk8zer0 2 months ago

      Putting it to use on my local server for an 11ty site, I took navigation after initial load down to ~25ms. Mostly it only took 4 lines of setup, but I had to refactor some things to deal with adding event listeners on page load. Added < 6kb to my bundle size, before compression.
      Could probably get it down to like 4ms and even reduce bundle size, while making it easier to maintain, but that'd basically mean a complete rewrite of things.

  • @lld4ae
    @lld4ae 2 months ago +2

    Could you try using a 3G connection with a maximum speed of 16 Mbit/s for the German audience? That is what we have for optimizing websites here.🤮

  • @cherubin7th
    @cherubin7th 2 months ago +2

    The way NextFaster's images flicker in makes me feel bad.

  • @nihardongara3025
    @nihardongara3025 2 months ago +1

    That’s how the old days worked

  • @deatho0ne587
    @deatho0ne587 2 months ago +1

    11:20 I am not a fan of loading tons of data before a user gets to a page. Yes, it is nice for the user experience, but it is not nice for user download rates or company server costs.
    Did see stopLoading if the mouse moves out, which is nice

  • @noext7001
    @noext7001 2 months ago +1

    it's good until you want SEO; Google doesn't see the tag

  • @arbitervildred8999
    @arbitervildred8999 2 months ago

    Vanilla HTML is indeed faster than any framework, and as you said the amount of JS doesn't matter, but what it does matters: routing and all that takes up compute, while a vanilla ref to text you already downloaded is instant. The site is pretty basic too; they don't have to import 20 different components and scripts, then write 20 lines to display an image and some text in a box

  • @zahash1045
    @zahash1045 2 months ago

    I'm sure it feels amazing to use this site on your fiber-optic internet connection

    • @t3dotgg
      @t3dotgg  2 months ago +1

      I’m not on fiber sadly :( I also tried it with a very slow vpn and it still felt great!

  • @aghileslounis
    @aghileslounis 2 months ago +36

    Euhh... is it only me, or?
    You are comparing a personal project with 0 users to McMaster? I'm confused.
    First of all, this Next.js example is completely static; McMaster is NOT. It's fully dynamic, as the server does a bunch of checks on the product's availability and much more.
    Like, if you change something on the server, it's reflected immediately on McMaster. In this Next.js example it will not be; it statically generates the pages.
    The Next.js example is more of a blog. It can NEVER EVER be a marketplace. You'll build 1000000 pages? Rebuild them every X time?...
    It's crazy to think that you can, just like that, build something better, and insult people's intelligence.
    It's NOT faster AT ALL. You're comparing a BLOG to a fully functional huge MARKETPLACE

    • @whydoyouneedmyname_
      @whydoyouneedmyname_ 2 months ago +4

      It's impressive surely, but it even talks about the optimizations it doesn't do compared to the real thing. It's like saying one of those YouTube/Netflix design clones is faster than the real thing

    • @aghileslounis
      @aghileslounis 2 months ago +3

      @@whydoyouneedmyname_ It's not impressive, I'm so sorry. It's just building the pages and prefetching. McMaster is 1000 times more complex than that, to achieve that speed in a real-world situation.
      You could never ever achieve the speed of McMaster in reality using only these techniques; they are not enough, nor realistic for a real marketplace

    • @anonymousfigure37
      @anonymousfigure37 2 months ago

      The Next.js example is not "completely static". Your claim to know about McMaster's backend caching logic is dubious (and provably incorrect; other videos detailing McMaster have covered this) because you don't even seem to know what this fully open source thing is doing even though the code is literally in the video you're commenting on. "x1000 times more complex" is goofy as hell too.

    • @aghileslounis
      @aghileslounis 2 months ago +3

      @@anonymousfigure37 I may have exaggerated, I can concede you that, no problem, but I understand what it's doing and what McMaster is doing. I can tell that it's on completely another level.
      The code in this video was bad in the sense that it can't work on a real marketplace unless you change it to support all the McMaster features, which will make it way slower... even worse: if you keep it like that, it will crash instantly! The site wouldn't even load.

    • @anonymousfigure37
      @anonymousfigure37 2 months ago +1

      @@aghileslounis I think the biggest additional complexity the McMaster site has in terms of functionality is the whole left navigation experience, which certainly complicates the caching story. In fact if you get specific enough with the left nav filters, you start to experience slowdowns because of cache misses.
      I can't think of anything that McMaster is doing that would be difficult to implement performantly in Next.js. I mentioned the left nav interface; what features specifically do you have in mind?

  • @LaserFur
    @LaserFur 2 months ago

    When the McMaster-Carr website was first created it was not fast. Back then it was faster to pull out the huge book than to access the website.

  • @kevin.malone
    @kevin.malone 1 month ago +1

    Bro I'm sure somebody has told you this by now, but I just gotta say it. That hair color is horrible for your complexion. Your skin is a reddish complexion, and the bleached blond just maximizes that redness in you. If you went with a darker color, or even platinum, it would tone down the redness so much. The brassy blond is like the worst possible color.
    At the very least, try adjusting your color cast in your video and tone down the magenta and bump green. But honestly, you just gotta ditch that tone of blond. It's time.

  • @olavisau
    @olavisau 2 months ago +15

    Um... loading a lot of JS is not always fine. At that point it only works quickly on fast connections, which not everybody in the world has. If your target customers are in the US / EU / Australia and other areas where internet bandwidth is fast, then sure, you can send more data to avoid more requests. But if your target customers are in every country, or in Africa / LatAm, then you really have to think about every byte sent to the customer.

  • @ac130kz
    @ac130kz 2 months ago

    A better idea is to preload cached pages with blurhashes and lazily load the images afterwards. It's even faster and uses fewer resources (bandwidth, CPU).
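    A minimal sketch of that pattern in vanilla JS, assuming each image carries a `data-src` attribute pointing at the full-resolution URL while `src` holds the tiny blurred placeholder (the attribute name and the 200px margin are illustrative choices, not a real API):

```javascript
// Sketch of the idea above: show a tiny blurred placeholder immediately and
// swap in the full-resolution image once it nears the viewport.
// The `data-src` convention and 200px margin are assumptions, not a real API.

// Pure helper: promote an element's data-src to its real src (returns new src).
function promoteToFullImage(img) {
  const full = img.dataset && img.dataset.src;
  if (full && img.src !== full) img.src = full;
  return img.src;
}

// Browser-only wiring, guarded so the file also loads outside a browser.
if (typeof IntersectionObserver !== 'undefined' && typeof document !== 'undefined') {
  const observer = new IntersectionObserver((entries) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      promoteToFullImage(entry.target);
      observer.unobserve(entry.target); // each image is upgraded at most once
    }
  }, { rootMargin: '200px' }); // start fetching a bit before the image is visible

  document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
}
```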

    • @saiv46
      @saiv46 2 months ago +2

      You don't need blurhashes with progressive JPEG.

    • @ac130kz
      @ac130kz 2 months ago

      @@saiv46 not all images are jpegs

    • @nullvoid3545
      @nullvoid3545 2 months ago

      @@ac130kz but they can be.

  • @ListenSomething
    @ListenSomething 2 months ago +8

    I'm just really curious: why can't we use an SPA with a RESTful API instead of Next.js, especially if we're going to fetch all the endpoints in advance? I feel like we keep reinventing the same wheel again and again. I remember a website of mine from 2013 that fetched HTML with synchronous Ajax at exactly the same speed, and it certainly wasn't as complicated to build as this Next.js version with its dozens of optimizations.
    IMHO, there are many ways to build a website that loads this fast, and 99% of them are easier than doing it in Next.js.
    Sorry, I just don't understand. Maybe I'm not nerd enough to get the point.

    • @redditrepo473
      @redditrepo473 2 months ago +3

      While you are objectively correct in saying that SPA + REST is superior, the fact is that Next has a significant footprint in the industry and as a result there will be content made around it

    • @t3dotgg
      @t3dotgg  2 months ago +8

      Show me one faster website built in a similar way with simpler tech. If it doesn’t have millions of pages, it doesn’t count.

    • @BCRooke1
      @BCRooke1 2 months ago +4

      And if you anticipate using mobile, having a REST API would be a big win

  • @pixiedev
    @pixiedev 2 months ago +5

    Five years ago I made an SPA website for my college using just Django and vanilla JS, and it was fast as f 😅.
    I wrote a router: the first request downloads the full page, then every click downloads only the changed part of the page, which I attach/replace into the layout (together with the head scripts) without re-rendering the layout itself.
    /first-page (full page with layout)
    /next-page?req=spa (only changed content not full layout)
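    That router pattern can be sketched in a few lines of vanilla JS; the `?req=spa` query and the `#content` container follow the comment's own convention and are not a standard:

```javascript
// Minimal version of the router described above: intercept same-origin link
// clicks, fetch only the changed fragment, and swap it into the layout.
// `?req=spa` and `#content` are this comment's conventions, not a standard.

// Pure helper: turn a page path into its fragment-only variant.
function toFragmentUrl(href) {
  const url = new URL(href, 'https://example.com'); // base only matters for relative hrefs
  url.searchParams.set('req', 'spa');
  return url.pathname + url.search;
}

if (typeof document !== 'undefined') {
  document.addEventListener('click', async (event) => {
    const link = event.target.closest && event.target.closest('a[href^="/"]');
    if (!link) return;
    event.preventDefault();
    const href = link.getAttribute('href');
    const html = await (await fetch(toFragmentUrl(href))).text();
    document.querySelector('#content').innerHTML = html; // replace only the page body
    history.pushState({}, '', href); // keep the URL bar in sync
  });
}
```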

  • @shadmansudipto7287
    @shadmansudipto7287 2 months ago +1

    2:34 No. This is .NET Framework, which isn't very fast. The newer .NET Core is fast.

  • @amanx13
    @amanx13 2 months ago

    The Rollercoaster Tycoon of HTML

  • @kira_io
    @kira_io 2 months ago

    What about the desktop performance in Lighthouse, though?

  • @FusionHyperion
    @FusionHyperion 2 months ago +1

    the problem is always between the chair and the screen

  • @AashutoshRathi
    @AashutoshRathi 2 months ago

    "instant-fuckin-taneously"

  • @KlimYadrintsev
    @KlimYadrintsev 2 months ago +2

    Can someone please explain: if it's 1 million products, that means 1 million photos, which with Vercel image optimization costs around $5,000. Which of these enthusiasts paid that much?
    The only reason I don't use Vercel's image optimization is that my side project makes no money, and it's not worth spending $5 per 1,000 images.

    • @Pandazaar
      @Pandazaar 2 months ago +3

      You do understand that if a legit shop has a million products, it's probably way too profitable to worry about $5k.

    • @KlimYadrintsev
      @KlimYadrintsev 2 months ago +1

      @ The average profit margin in e-commerce is around 30%.
      $5k is not a small amount.

    • @UnknownPerson-wg1hw
      @UnknownPerson-wg1hw 2 months ago +1

      ​@@KlimYadrintsev uh.. yes it is

  • @thelethalmoo
    @thelethalmoo 2 months ago

    I wish they could do a fast version of jiiiraaa

  • @nightshade427
    @nightshade427 2 months ago +3

    Does NextFaster pregenerate all the product pages and everything? I wonder how long that build takes. I don't think it's a fair comparison with the original site, since I don't think they pregenerate all of their product pages.

  • @Pikachu-oo5ro
    @Pikachu-oo5ro 2 months ago

    On mobile, I assume they use some kind of intersection observer?
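    One plausible version of viewport-based link prefetching, as a generic sketch (whether the real site does exactly this is not confirmed here; the `as: 'document'` hint and the dedup set are illustrative choices):

```javascript
// Watch links with an IntersectionObserver and prefetch each page once its
// link scrolls into view. Generic sketch, not any particular site's code.

// Pure helper: describe the <link rel="prefetch"> tag for a given URL.
function prefetchDescriptor(href) {
  return { rel: 'prefetch', href, as: 'document' };
}

if (typeof IntersectionObserver !== 'undefined' && typeof document !== 'undefined') {
  const seen = new Set(); // prefetch each URL at most once
  const observer = new IntersectionObserver((entries) => {
    for (const entry of entries) {
      const href = entry.target.href;
      if (!entry.isIntersecting || seen.has(href)) continue;
      seen.add(href);
      const link = document.createElement('link');
      Object.assign(link, prefetchDescriptor(href));
      document.head.appendChild(link); // browser fetches it at low priority when idle
    }
  });
  document.querySelectorAll('a[href]').forEach((a) => observer.observe(a));
}
```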

  • @SaurabhGuptaYT
    @SaurabhGuptaYT 2 months ago +2

    On its face, prefetching looks good, but in most cases the number of avoidable requests and the extra load it puts on the server isn't worth it, unless it's just serving static data that can be cached easily. If any DB requests or computation happen in the backend for those requests, it's just a waste of resources.
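    One common mitigation for that cost is to make prefetchable pages cacheable at the CDN edge, so speculative requests rarely reach the origin or the database. A sketch with a Node-style handler; the header values and `productHandler` are illustrative, not any site's actual code:

```javascript
// Serve product pages with long edge caching so bursts of speculative
// prefetches are absorbed by the CDN instead of hitting the origin/DB.
// Header values and the handler shape are illustrative assumptions.

// Pure helper: the Cache-Control value a cacheable product page would send.
function cdnCacheHeader(sMaxAge, staleWhileRevalidate) {
  return `public, s-maxage=${sMaxAge}, stale-while-revalidate=${staleWhileRevalidate}`;
}

// Node-style request handler sketch (server deliberately not started here).
function productHandler(req, res) {
  // Cache at the edge for an hour; serve stale while regenerating in the
  // background, so prefetch traffic almost never triggers backend work.
  res.setHeader('Cache-Control', cdnCacheHeader(3600, 600));
  res.end(`<h1>product for ${req.url}</h1>`); // stand-in for real rendering
}
```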

    • @whentheyD
      @whentheyD 2 months ago +1

      Most websites that don't have UGC would benefit from this. Correct me if I'm wrong.

    • @hoangnguyenvuhuy5535
      @hoangnguyenvuhuy5535 2 months ago

      I think their website is niche enough to avoid the high cost. It would matter for a new website or a major marketplace like Amazon, but industrial goods? I can't imagine there are that many users in the first place.

  • @delavanty
    @delavanty 20 days ago

    If only Amazon could use something similar... let alone the random crashes you get in their app lol

  • @cnikolov
    @cnikolov 2 months ago +1

    I think NextFaster missed compressing its images with Brotli.

  • @Miguelmigs24
    @Miguelmigs24 2 months ago +1

    Couldn't help noticing you're using a terminal called "Ghostty". What is that?

  • @salmenbejaoui1696
    @salmenbejaoui1696 2 months ago

    How much load can prefetching all links generate on the server? And what about server and bandwidth costs?

  • @michaelnjensen
    @michaelnjensen 2 days ago

    While this (NextFaster) is fast, it's also missing like 80% of what a real website needs to do/support. Just serving cached listing/product pages with no other functionality is easy in most frameworks and would be equally fast.

  • @YevheniiViazovskyi
    @YevheniiViazovskyi 2 months ago

    I'm kinda scared of that request bombardment
    I clicked like 5 links and got 1.5k requests

  • @rikschaaf
    @rikschaaf 2 months ago

    I'm a fan of prefetching on hover/mouse-down and navigating on mouse-up. It's not infrequent that I change my mind mid-click, so I turn the click into a drag to cancel the navigation. It's fine with me that the page gets prefetched in that case, as long as I can still cancel the navigation as described. Whether you prefetch on hover or on mouse-down depends on the use case.
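    That interaction can be sketched like this; the 8px drag threshold is an illustrative assumption, and prefetching here is just a plain `fetch` to warm the HTTP cache:

```javascript
// Prefetch on mouse-down, navigate on mouse-up, and let a drag past a small
// threshold cancel the navigation (the prefetch still happens, which is fine:
// it only warms the cache). The 8px threshold is an assumption.

// Pure helper: did the pointer move far enough between down and up to count as a drag?
function isDrag(downX, downY, upX, upY, threshold = 8) {
  return Math.hypot(upX - downX, upY - downY) > threshold;
}

if (typeof document !== 'undefined') {
  let pending = null;
  document.addEventListener('mousedown', (e) => {
    const link = e.target.closest && e.target.closest('a[href]');
    if (!link) return;
    pending = { x: e.clientX, y: e.clientY, href: link.href };
    fetch(link.href).catch(() => {}); // start warming the cache immediately
  });
  document.addEventListener('mouseup', (e) => {
    if (!pending) return;
    if (!isDrag(pending.x, pending.y, e.clientX, e.clientY)) {
      location.assign(pending.href); // navigate; the response is likely cached by now
    }
    pending = null; // a drag falls through here, cancelling the navigation
  });
}
```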

  • @deepanyai
    @deepanyai 1 month ago

    The website is still slow for me idk..

  • @alehkhantsevich113
    @alehkhantsevich113 2 months ago +3

    From Europe, NextFaster doesn't feel fast. I'd say McMaster-Carr feels much faster from here.

  • @peroconino
    @peroconino 2 months ago

    So why did they do all of that? Wouldn't it be better to just use Next.js's built-in prefetch?

  • @jeffhappens1
    @jeffhappens1 2 months ago +1

    How do we measure the speed?

  • @Gecho_Agency
    @Gecho_Agency 2 months ago

    Cheers Wes Bos

  • @dzlfiqar
    @dzlfiqar 2 months ago

    What browser are you using?

  • @saiphaneeshk.h.5482
    @saiphaneeshk.h.5482 2 months ago +1

    Will it be as fast as it is now if the 2-hour cache invalidation is removed?
    Or is that playing a major role in the time reduction?
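    For context, a 2-hour time-based cache in Next.js (App Router) is ordinary route-segment config. This is a generic ISR sketch, not necessarily NextFaster's exact code, and `getProduct` is a hypothetical loader:

```javascript
// Generic Next.js ISR sketch (an assumption about the mechanism, not
// NextFaster's actual code). With `revalidate`, the cached page is always
// served instantly; the timeout only bounds how stale the data can get,
// because regeneration happens in the background at most once per window.
// So removing the 2h window would mostly affect freshness, while disabling
// caching entirely is what would hurt speed.
export const revalidate = 7200; // seconds: regenerate at most every 2 hours

export default async function ProductPage({ params }) {
  const product = await getProduct(params.id); // `getProduct` is a hypothetical data loader
  return product.name; // a real page would return JSX here
}
```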

    • @nullvoid3545
      @nullvoid3545 2 months ago

      I'm not very familiar with JS, and I don't know if he showed this in the video, but I wonder what exactly this 2-hour cache invalidation timeout affects.
      If things like stock and price can't update on every load, or even live, then I get the reasons for suspecting the cache misrepresents the comparison, but I lack the skills to check without outpacing my own interest.
      But images only updating every 2 hours?
      Sure, why not?

  • @workworksam
    @workworksam 2 months ago

    Alright, but how do you deal with the huge weight of Google Analytics, tag managers, and Facebook Pixels????