Interestingly enough, the edge McMaster has is not that their website is so insanely fast, it's that everything you order will be delivered to your company in a couple of hours. So if you think the page loading is fast, check out their delivery, lol
Better than amazon!
Their other edge is that they have every conceivable product. They're an all-around premium-quality service with high prices to match. When you need something specific, fast, to exact specifications and perfect every time, you use this company. When price matters more, you try your luck on Ali.
@thewhitefalcon8539 I'll say they have a massive selection, but I often do not find what I am looking for there.
Couple-hour delivery is quite slow for Russia. Delivery here is usually 15 to 30 minutes
@@Ginto_O You clearly don't live in a rural area of Russia. McMaster delivers ANYWHERE in the Continental US that fast.
Great example to show that choosing the right technology will not automatically make the website fast. You have to write good code.
good code in bad tech is often faster than bad code in good tech
@@Z4KIUS Trivial performance gains like this rarely matter to begin with. Spend your time addressing issues that cost real money or adding features that make it.
Chasing tiny page load speeds is just mindless busywork.
@@JohnSmith-op7ls good feature beats minuscule speed improvements, but big speed regressions at some point beat any features
@ But this isn’t about addressing relevant performance issues, it’s about pointlessly squeezing out a bit more, in a contrived demo, just for the sake of it.
why not make these features part of the framework instead?
Worked at McMaster for a few years. This kind of glosses over how we're also able to perfectly sort, filter, and serve up data on over half a million different part numbers. There's a looooot of stuff going on in the backend for this
It’s very very impressive stuff, especially for how long it’s existed and worked. I wish more of that info was public so I could have talked in depth about it 🙃
I like how theo thinks McMaster's competitive edge is their website and not that they knock on your door with your parts 3 minutes after you complete the order. 😄
I live right next to a warehouse so for me it's more like 1 minute 😂
The craziest shit with McMaster-Carr though... is it's even fast for Australians. And we can't even buy shit from them without shenanigans
No one cares about Australia, it's irrelevant to world affairs. Shoo.
4-6.. second.. load times on every page..
@@Coppertine_ That's fast for Oz isn't it?
@@kaldogorath not really.. compared to the next version, it's instant speeds
I've used this as my go-to pat response to "can you give me an example of good web design/UX/UI" in interviews for years, is great that it's getting attention now 🎉
McMaster-Carr, shouldering the weight of America's industrial might since 1901.
So one of the things you seemed to miss was that this was a classic ASP.NET 4.5 website. So the tech for this is about 15 years old. All that JavaScript at 4:45 is auto-generated. The back page for this is much simpler.
13:45 prefetching is great! When I started experimenting with HTMX, I immediately turned that on there as well (it supports both on mouse down and on hover, depending on your preferences). Great to see that Next.js also supports it.
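For anyone curious what this looks like outside a framework, here's a minimal vanilla-TypeScript sketch of hover/mousedown prefetching. It's not HTMX's or Next.js's actual implementation, just the general idea; the link selector and the 65ms intent delay are arbitrary assumptions.

```ts
// Minimal hover/mousedown prefetch sketch (framework-agnostic).
const prefetched = new Set<string>();

function prefetch(url: string): void {
  if (prefetched.has(url)) return;
  prefetched.add(url);
  const link = document.createElement("link");
  link.rel = "prefetch"; // hint the browser to fetch the page into its cache
  link.href = url;
  document.head.appendChild(link);
}

// Event delegation, so links added to the DOM later are covered too.
document.addEventListener("mouseover", (e) => {
  const a = (e.target as Element).closest<HTMLAnchorElement>("a[href^='/']");
  if (a) setTimeout(() => prefetch(a.href), 65); // small delay to skip accidental hovers
});

document.addEventListener("mousedown", (e) => {
  const a = (e.target as Element).closest<HTMLAnchorElement>("a[href^='/']");
  if (a) prefetch(a.href); // mousedown fires well before the click completes
});
```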
But is it really a good idea to download all the product pages when the user scrolls? Can you imagine the server cost of doing that on a production site with a lot of visitors?
As an engineer, McMaster is the greatest website known to man
McMaster-Carr Speedrun (100%, glitchless)
jquery baby, oh yaah, yo heard me right
This is an interesting intersection between web development and ux. The site has amazing ux and software engineering.
The real magic: accept 2h delay for every change and you can cache *everything* for 2h.
the realer magic: 300ms delay for every change and caching things only after they're requested
@@PraiseYeezus realest magic: cache everything in the browser IndexedDB, and store a hash, so when the hash sent from the server to the client is different, the client downloads everything over again
@@PraiseYeezus is this actually how McMaster works??
@@xiaoluwang7367 no that's how Vercel's infra works
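A rough sketch of that hash-check idea from a couple of comments up (to be clear, not how McMaster or Vercel actually works). The /catalog-hash and /catalog.json endpoints are hypothetical, and it uses the small idb-keyval wrapper to keep the IndexedDB plumbing short.

```ts
import { get, set } from "idb-keyval"; // tiny key-value wrapper over IndexedDB

// Hypothetical endpoints: /catalog-hash returns a version hash,
// /catalog.json returns the full cacheable payload.
async function loadCatalog(): Promise<unknown> {
  const serverHash = await (await fetch("/catalog-hash")).text();
  const cachedHash = await get<string>("catalog-hash");

  if (cachedHash === serverHash) {
    // Nothing changed on the server: serve straight from IndexedDB.
    return get("catalog-data");
  }

  // Hash differs: re-download everything and refresh the local copy.
  const data = await (await fetch("/catalog.json")).json();
  await set("catalog-data", data);
  await set("catalog-hash", serverHash);
  return data;
}
```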
What about server/CDN and network costs for this amount of prefetch? And how does it work on mobile clients, where there's no hover event?
There's no free lunch. The original project does the same. You can choose not to preload the images if you're worried about that, only the HTML content.
I'm gonna tell you for my company the price of traffic is easily covered by improved user experience.
Also, on mobile you can track the viewport and prefetch when an item has been visible for a certain amount of time, or use some other metric; you'd need to research it for a particular use case, or don't prefetch images and only prefetch the HTML for everything.
Trade-offs are always there.
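For the mobile case described above, a minimal sketch of viewport-based prefetching with IntersectionObserver. The 200ms dwell time, the /products/ link pattern, and the prefetch helper are assumptions for illustration, not the project's actual code.

```ts
// Prefetch a product link once it has stayed visible in the viewport for ~200ms.
const dwellTimers = new Map<Element, number>();

const observer = new IntersectionObserver((entries) => {
  for (const entry of entries) {
    const a = entry.target as HTMLAnchorElement;
    if (entry.isIntersecting) {
      dwellTimers.set(a, window.setTimeout(() => prefetch(a.href), 200));
    } else {
      const timer = dwellTimers.get(a);
      if (timer !== undefined) { clearTimeout(timer); dwellTimers.delete(a); }
    }
  }
});

document
  .querySelectorAll<HTMLAnchorElement>("a[href^='/products/']")
  .forEach((a) => observer.observe(a));

function prefetch(url: string): void {
  const link = document.createElement("link");
  link.rel = "prefetch";
  link.href = url;
  document.head.appendChild(link);
}
```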
It's an eCommerce site, so network and bandwidth costs are very very low compared to the revenue generated from sales. However, load speed is crucial. I've seen a 30% drop in CTR/visitors when my website's page load time is slow.
@@ibnurasikh why tf are people so impatient
@@chri-k We may not fully understand why, but it’s a fact: an average discrepancy of 30% is common in tracking data. Personally, I’ve experienced decreases as high as 50%. A discrepancy of 4-10% is generally acceptable because some users may accidentally click the link without genuine intent.
Now, imagine 10,000 users click your link, but analytics only registers 7,000 page views. That’s a huge gap! Every click has a cost. For example, if your cost-per-click (CPC) is $0.50, losing 3,000 clicks means losing $1,500. What a mess!
We all did things like this back in the '90s/'00s and it worked like a charm, no frameworks, no jQuery
which site did you build that performs this well?
@@PraiseYeezus Brazilian Channel 5 (Rede Globo) covering the Olympics in 2004 is a good example; we had super tight requirements on the size of the CSS and imagery.
Basically, back in the day, at the end of the '90s and beginning of the '00s, you had to make websites that performed well because broadband wasn't so widespread, especially in South America.
So it was expected that designers would know how to compress images and videos to the maximum amount of compression possible.
Often, internet banners had incredibly low limits in size, so everyone back in the day would squeeze as many KB as possible of every single file.
Nowadays, a lot of "designers" and "developers" will put images online without even trying to compress or make them the correct size before slapping them online.
@@PraiseYeezus for some reason my comment keeps being deleted, so I'll rewrite it briefly. I wrote the main Brazilian website (by our main TV channel, which was "the official channel") covering the Athens Olympics in the early '00s, and many other websites. At the time broadband wasn't so popular, and everyone on the team, designers, developers, and project managers, was well aware of file sizes and compression types; most projects had strict rules for file sizes and page load times. The XMLHttpRequest API was standard, and so was having different if-conditions for different popular browsers; jQuery wasn't there yet.
No jQuery before HTML5 and ES6 sounds like an awfully bad decision
Love how at 3:57 he goes off topic to compliment how pretty the nuts on this website are.
95% React or any other JS framework developers can't build a website as fast as McMaster site.
I love how his take is "this loads a ton of JS"... it's like, sure dude, but the JS that runs doesn't block the thread. He also says it loads more JS than half the things he builds... and I would be surprised to find out if half the things he builds are as complex as this site.
I wanna make it through this video but this guy is just not making the points he thinks he is so far.
@@nicholasmaniccia1005 you’re repeating exactly what he said at 5:53.
You missed the entire point. He’s saying that you CAN have lots of JS and still be fast, which is a counterpoint to the common idea that “tons of JS makes things slow.” His point is that McMaster is fast because the JS serves a purpose, and he is directly refuting the claim that their site isn’t heavy on JS.
I don’t agree with every Theo video, but this was a good one. He did a pretty good job of correcting misconceptions about speed on the web.
You can use basically any library or framework and make it as fast or slow as you want. It’s about knowing the web platform and how to use the tools.
3:57 >>that's pretty nuts
Yes, those are pretty nuts. And bolts
except for image flickering, pretty smooth UX
this is really good to implement in an e-commerce website... it makes shopping online really really fast.
I noticed a small but significant tweak that probably helps a lot: B&W images... they probably save a lot through compression, on top of the fact that the images here are all small. The result: the browser is done loading and rendering the images quicker
Used to work in a machine shop and would pick up hardware from one of their warehouses regularly. Great customer service and hardware, great company.
25:32 Good sir, everything here is magical if you think back to the days of vanilla and jQuery, but I get your point.
the same thing is implemented in SoundCloud: when you hover over a track it loads the buffer, and once you click, the loaded buffer starts playing
The depths that you go to is honestly, unreal. I can only imagine what it takes to put these videos out. Kudos to you, my man!
Sid, I agree, but he conveniently missed a few key points as he is a React/Next shill.
Few key points Theo is missing:
1) The real site is most likely making db calls, caching (think Redis/Memcached), etc. on the backend, whereas this "fake" site is most likely only mocking data.
2) Theo conveniently missed pointing out the pricing. At scale, Vercel will suck all the money out of your pocket, whereas for the real site they'd likely just need to spin up a new EC2.
@mohitkumar-jv2bx as I mentioned in my reply above, the only one "conveniently missing" points here is you.
1) All of these examples use real databases. The DB for NextFaster has millions of entries. It's a fair comparison.
2) I've seen the bill for this project. Hosting costs are under $20 a month despite the absolutely insane amounts of traffic.
@@t3dotgg just $20 for this many requests 💀. I mean, I get it, most of these are just favicon/really small requests which don't take a lot of bandwidth, but the amount of requests a single user generates on this site is just absurd. So that low price is indeed shocking.
As a purchase manager that orders from McMaster CONSTANTLY, it's wild to me every time their website gets talked about. Worlds colliding or something lol
Man... I absolutely love your honesty when doing ads! Seriously!
Looks like they are using good ol' ASP.NET Web Parts technology, which is considered dead nowadays.
2:10 isn't pre-loading just because someone hovers a bit wasteful? I'd want to see stats on pre-loads to clicks first.
yes, it will be a waste if you don't click, that's the tradeoff of choosing pre-fetching. Your traffic and billing can skyrocket if you are not being careful. They can afford the prefetch to provide a better UX for their clients.
Hence, there are lots of times you don't want to prefetch.
@ a waste is a waste, to the environment it's a big deal. What's the carbon footprint of those trillions of unnecessary preloads combined, I wonder?
To be honest, hovering doesn't exist on mobile devices, which is where the concern about wasteful requests and network bills is mostly relevant, so I think it's a good trade-off for desktop devices.
Yeah, yeah. Hover might technically exist on mobile too, but if you disable it the trade off is only on desktop.
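If you want to make sure the trade-off really is desktop-only, a small sketch of gating hover prefetch behind a hover-capability media query; the enableHoverPrefetch wiring below is just an assumed example, not any site's actual code.

```ts
// Only enable hover prefetching on devices with a real hover-capable pointer,
// so touch-only phones never pay for speculative requests they can't trigger anyway.
const canHover = window.matchMedia("(hover: hover) and (pointer: fine)").matches;

if (canHover) {
  enableHoverPrefetch();
}

function enableHoverPrefetch(): void {
  document.addEventListener("mouseover", (e) => {
    const a = (e.target as Element).closest<HTMLAnchorElement>("a[href^='/']");
    if (!a) return;
    const link = document.createElement("link");
    link.rel = "prefetch";
    link.href = a.href;
    document.head.appendChild(link);
  });
}
```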
@@m12652
Really 😅.
Humans are quite wasteful too if you’re going to that length about environment concerns.
Should we remove all toilets in the world because it's inconvenient, every time someone takes a dump, to recycle it as manure?
I don’t think so, and I hope humanity is not heading that way.
I think it would be best, in humanity's interest, not to sacrifice convenience but to make up for it by other means for the things we've been a little wasteful with.
@ very educated and totally relevant... you must be one of those people that thinks israel is a country 🙄
I have learned a lot from this video, more videos like this would be awesome
Too!
Great breakdown Theo!!
Tbh I don't think it feels that fast, especially for a page that is all text aside from some tiny black and white images. Getting a simple page like that to behave like the NextFaster example isn't that difficult, preloading, caching, and not doing full page reloads will get you most of the way there. The reason most websites are slower is because A. they're loading much more data, and B. their focus is on adding features and developing quickly, not trying to get page loads down to milliseconds.
The images on McMaster are actually sprites
Great point. With sprites you are fetching far fewer images and then just using offsets.
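A tiny sketch of the sprite idea: one sheet is fetched once, and every thumbnail is just an offset into it. The sprite.png path and the 32px cell size are hypothetical.

```tsx
// One sprite sheet, many thumbnails: each cell is addressed by a background offset.
function SpriteIcon({ col, row }: { col: number; row: number }) {
  return (
    <span
      style={{
        display: "inline-block",
        width: 32,
        height: 32,
        backgroundImage: "url(/sprite.png)", // hypothetical sheet of 32x32 thumbnails
        backgroundPosition: `${-32 * col}px ${-32 * row}px`,
      }}
    />
  );
}

// Two different thumbnails, zero extra image requests.
export const Example = () => (
  <>
    <SpriteIcon col={0} row={0} />
    <SpriteIcon col={3} row={1} />
  </>
);
```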
im typically a javascript hater coming from a backend history, but THIS is what js was meant to do, and people need to see and respect this more
This was a good video, learnt a lot, thanks!
I'm really interested to hear why you're coming around to mousedown for interactions. I'm still in the mouseup camp but I haven't dug into it much and would love to hear what the arguments are! Future video?
2:34 thanks for validating me Theo
I wonder how this project would perform on a self-hosted environment. We all know Vercel does a bunch of special optimizations for Next hosted in their cloud. I'm guessing it will still run pretty fast, but some of these optimizations will not work out of the box or not work at all
This is awesome
i used Brave Browser's Leo AI
0:00 - Introduction to the video and the McMaster website
1:40 - Analyzing the network requests and performance of the McMaster website
5:00 - Introducing the "Next Faster" website as a comparison
7:00 - Analyzing the performance and optimizations of the Next Faster website
12:00 - Diving into the code of the Next Faster website
16:00 - Discussing the custom Link component and image prefetching
20:00 - Comparing the performance of McMaster vs Next Faster with throttling
23:00 - Discussion of potential improvements to Next.js to incorporate the optimizations used in Next Faster
26:00 - Conclusion and final thoughts
Thanks for finally doing this
Thanks for this, its Awesome
These videos are always fun to watch but I'd really like it if you were to put chapters in a video.
what about the desktop performance on lighthouse though?
Google's PageSpeed tool has nothing to do with site speed for the user, and everything to do with first page load. Optimizing for first page load and optimizing for general site speed are two different kettles of fish. Google has to assume the user is loading the site for the first time
How much load can prefetching all links generate on the server? What about server and bandwidth costs?
Will it be as fast as it is now if the 2-hour cache invalidation is removed?
Or is it playing a major role in time reduction?
I'm not very familiar with JS, and I don't know if he showed this in the video, but I wonder what exactly this 2-hour cache invalidation timeout affects.
If things like stock and price can't update on every load, or even update live, then I get the reasons for suspecting the cache is misrepresenting the comparison, but I lack the immediate skills to check without outpacing my own interest.
But like, images only updating every 2 hours.
Sure, why not?
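For context, the mechanism being discussed is Next.js time-based revalidation (ISR): a cached page keeps being served until the window expires, then gets regenerated in the background, so prices and stock can be up to that many hours stale. A minimal sketch with a 2-hour window; the file path and the getProduct helper are placeholders, not NextFaster's actual code.

```tsx
// app/products/[slug]/page.tsx (App Router sketch, not the real project's file)
export const revalidate = 7200; // seconds: serve the cached page, regenerate after 2h

export default async function ProductPage({ params }: { params: { slug: string } }) {
  // Data fetched during render is cached along with the generated page.
  const product = await getProduct(params.slug);
  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.price}</p>
    </main>
  );
}

// Hypothetical data helper, just to keep the sketch self-contained.
async function getProduct(slug: string) {
  return { name: slug, price: "$1.23" };
}
```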
If your ceo/manager asks you to rank higher on Pagespeed Insights, show them this video.
Can Theo just appreciate a good website without dunking on it and shilling NextJS? He doesn't need to be so defensive all the time.
Super useful video!
Prefetching is something I'm shocked isn't more common. It used to be on lots of websites but then disappeared.
Good. Stop sending me shit I didn't ask for.
-- All users everywhere.
Does nextMaster pregenerate all product pages and everything? I wonder how long that takes to build. I don't think it's a fair comparison to the original site, since I don't think they are pregenerating all product pages.
What font are you using in vs code?
what is the browser that you are using?
The way nextfaster's images flicker in makes me feel bad.
So, why did they do all of that? Wouldn't it be better to just use Next.js's built-in prefetch?
My marketing team needs to know when images were loaded for some reason. I need to set unoptimized on the Next Image tag, because when images are optimized by Next.js the URL has some params for getting the optimized image.
Also, they ask why the image loading feels slow :(
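For reference, a hedged sketch of the trade-off being described: instead of opting out with the unoptimized prop (which keeps the original URL but loses resizing and modern formats), you can keep optimization and report load timing from the client. The reportImageLoaded helper and the analytics endpoint are hypothetical.

```tsx
"use client";

import Image from "next/image";

// Hypothetical analytics helper, standing in for whatever the marketing team uses.
function reportImageLoaded(src: string) {
  navigator.sendBeacon("/analytics/image-loaded", src);
}

export function ProductImage({ src, alt }: { src: string; alt: string }) {
  return (
    <Image
      src={src}
      alt={alt}
      width={300}
      height={300}
      // Keep Next.js optimization (resizing, modern formats) and still learn
      // when the image finished loading, instead of setting `unoptimized`.
      onLoad={() => reportImageLoaded(src)}
    />
  );
}
```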
If you assume no malicious actors, then maybe the clients could keep track of page loads and dump them to the server in batches later on?
ive been wondering why the website for our dispensary is so slow, maybe i can look at the code and see
On mobile I assume they do some kind of intersection observer?
11:20 I am not a fan of loading tons of data before a user gets to a page. Yes, it is nice for user experience, but it is not nice for user download rates or company server rates.
Did see stopLoading if the mouse moves out, which is nice
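That stop-loading-on-mouse-out behavior is easy to sketch with an AbortController; this is a generic illustration, not the project's exact code.

```ts
// Start prefetching on hover, abort if the cursor leaves before it finishes.
const controllers = new Map<HTMLAnchorElement, AbortController>();

document.addEventListener("mouseover", (e) => {
  const a = (e.target as Element).closest<HTMLAnchorElement>("a[href^='/']");
  if (!a || controllers.has(a)) return;
  const controller = new AbortController();
  controllers.set(a, controller);
  fetch(a.href, { signal: controller.signal })
    .catch(() => {}) // aborted fetches reject; ignore them
    .finally(() => controllers.delete(a));
});

document.addEventListener("mouseout", (e) => {
  const a = (e.target as Element).closest<HTMLAnchorElement>("a[href^='/']");
  const controller = a ? controllers.get(a) : undefined;
  if (controller) controller.abort(); // stop loading once the mouse moves away
});
```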
First, the comparison between McMaster and NextFaster is not fair: McMaster actually queries the database on each product, while NextFaster downloads 10MB on the first page. This is not going to work if you have a bigger database.
McMaster Tech:
1. Jquery
2. Styled Component
This proves that the slowness problems all these newcomer frameworks want to fix weren't there originally; bad coding and piling on dependencies are what we don't need.
Did you watch the video? McMaster loads more data than NextFaster. Next also queries a database with literally millions of items in it.
@@t3dotgg even if it does, it's not that simple. What kind of enhancements are on the database? Is it in memory? How big is it? Is it redundant? Knowing that NextFaster is all about speed, I'm 100% sure they did some hacks to make it look that good, but in the real world, hello darkness my old friend...
@@hqcart1 Why don’t you take a look? It’s all open source and they’re very transparent about how it works.
The database is Neon, which is a serverless ready Postgres provider. They provide most of what you’d hire a db admin for (backups, pooling, sharding etc)
People in the comments seriously overestimate how slow database queries are. In reality accessing a database is nothing compared to, say, network latency.
All of this prefetching, is it intensive on a server (assuming a live production environment)?
Seems like it would be no?
a business doesn't care if they can deliver fast products
Wes Bos made a video on the same thing two weeks back, then Codedamn hopped on the same thing, and a dozen others.
The website also looks pretty good
That’s how the old days worked
Sveltekit can do that if you use the default behavior that loads a link on hover. Prefetching images is cool.
I don't think it's default behavior. You do have to explicitly set data-sveltekit-preload-data="hover" in either the anchor or body tag , don't you?
@ ok, newer versions of SvelteKit require this. I haven't generated a new project in some time. Anyway, it's dead simple to make it load content on hover.
what is the font that is being used here.
Some of those optimizations are already in Next (2 weeks later)
2:34 no. This is .NET Framework, which isn't very fast. The newer .NET Core is fast.
Can someone please explain to me: if it is 1 million products, that means 1 million photos, which, if you are using Vercel image optimisation, is around 5,000 dollars. Who among these enthusiasts paid that much?
The only reason I don't use Vercel Image is because my side project makes no money and it's not worth spending 5 dollars per 1,000 images
You do understand that if a legit shop has a million products, it's probably way too profitable to bother about $5k
@ the average profit margin in e-commerce is 30%
5k is not a small amount
@@KlimYadrintsev uh.. yes it is
You just gave me an idea to promote some things I work on because... I write things that are both minimal and fast. I'm sure I could attain that speed, and with lower load size.
Putting it to use on my local server for an 11ty site I took navigating after initial load down to ~25ms. Mostly only took 4 lines for setup, but I had to refactor some things to deal with adding event listeners on page load. Added < 6kb to my bundle size, before compression.
Could probably get it down to like 4ms and even reduce bundle size, while making it easier to maintain, but that'd basically mean a complete rewrite of things.
How do we measure the speed?
I use a stopwatch, personally
Appreciate you Theo, thanks for the video! 😄👍
Does fast mean more opportunities for vulnerabilities or less? Just curious your input on it.
Fast usually means simple. Simple usually means less surface area. Less surface area usually means less room for exploits. There's no hard rules here, but generally speaking, simpler = better
The Rollercoaster Tycoon of HTML
Couldn't help noticing you're using a terminal called "Ghostty", what is that?
Sponsor? I feel Vercel(Next.js) is a long term sponsor of the channel.
I’m sure it feels amazing to use this site on your optic fiber internet connection
I’m not on fiber sadly :( I also tried it with a very slow vpn and it still felt great!
I am just really curious: why can't we just use an SPA version with a RESTful API instead of Next.js, especially if we're going to fetch all the endpoints in advance? I feel like we always reinvent the same wheel again and again. I remember my website which was fetching HTML with sync ajax in 2013 at exactly the same speed. Surely it wasn't as complicated to build as the Next.js version with its dozens of optimizations.
IMHO, there are many ways to build a website which can load faster. Surely 99% of them are easier than implementing it in Next.js.
Sorry, I just don't understand. Maybe, I am not nerd enough to get the point.
While you are objectively correct in saying that SPA + REST is superior, the fact is that Next has a significant footprint in the industry and as a result there will be content made around it
Show me one faster website built in a similar way with simpler tech. If it doesn’t have millions of pages, it doesn’t count.
And if you anticipate using mobile, having a REST API would be a big win
Few key points Theo is missing:
1) Real site most likely is making db calls, cache(think redis/memcached) etc on the backend. whereas this “fake” site most likely is only mocking data.
2) Theo conveniently missed pointing out the pricing. At scale Vercel will suck all the money out of your pocket. Whereas for the real site, they'd likely just need to spin up a new EC2.
1) All of these examples use real databases. The DB has millions of entries. It's a fair comparison.
2) I've seen the bill for this project. Hosting costs are under $20 a month despite the absolutely insane amounts of traffic
hey blind, at 12:40 Theo shows the Next one uses db calls
@@t3dotgg I mean realistically the site would have had that much traffic for a few days only no?
not related but try using violet shampoo ! helped me for my decoloration haha
It's very sad and telling about our industry that none of these videos even give a passing mention of the ethics of sending a metric fuckton of bullshit to the user that they haven't asked for. Let alone the issues for those with metered connections.
Euhh...is it only me or?
You are comparing a personal project with 0 users to McMaster? I'm confused.
First of all, this next.js example is completely static, McMaster is NOT. It's fully dynamic as the server does a bunch of checks about the product's availability and much more.
Like if you change something on the server, it's reflected immediately on McMaster. In this Next.js example it will not, it statically generates the pages.
The Next.js example is more of a blog. It can NEVER EVER be a marketplace. You'll build 1000000 pages? Rebuild them every X time?....
It's crazy to think that you can just like that, build something better and insult people's intelligence.
It's NOT faster AT ALL. You're comparing a BLOG to a fully functional huge MARKETPLACE
It's impressive surely, but it even talks about the optimizations it doesn't do compared to the real thing. It's like saying one of those design YouTube/Netflix clones is faster than the real thing
@@whydoyouneedmyname_ It's not impressive, I'm so sorry. It's just building the pages and prefetching. McMaster is 1000 times more complex than that to achieve that speed in a real-world situation.
You could never ever achieve the speed of McMaster in reality using only these techniques; they are not enough, nor realistic for a real marketplace
The Next.js example is not "completely static". Your claim to know about McMaster's backend caching logic is dubious (and provably incorrect; other videos detailing McMaster have covered this) because you don't even seem to know what this fully open source thing is doing even though the code is literally in the video you're commenting on. "x1000 times more complex" is goofy as hell too.
@@anonymousfigure37 I may have exaggerated, I can concede that, no problem, but I understand what it's doing and what McMaster is doing. I can tell that it's on a completely different level.
The code in this video was bad in the sense that it can't work on a real marketplace unless you change it to support all the McMaster features, which will make it way slower... even worse: if you keep it like that, it will crash instantly! The site wouldn't even load.
@@aghileslounis I think the biggest additional complexity the McMaster site has in terms of functionality is the whole left navigation experience, which certainly complicates the caching story. In fact if you get specific enough with the left nav filters, you start to experience slowdowns because of cache misses.
I can't think of anything that McMaster is doing that would be difficult to implement performantly in Next.js. I mentioned the left nav interface; what features specifically do you have in mind?
this video is sponsored by nuts and bolts !
Cheers Wes Bos
Back when websites were built by code veterans optimizing for 1ms
I wish they could do a fast version of jiiiraaa
When the McMaster-Carr website was first created it was not fast. Back then it was faster to pull out the huge book than to access the website.
Vanilla HTML is indeed faster than any framework, and as you said, the JS doesn't matter, but what it does matters: routing and all that BS takes up computing, while a vanilla reference to text you already downloaded is instant. The site is pretty basic too; they don't have to import 20 different components and scripts and then write 20 lines to display an image and some text in a box
Designers are the enemy of web performance.
Um... loading a lot of JS is not always fine. At that point it only works quickly with high internet speeds, which is not something everybody has across the world. If your target customers are in the US / EU / Australia and other areas where internet bandwidth is fast, then sure, you can send more data to avoid more requests, but if your target customers are in every country, or in Africa / LatAm, then you really have to think about every byte sent to the customer.
Why it's fast: it preloads the link when the link is hovered, so when the user clicks, it immediately shows the page because it's already loaded. Thank you, I'm adding this to my knowledge now
From Europe NextFaster doesn't feel fast. I would say McMaster-Carr feels much faster from here.
It's not 1.6 MB of JS transferred. It's only 784 KB at 4:03
a better idea is to preload cached pages with blurhashes and lazily load the images afterwards. It's even faster and uses fewer resources (bandwidth, CPU)
You don't need blurhashes with Progressive JPEG.
@@saiv46 not all images are jpegs
@@ac130kz but they can be.
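Both approaches boil down to "show something cheap immediately, let the full image arrive later". A sketch of the blur-placeholder plus lazy-load combination with next/image; wrench.jpg is a hypothetical local image (Next.js auto-generates the blurDataURL for statically imported images).

```tsx
import Image from "next/image";
import wrench from "./wrench.jpg"; // hypothetical local image

export function LazyProductImage() {
  return (
    <Image
      src={wrench}
      alt="Wrench"
      placeholder="blur" // show the tiny blurred preview instantly
      loading="lazy"     // fetch the full-size image only when near the viewport
    />
  );
}
```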
Perks of watching Theo 🎉
Faster than my Figma prototype
Alright, but how to deal with huge Google Analytics, Tags and Facebook Pixels weight????
async loading
It's good until you want SEO; Google doesn't see the tag
5 years ago I made a SPA website for my college using just Django and vanilla js and that was fast as f 😅.
I made a router: for the first request it downloads the full page, then for any click it downloads only the changed part of the page, and then I attach/replace that page part and the head scripts without changing the layout.
/first-page (full page with layout)
/next-page?req=spa (only changed content not full layout)
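A minimal sketch of that kind of router (generic, not the commenter's actual Django code): intercept same-origin link clicks, ask the server for just the content fragment via the ?req=spa flag, and swap it into the layout.

```ts
// Tiny "fetch only the changed part" router, in the spirit described above.
// Assumes the server returns a bare HTML fragment when ?req=spa is appended.
async function navigate(url: string, push = true): Promise<void> {
  const res = await fetch(url + (url.includes("?") ? "&" : "?") + "req=spa");
  const fragment = await res.text();
  const outlet = document.querySelector("#content");
  if (outlet) outlet.innerHTML = fragment; // swap only the page body, keep the layout
  if (push) history.pushState({}, "", url);
}

document.addEventListener("click", (e) => {
  const a = (e.target as Element).closest<HTMLAnchorElement>("a[href^='/']");
  if (!a) return;
  e.preventDefault();
  void navigate(a.getAttribute("href")!);
});

// Handle back/forward without a full reload.
window.addEventListener("popstate", () => void navigate(location.pathname, false));
```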
HTMX sounds like it’s up your alley
Considering that it is already prefetched, it's amazing how shit browsers are that they still show those loading states instead of just popping the page into existence.