Great example to show that choosing the right technology will not automatically make the website fast. You have to write good code.
good code in bad tech is often faster than bad code in good tech
@@Z4KIUS Trivial performance gains like this rarely matter to begin with. Spend your time addressing issues that cost real money or adding features that make it.
Chasing tiny page load speeds is just mindless busywork.
@@JohnSmith-op7ls a good feature beats minuscule speed improvements, but at some point big speed regressions outweigh any feature
@ But this isn’t about addressing relevant performance issues, it’s about pointlessly squeezing out a bit more, in a contrived demo, just for the sake of it.
why not make these features part of the framework instead?
Interestingly enough, the edge McMaster has is not that their website is so insanely fast, it's that everything you order will be delivered to your company in a couple of hours. So if you think the page loading is fast, check out their delivery, lol
Better than amazon!
Their other edge is that they have every conceivable product. They are an all-around premium-quality service with high prices to match. When you need something specific, fast, to exact specifications and perfect every time, you use this company. When price matters more, you try your luck on Ali.
@thewhitefalcon8539 I'll say they have a massive selection, but I often do not find what I am looking for there.
A couple hours' delivery is quite slow for Russia. The delivery here is usually 15 to 30 minutes
@@Ginto_O You clearly don't live in a rural area of Russia. McMaster delivers ANYWHERE in the Continental US that fast.
Worked at McMaster for a few years. This kind of glosses over how we’re also able to perfectly sort, filter, and serve up data on over a half million different part numbers. There’s a looooot of stuff going on in the backend for this
It’s very very impressive stuff, especially for how long it’s existed and worked. I wish more of that info was public so I could have talked in depth about it 🙃
I like how theo thinks McMaster's competitive edge is their website and not that they knock on your door with your parts 3 minutes after you complete the order. 😄
I live right next to a warehouse so for me it's more like 1 minute 😂
The craziest shit with McMaster-Carr though... is it's even fast for Australians. And we can't even buy shit from them without shenanigans
No one cares about Australia, it's irrelevant to world affairs. Shoo.
I've used this as my go-to pat response to "can you give me an example of good web design/UX/UI" in interviews for years, is great that it's getting attention now 🎉
McMaster-Carr, shouldering the weight of America's industrial might since 1901.
13:45 prefetching is great! When I started experimenting with HTMX, I immediately turned that on there as well (it supports both on mouse down and on hover, depending on your preferences). Great to see that Next.js also supports it.
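For anyone curious what that looks like without a framework, here is a minimal TypeScript sketch of hover/mousedown prefetching. The link selector, the Set-based dedupe, and the use of <link rel="prefetch"> are illustrative choices, not how HTMX or Next.js implement it internally.

```ts
// Minimal sketch of hover/mousedown prefetching, independent of any framework.
// Assumes same-origin links; dedupe via a Set so each URL is only fetched once.
const prefetched = new Set<string>();

function prefetch(url: string): void {
  if (prefetched.has(url)) return;
  prefetched.add(url);
  const link = document.createElement("link");
  link.rel = "prefetch";
  link.href = url;
  document.head.appendChild(link);
}

document.querySelectorAll<HTMLAnchorElement>("a[href^='/']").forEach((a) => {
  // Fetch the target document when the user shows intent (hover),
  // and again (a no-op thanks to the Set) on mousedown just before the click lands.
  a.addEventListener("mouseenter", () => prefetch(a.href));
  a.addEventListener("mousedown", () => prefetch(a.href));
});
```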
McMaster-Carr Speedrun (100%, glitchless)
jquery baby, oh yaah, yo heard me right
As an engineer, McMaster is the greatest website known to man
So one of the things you seemed to miss was that this was a classic ASP.NET 4.5 website. So the tech for this is about 15 years old. All that JavaScript at 4:45 is auto-generated. The back page for this is much simpler.
As a purchase manager that orders from McMaster CONSTANTLY, it's wild to me every time their website gets talked about. Worlds colliding or something lol
The real magic: accept 2h delay for every change and you can cache *everything* for 2h.
the realer magic: 300ms delay for every change and caching things only after they're requested
@@PraiseYeezus realest magic: cache everything in the browser in IndexedDB, and store a hash, so when the hash sent from the server to the client is different, the client downloads everything over again (see the sketch after this thread)
@@PraiseYeezus is this actually how McMaster works??
@@xiaoluwang7367 no that's how Vercel's infra works
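A rough sketch of the hash-busted IndexedDB cache described above. The /api/version and /api/catalog endpoints and the single "catalog" blob are made up for illustration; a real site would more likely cache per resource rather than one giant payload.

```ts
// Sketch: keep data in IndexedDB, re-download only when the server's content hash changes.
const DB = "catalog-cache";

function openDb(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open(DB, 1);
    req.onupgradeneeded = () => { req.result.createObjectStore("kv"); };
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

function kvGet(db: IDBDatabase, key: string): Promise<unknown> {
  return new Promise((resolve, reject) => {
    const req = db.transaction("kv").objectStore("kv").get(key);
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

function kvSet(db: IDBDatabase, key: string, value: unknown): Promise<void> {
  return new Promise((resolve, reject) => {
    const tx = db.transaction("kv", "readwrite");
    tx.objectStore("kv").put(value, key);
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}

export async function loadCatalog(): Promise<unknown> {
  const db = await openDb();
  const serverHash = await (await fetch("/api/version")).text();
  const cachedHash = (await kvGet(db, "hash")) as string | undefined;

  if (cachedHash === serverHash) {
    // Hash unchanged: serve everything straight from IndexedDB, no data transfer.
    return kvGet(db, "catalog");
  }
  // Hash changed: re-download the whole payload and overwrite the cache.
  const fresh = await (await fetch("/api/catalog")).json();
  await kvSet(db, "catalog", fresh);
  await kvSet(db, "hash", serverHash);
  return fresh;
}
```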
25:32 Good sir, everything here is magical if you think back to the days of vanilla and jQuery, but I get your point.
What about server/CDN and network costs for this amount of prefetching? And how does it work on mobile clients, where there is no hover event?
There's no free lunch. The original project does the same. You can choose not to preload the images if you're worried about that, only the HTML content.
I can tell you that for my company, the cost of the traffic is easily covered by the improved user experience.
Also, on mobile you can track the viewport and prefetch when an item has been visible for a certain amount of time, or use some other metric (see the sketch below); you'd need to research it for your particular use case, or skip prefetching images and only prefetch the HTML.
Trade-offs are always there.
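For the mobile case, here is a sketch of prefetch-on-visibility using IntersectionObserver. The 200ms dwell time and 100px rootMargin are arbitrary tuning knobs, not values from the original project, and the fetch only warms the HTTP cache if the response headers allow caching.

```ts
// Sketch: prefetch links that stay visible for a short while, for devices without hover.
const timers = new Map<Element, number>();
const done = new Set<string>();

const observer = new IntersectionObserver(
  (entries) => {
    for (const entry of entries) {
      const href = (entry.target as HTMLAnchorElement).href;
      if (entry.isIntersecting && !done.has(href)) {
        // Only prefetch if the link stays on screen for a little while.
        timers.set(
          entry.target,
          window.setTimeout(() => {
            done.add(href);
            fetch(href); // warms the HTTP cache, assuming cacheable responses
          }, 200)
        );
      } else {
        // Link scrolled away before the dwell time elapsed: cancel the prefetch.
        clearTimeout(timers.get(entry.target));
        timers.delete(entry.target);
      }
    }
  },
  { rootMargin: "100px" }
);

document.querySelectorAll<HTMLAnchorElement>("a[href^='/']").forEach((a) => observer.observe(a));
```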
It's an eCommerce site, so network and bandwidth costs are very very low compared to the revenue generated from sales. However, load speed is crucial. I've seen a 30% drop in CTR/visitors when my website's page load time is slow.
except for image flickering, pretty smooth UX
Looks like they are using good ol’ ASP.NET Web Parts technology, which is considered dead nowadays.
The depths that you go to is honestly, unreal. I can only imagine what it takes to put these videos out. Kudos to you, my man!
Sid, I agree, but he conveniently missed a few key points as he is a React/Next shill.
Few key points Theo is missing:
1) The real site is most likely making DB calls, caching (think Redis/Memcached), etc. on the backend, whereas this “fake” site is most likely only mocking data.
2) Theo conveniently missed pointing out the pricing. At scale, Vercel will suck all the money out of your pocket, whereas for the real site they'd likely just need to spin up a new EC2.
@mohitkumar-jv2bx as I mentioned in my reply above, the only one "conveniently missing" points here is you.
1) All of these examples use real databases. The DB for NextFaster has millions of entries. It's a fair comparison.
2) I've seen the bill for this project. Hosting costs are under $20 a month despite the absolutely insane amounts of traffic.
@@t3dotgg just $20 for this many requests 💀. I mean, I get it, most of these are just favicons/really small requests which don't take a lot of bandwidth, but the amount of requests a single user generates on this site is just absurd. So that low price is indeed shocking.
this is really good to implement in an e-commerce website... it makes shopping online really, really fast.
This is an interesting intersection between web development and ux. The site has amazing ux and software engineering.
We all did things like this back in the ’90s/’00s and it worked like a charm, no frameworks, no jQuery
which site did you build that performs this well?
@@PraiseYeezus Brazilian Channel 5 (Rede Globo) covering the Olympics in 2004 is a good example; we had super tight requirements on the size of the CSS and imagery.
Basically, back in the day, at the end of the '90s and beginning of the '00s, you had to make websites that performed well because broadband wasn't so widespread, especially in South America.
So it was expected that designers would know how to compress images and videos to the maximum amount of compression possible.
Often, internet banners had incredibly low size limits, so everyone back in the day would squeeze as many KB as possible out of every single file.
Nowadays, a lot of "designers" and "developers" will put images online without even trying to compress them or make them the correct size before slapping them online.
@@PraiseYeezus for some reason my comment keeps being deleted, so I will rewrite it briefly: I wrote the main Brazilian website (by our main TV channel, which was "the official channel") covering the Athens Olympics in the early '00s, and many other websites. At the time broadband wasn't so popular, and everyone on the team, designers, developers, and project managers, was well aware of file sizes and compression types; most projects had strict rules for file sizes and page load times. The XMLHttpRequest API was standard, and so was having different if-conditions for different popular browsers; jQuery was not there yet.
No jQuery before HTML5 and ES6 sounds like an awfully bad decision
Love how at 3:57 he goes off topic to compliment how pretty the nuts on this website are.
Man... I absolutely love your honesty when doing ads! Seriously!
3:57 >>thats pretty nuts
Yes, those are pretty nuts. And bolts
95% of React (or any other JS framework) developers can't build a website as fast as the McMaster site.
Used to work in a machine shop and would pick up hardware from one of their warehouses regularly. Great customer service and hardware, great company.
These videos are always fun to watch but I'd really like it if you were to put chapters in a video.
Great breakdown Theo!!
just about every YouTube streamer has covered this already
I have learned a lot from this video, more videos like this would be awesome
Too!
I noticed a small but significant tweak that probably helps a lot: B&W images. They probably get a lot of savings from compression, on top of the fact that the images here are all small. The result: the browser is done loading and rendering the images quicker.
The same thing is implemented in SoundCloud: when you hover over a track it loads the buffer, and once you click, the loaded buffer starts playing.
This was a good video, learnt a lot, thanks!
The images on McMaster are actually sprites
Great point. With sprites you are fetching far fewer images and then just using offsets.
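A tiny sketch of that sprite-offset idea in TypeScript: one sheet download, then per-product offsets. The sprites.png path, the 64px cell size, and the 10-column layout are invented for the example, not taken from McMaster.

```ts
// Sketch: show one cell of a sprite sheet by offsetting the background image.
const CELL = 64;     // hypothetical: each thumbnail is 64x64 inside the sheet
const COLUMNS = 10;  // hypothetical: 10 thumbnails per row in the sheet

function applySprite(el: HTMLElement, index: number): void {
  const x = (index % COLUMNS) * CELL;
  const y = Math.floor(index / COLUMNS) * CELL;
  el.style.width = `${CELL}px`;
  el.style.height = `${CELL}px`;
  el.style.backgroundImage = "url(/img/sprites.png)"; // hypothetical sheet path
  // Negative offsets slide the big sheet so only this product's cell shows.
  el.style.backgroundPosition = `-${x}px -${y}px`;
}
```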
This is awesome
Thanks for finally doing this
Thanks for this, its Awesome
"Choosing a specific technology won't necessarily make your website faster"...Nextjs makes all optimizations default..."Choosing Nextjs will definitely make your website faster"
Back when websites were built by code veterans optimizing for 1ms
Super useful video!
If your ceo/manager asks you to rank higher on Pagespeed Insights, show them this video.
Wes Bos made a video on the same thing two weeks back, then Codedamn hopped on the same thing, and a dozen others.
I wonder how this project would perform on a self-hosted environment. We all know Vercel does a bunch of special optimizations for Next hosted in their cloud. I'm guessing it will still run pretty fast, but some of these optimizations will not work out of the box or not work at all
Tbh I don't think it feels that fast, especially for a page that is all text aside from some tiny black and white images. Getting a simple page like that to behave like the NextFaster example isn't that difficult, preloading, caching, and not doing full page reloads will get you most of the way there. The reason most websites are slower is because A. they're loading much more data, and B. their focus is on adding features and developing quickly, not trying to get page loads down to milliseconds.
Can Theo just appreciate a good website without dunking on it and shilling NextJS? He doesn't need to be so defensive all the time.
The way NextFaster's images flicker in makes me feel bad.
I'm really interested to hear why you're coming around to mousedown for interactions. I'm still in the mouseup camp but I haven't dug into it much and would love to hear what the arguments are! Future video?
Ah yes, be prepared for a lot of bandwidth costs. Especially when using the AWS wrapper Vercel
Appreciate you Theo, thanks for the video! 😄👍
Does fast mean more opportunities for vulnerabilities or less? Just curious your input on it.
Fast usually means simple. Simple usually means less surface area. Less surface area usually means less room for exploits. There are no hard rules here, but generally speaking, simpler = better
You just gave me an idea to promote some things I work on because... I write things that are both minimal and fast. I'm sure I could attain that speed, and with lower load size.
Putting it to use on my local server for an 11ty site, I took navigating after initial load down to ~25ms. It mostly only took 4 lines of setup, but I had to refactor some things to deal with adding event listeners on page load. It added < 6kb to my bundle size, before compression.
Could probably get it down to like 4ms and even reduce bundle size, while making it easier to maintain, but that'd basically mean a complete rewrite of things.
The website also looks pretty good
htmx prefetch can do similar hover prefetching pretty easily, including the images of the prefetched page
I'm currently convincing my principal engineers to rewrite our whole website because I saw the NextFaster page the other day… wish me luck.
Sponsor? I feel like Vercel (Next.js) is a long-term sponsor of the channel.
Prefetching is something I'm shocked isn't more common. It used to be on lots of websites but then disappeared.
I’m sure it feels amazing to use this site on your optic fiber internet connection
I’m not on fiber sadly :( I also tried it with a very slow vpn and it still felt great!
2:10 isn't pre-loading just because someone hovers a bit wasteful? I'd want to see stats on pre-loads to clicks first.
Yes, it will be a waste if you don't click; that's the tradeoff of choosing prefetching. Your traffic and billing can skyrocket if you are not careful. They can afford the prefetch to provide a better UX for their clients.
Hence, there are lots of times you don't want to prefetch.
@ a waste is a waste, and to the environment it's a big deal. What's the carbon footprint of those trillions of unnecessary preloads combined, I wonder?
To be honest, hovering doesn’t exist on mobile devices, which is where the concern about wasteful requests and network bills is mostly relevant, so I think it’s a good trade-off for desktop devices.
Yeah, yeah. Hover might technically exist on mobile too, but if you disable it the trade off is only on desktop.
@@m12652
Really 😅.
Humans are quite wasteful too, if you’re going to that length about environmental concerns.
Should we remove all toilets in the world because it’s inconvenient, every time someone takes a dump, to recycle it as manure?
I don’t think so, and I hope humanity is not heading that way.
I think it would be in humanity's best interest not to sacrifice convenience, but to make up for the things we have been a little wasteful of by other means.
@ very educated and totally relevant... you must be one of those people that thinks israel is a country 🙄
The Rollercoaster Tycoon of HTML
Some of those optimizations are already in Next (2 weeks later)
Few key points Theo is missing:
1) The real site is most likely making DB calls, caching (think Redis/Memcached), etc. on the backend, whereas this “fake” site is most likely only mocking data.
2) Theo conveniently missed pointing out the pricing. At scale, Vercel will suck all the money out of your pocket, whereas for the real site they'd likely just need to spin up a new EC2.
1) All of these examples use real databases. The DB has millions of entries. It's a fair comparison.
2) I've seen the bill for this project. Hosting costs are under $20 a month despite the absolutely insane amounts of traffic
hey blind, at 12:40 Theo shows the Next one uses DB calls
@@t3dotgg I mean realistically the site would have had that much traffic for a few days only no?
First, the comparison between McMaster and NextFaster is not fair: McMaster actually queries the database for each product, while NextFaster downloads 10MB on the first page. This is not going to work if you have a bigger database.
McMaster tech:
1. jQuery
2. Styled Components
This proves that the slowness problems all these newcomer frameworks want to fix weren't there in the original frameworks; bad coding and adding dependencies are what we don't need.
Did you watch the video? McMaster loads more data than NextFaster. Next also queries a database with literally millions of items in it.
@@t3dotgg even if it does, it's not that simple. What kind of enhancements are on the database? Is it in memory? How big is it? Is it redundant? Knowing that NextFaster is all about speed, I'm 100% sure they did some hacks to make it look that good, but in the real world, hello darkness my old friend...
@@hqcart1 Why don’t you take a look? It’s all open source and they’re very transparent about how it works.
The database is Neon, which is a serverless-ready Postgres provider. They provide most of what you'd hire a DB admin for (backups, pooling, sharding, etc.)
People in the comments seriously overestimate how slow database queries are. In reality accessing a database is nothing compared to, say, network latency.
That’s how the old days worked
Google's PageSpeed tool has nothing to do with site speed for the user, and everything to do with first page load. Optimizing for first page load and optimizing for general site speed are two different kettles of fish. Google has to assume the user is loading the site for the first time.
Sveltekit can do that if you use the default behavior that loads a link on hover. Prefetching images is cool.
I don't think it's default behavior. You do have to explicitly set data-sveltekit-preload-data="hover" on either the anchor or body tag, don't you?
@ OK, newer versions of SvelteKit require this. I haven't generated a new project in some time. Anyway, it's dead simple to make it load content on hover.
Will it be as fast as it is now if the 2-hour cache invalidation is removed?
Or is that playing a major role in the time reduction?
I'm not very familiar with JS, and I don't know if he showed this in the video, but I wonder what exactly this 2-hour cache invalidation timeout affects.
If things like stock and price can't update on every load, or even update live, then I get the reasons for suspecting the cache is misrepresenting the comparison, but I lack the immediate skills to check without outpacing my own interest.
But like, images only updating every 2 hours.
Sure, why not?
Cheers Wes Bos
How much load can prefetching all links generate on the server? And what about server and bandwidth costs?
My marketing team needs to know when images were loaded for some reason. I need to set unoptimized on the Next.js Image tag, because when images are optimized by Next.js the URL has some params for getting the optimized image.
And then they ask why the image loading feels slow :(
If you assume no malicious actors, then maybe the clients could keep track of page loads and dump them to the server in batches later on?
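If malicious actors aren't a concern, the batching idea could look something like this sketch: record image load timings on the client and flush them in one request. The /analytics/image-loads endpoint and the payload shape are hypothetical.

```ts
// Sketch: batch image-load timings on the client, send them once instead of per image.
type ImageLoad = { src: string; loadedAt: number };
const buffer: ImageLoad[] = [];

document.querySelectorAll<HTMLImageElement>("img").forEach((img) => {
  img.addEventListener("load", () => {
    buffer.push({ src: img.currentSrc, loadedAt: performance.now() });
  });
});

// Flush when the page is hidden (tab switch, navigation) using a beacon,
// which survives page unload better than a normal fetch.
document.addEventListener("visibilitychange", () => {
  if (document.visibilityState === "hidden" && buffer.length > 0) {
    navigator.sendBeacon("/analytics/image-loads", JSON.stringify(buffer));
    buffer.length = 0;
  }
});
```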
I wish they could do a fast version of jiiiraaa
Does NextFaster pregenerate all product pages and everything? I wonder how long that takes to build. I don't think it's a fair comparison to the original site, since I don't think they are pregenerating all product pages.
I am just really curious: why can't we just use an SPA version with a RESTful API instead of Next.js, especially if we're going to fetch all the endpoints in advance? I feel like we always reinvent the same wheel again and again. I remember my website from 2013, which fetched the HTML with sync Ajax at exactly the same speed. Surely it wasn't as complicated to build as in Next.js with dozens of optimizations.
IMHO, there are many ways to build a website that loads fast, and surely 99% of them are easier than implementing it in Next.js.
Sorry, I just don't understand. Maybe I am not nerd enough to get the point.
While you are objectively correct in saying that SPA + REST is superior, the fact is that Next has a significant footprint in the industry and as a result there will be content made around it
Show me one faster website built in a similar way with simpler tech. If it doesn’t have millions of pages, it doesn’t count.
And if you anticipate using mobile, having a REST API would be a big win
It's good until you want SEO; Google doesn't see the tag
From Europe NextFaster doesn't feel fast. I would say McMaster-Carr feels much faster from here.
Can someone please explain to me: if it is 1 million products, that means 1 million photos, which, if you are using Vercel image optimization, is around 5000 dollars. Who among these enthusiasts paid that much?
The only reason I don’t use Vercel Image is that my side project makes no money and it's not worth spending 5 dollars per 1000 images
You do understand that if a legit shop has a million products, it's probably way too profitable to bother about $5k
@ the profit margin in e-commerce is on average 30%
5k is not a small amount
@@KlimYadrintsev uh.. yes it is
5 years ago I made an SPA website for my college using just Django and vanilla JS, and it was fast as f 😅.
I made a router: on the first request it downloads the full page, then for any click it downloads only the changed part of the page, and I attach/replace that page part and the head scripts without changing the layout (rough sketch below).
/first-page (full page with layout)
/next-page?req=spa (only changed content not full layout)
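A rough TypeScript sketch of that pattern: intercept link clicks and swap only a content region. The ?req=spa flag follows the comment's own convention; the #content id is an assumption, and query strings on the original URL aren't handled.

```ts
// Sketch: minimal client-side router that swaps only the changed fragment.
async function render(url: string): Promise<void> {
  // Ask the server for just the fragment; layout, nav and head scripts stay in place.
  const res = await fetch(`${url}?req=spa`);
  document.querySelector("#content")!.innerHTML = await res.text();
}

document.addEventListener("click", (e) => {
  const a = (e.target as Element).closest<HTMLAnchorElement>("a[href^='/']");
  if (!a) return;
  e.preventDefault();
  const href = a.getAttribute("href")!;
  history.pushState({}, "", href); // update the address bar without a full reload
  void render(href);
});

// Back/forward re-fetch the fragment for the restored URL.
addEventListener("popstate", () => void render(location.pathname));
```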
HTMX sounds like it’s up your alley
Designers are the enemy of web performance.
I'm kinda scared of that request bombardment
I clicked like 5 links and got 1.5k requests
this video is sponsored by nuts and bolts!
the problem is always between the chair and the screen
PEBKAC
So, why did they do all of that? Wouldn't it be better to just use Next.js's built-in prefetch?
On mobile I assume they do some kind of intersection observer?
Faster than my Figma prototype
A better idea is to preload cached pages with blurhashes and lazily load the images afterwards. It's even faster and uses fewer resources (bandwidth, CPU); see the sketch after this thread.
You don't need blurhashes with progressive JPEG.
@@saiv46 not all images are jpegs
@@ac130kz but they can be.
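One way the placeholder-plus-lazy-image idea could look, as a sketch: the prefetched HTML ships a tiny blurred preview, and the full image is only requested near the viewport. The data-src/data-placeholder attribute names and the blurred CSS class are invented for the example.

```ts
// Sketch: show a cheap placeholder immediately, pay for the full image only when needed.
const imgObserver = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    img.src = img.dataset.src!; // now request the full-resolution bytes
    img.addEventListener("load", () => img.classList.remove("blurred"), { once: true });
    obs.unobserve(img);
  }
});

document.querySelectorAll<HTMLImageElement>("img[data-src]").forEach((img) => {
  img.src = img.dataset.placeholder!; // tiny blurhash-style preview, cheap to prefetch
  img.classList.add("blurred");
  imgObserver.observe(img);
});
```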
Perks of watching Theo 🎉
How do we measure the speed?
I use a stopwatch, personally
Couldn't help noticing you're using a terminal called "Ghostty", what is that?
Oh god
Htmx preload extension ftw
When the McMaster-Carr website was first created, it was not fast. Back then it was faster to pull out the huge book than to access the website.
What would be cool is if it's fast on GPRS
I think NextFaster missed compressing the images with Brotli.
What font are you using in vs code?
This is also an example that the fastest website doesn't really matter that much; we care because we look at the numbers, but what does that mean to the website's consumers?
Sometimes more functionality could be more helpful than micro-optimizing stuff.
Why it's fast: it preloads the link when the link is hovered, so when the user clicks, the page appears immediately because it's already loaded. Thank you, I'm adding this to my knowledge now.
"instant-fuckin-taneously"
this is why a lot of web technologies feel pointless to- _OH_
Um... loading a lot of JS is not always fine. At that point it only works quickly with high internet speeds, which is not something everybody has across the world. If your target customers are in the US / EU / Australia and other areas where internet bandwidth is fast, then sure, you can send more data to avoid more requests, but if your target customers are in every country, or Africa / LatAm, then you really have to think about every byte sent to the customer.
Euhh...is it only me or?
You are comparing a personal project with 0 users to McMaster? I'm confused.
First of all, this next.js example is completely static, McMaster is NOT. It's fully dynamic as the server does a bunch of checks about the product's availability and much more.
Like if you change something on the server, it's reflected immediately on McMaster. In this Next.js example it will not, it statically generates the pages.
The Next.js example is more of a blog. It can NEVER EVER be a marketplace. You'll build 1000000 pages? Rebuild them every X time?....
It's crazy to think that you can just like that, build something better and insult people's intelligence.
It's NOT faster AT ALL. You're comparing a BLOG to a fully functional huge MARKETPLACE
It's impressive surely, but it even talks about the optimizations it doesn't do compared to the real thing. It's like saying one of those design youtube/netflix clones are faster than the real thing
@@whydoyouneedmyname_ It's not impressive, I'm so sorry. It's just building the pages and prefetching. McMaster is 1000 times more complex than that, to achieve that speed in a real-world situation.
You could never achieve the speed of McMaster in reality using only these techniques; they are not enough, nor realistic for a real marketplace
The Next.js example is not "completely static". Your claim to know about McMaster's backend caching logic is dubious (and provably incorrect; other videos detailing McMaster have covered this) because you don't even seem to know what this fully open source thing is doing even though the code is literally in the video you're commenting on. "x1000 times more complex" is goofy as hell too.
@@anonymousfigure37 I may have exaggerated, I can concede you that, no problem, but I understand what it's doing and what McMaster is doing. I can tell that it's on a completely different level.
The code in this video was bad in the sense that it can't work on a real marketplace unless you change it to support all the McMaster features, which will make it way slower... even worse: if you keep it like that, it will crash instantly! The site wouldn't even load.
@@aghileslounis I think the biggest additional complexity the McMaster site has in terms of functionality is the whole left navigation experience, which certainly complicates the caching story. In fact if you get specific enough with the left nav filters, you start to experience slowdowns because of cache misses.
I can't think of anything that McMaster is doing that would be difficult to implement performantly in Next.js. I mentioned the left nav interface; what features specifically do you have in mind?
Do a vid on tauri 2.0
what is the font that is being used here?