Server side rendering does not require full page reloads. Partials are virtually equivalent to components; the only difference is where the content rendering takes place. The scalability argument doesn't matter for 80% of developers, and if you do want a full SPA experience you can use a hybrid approach where pages are pages and reactive content is reactive content. I think there's space for more than one style of frontend development. Plus, why are all the frontend frameworks adopting SSR if it's not the solution?
It's crazy because we've been here before. We had anti-JS zealots years ago when JS was invented. Guess what, JS won because it made web applications much easier to develop and better for users. Now they're coming back and saying "well, just use a little JavaScript to sprinkle interactivity here and there." No, we tried that too. Anybody who tried writing complex applications with jQuery on server-rendered pages knows it sucked. And it wasn't because developers sucked then. We tried the pattern, and it evolved into modern UI libraries like React and Vue.
Might you be speaking too broadly? For my part, I lament that we are still locked into JS.
If I'm speaking broadly it's because it's broadly the truth. Sure there are exceptions. And we all know there are drawbacks with JS, I'll be the first to admit it. But the solution to that isn't going back to the same old design patterns we had 10-15 years ago and then solved with JS component models.
It would be like if we said "there are problems with cars" and the solution was to bring back horses. The reason it looks like such a convenient solution is because it's just regression.
I think current solutions like SvelteKit are basically perfect.
I think the whole discussion comes from the feeling that web development moves really fast and that there is always new stuff out there being worked on. Andrew's view sounds a bit nostalgic, like: "back in the day, just HTML, CSS and JS... don't need all this fancy new stuff, just plain and simple...". I suppose many people vibe with that, but that's not the whole picture. Sure, for a simple static site, just go with that, but if you want something bigger, you will need a more sophisticated approach. That may be more complicated (which is only natural), but it's less complicated than a big project that doesn't have a system and an idea behind it, so it's: big project (with a framework) vs. big project (without a framework). I mean, a framework is basically the result of optimizing your project for something bigger; that's why they exist.
I've built the same complex app in both vanilla JS and Solid. It took about 2 months to build in vanilla JS, and that was with the jQuery and DataTables libraries solving a lot of my problems; without jQuery it would have taken me even longer. The app was functional but not great.
Then I built the same app again using Solid and it took just 2 weeks, with a lot of unexpected benefits, like being able to build dynamic data tables without importing the library, and having pages retain large amounts of data without having to reload it every time the user navigates away and comes back. To achieve the same using jQuery you would have to hide/show entire pages, and the code would get very messy and counter-intuitive very quickly.
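To make the data-retention point concrete, here's a loose sketch of one way it can work in Solid: keep the fetched rows in a module-level signal so route components can unmount and remount without refetching. The endpoint, names and row shape are all hypothetical, not taken from the commenter's app.

```ts
import { createSignal } from "solid-js";

// The signal lives at module scope, so navigating away from a route and coming
// back does not throw the data away. loadRows() only hits the network once.
const [rows, setRows] = createSignal<Record<string, unknown>[] | null>(null);

export async function loadRows(): Promise<Record<string, unknown>[]> {
  const cached = rows();
  if (cached !== null) return cached; // already loaded once, reuse it

  const res = await fetch("/api/rows"); // hypothetical endpoint
  const data = (await res.json()) as Record<string, unknown>[];
  setRows(data);
  return data;
}
```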
As someone who is currently in the process of rebuilding a vanilla app in Solid, I second this take, although I still like using vanilla JS for personal/experimental projects. It just seems like the more you have to roll out by hand, the more opportunities there are to learn.
With Laravel, you would have been done with that same app in 5 days.
@@BraintemOrg I've dabbled in Laravel professionally back in 2012. It's a good choice for server side projects. For me, Solid is the best for client side projects. There's no reason a website couldn't use both.
Why do people need one solution for everything? Can't people have a discussion and pick the best tools for their requirements? Isn't that why we are engineers, ffs?
This video response wasn't to say we need a single solution; you're absolutely correct that we *should* use the right tool for the job, and what I'm arguing is that SSR for web applications is *not* the right tool (for all the points covered). If however you don't have a web application and your site is static, then I wouldn't say either SSR or SPA is the right tool; in that case SSG would be the right tool.
I think you gloss over one of the biggest reasons: locality of data. SPAs have the benefit of being able to leverage a client-side cache, which can feel far more responsive than needing to wait for a server response for every interaction.
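As a minimal sketch of that client-side cache idea (the endpoint and types here are hypothetical, purely for illustration):

```ts
// Once data has been fetched, repeat interactions read from memory
// instead of waiting on another round trip to the server.
const cache = new Map<string, unknown>();

async function getCached<T>(url: string): Promise<T> {
  if (cache.has(url)) {
    return cache.get(url) as T; // instant, no network wait
  }
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const data = (await res.json()) as T;
  cache.set(url, data);
  return data;
}

// First call hits the network; later calls for the same URL resolve immediately.
// getCached<{ id: number; name: string }[]>("/api/products").then(console.log);
```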
One more thing to say here: once you add the HTML blocks to PHP or the Blade/Laravel view engine, it gets very difficult to make changes to layout and design. Your HTML markup and CSS classes get scattered into different strings in those templates. You can tweak the frontend much more easily with a SPA or with SSR via Next.js.
--- Practically:
SPAs - only suitable for certain use cases like admin sections or sites where content changes frequently, because SPAs make it difficult for search engine crawlers, which results in bad SEO.
SSR - server side rendered sites are better suited for things like blogs, landing pages, ecommerce sites etc.
Very good point here regarding SEO, that is indeed one key legitimate criticism of SPAs. There have been some solutions in this area; as you may already be aware, there are solutions such as Nuxt and Next which can do isomorphic builds, but that's another rabbit hole!
I really like your style of breaking down the trade-offs in a relatively neutral way. Would watch more.
Thank you for your kind words, I really appreciate it and it motivates me to create more stuff!
Using HTMX and some template engine can fix a lot of backend issues, including spaghetti code. Then there is the rendering part of the code and the core API part that fetches data. Also, if the request header asks for JSON then the API can return JSON instead of HTML; that way it works like a REST service for mobile etc. The only place this approach does not work well is in client-heavy applications like games etc. But many applications are CRUD applications that can benefit from the simplicity of operations, especially when it comes to implementing user permissions and roles, which the backend is closer to in an application.
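Roughly what that content negotiation could look like, sketched here with Hono (which comes up later in the thread) purely as an example; the route, data and markup are made up, and a real setup would use a template engine rather than a string literal:

```ts
import { Hono } from "hono";

const app = new Hono();

// Hypothetical data access, standing in for whatever fetches the rows.
const listUsers = async () => [{ id: 1, name: "Ada" }];

app.get("/users", async (c) => {
  const users = await listUsers();

  // Mobile or API clients that explicitly ask for JSON get JSON...
  if (c.req.header("Accept")?.includes("application/json")) {
    return c.json(users);
  }

  // ...while HTMX (or a plain browser) gets an HTML fragment it can swap in.
  const rows = users
    .map((u) => `<tr><td>${u.id}</td><td>${u.name}</td></tr>`)
    .join("");
  return c.html(`<table><tbody>${rows}</tbody></table>`);
});

export default app;
```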
A few points:
1. HTML over the wire scales just fine.
2. You're missing the point that you are adding complexity by reimplementing all of the server side logic on the client via Redux, Vuex, etc.; this creates just as much spaghetti code (I haven't seen an enterprise production app prevent this).
3. Building an app based on your needs now and adding API endpoints later is just fine.
4. (Opinion) I haven't seen something like Next.js used at a large scale. I have seen Angular and React, and it was just as much of a mess.
HTMX
React and Vue are built to have the application live in the browser, completely without a server if need be. The same goes for Blazor. Those apps will store data on the client and push it to the server when available. It adds lots of logic on the client side and basically doubles every piece of domain logic. HTMX requires the server to always be on, which is okay for 99% of apps. HTMX's features should be part of the HTML standard.
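A very rough sketch of that "store on the client, push to the server when available" idea; the queue shape and endpoint are hypothetical, and a real app would also persist the queue (e.g. in IndexedDB) and handle server rejections:

```ts
// Writes are queued locally and flushed whenever the browser reports connectivity.
const pending: Array<{ url: string; body: unknown }> = [];

function queueWrite(url: string, body: unknown): void {
  pending.push({ url, body });
  if (navigator.onLine) {
    flush().catch(() => {
      /* keep the queue; we'll retry on the next "online" event */
    });
  }
}

async function flush(): Promise<void> {
  while (pending.length > 0) {
    const { url, body } = pending[0];
    await fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(body),
    });
    pending.shift(); // only drop the item once the server has accepted it
  }
}

// Retry whatever is queued as soon as connectivity comes back.
window.addEventListener("online", () => {
  flush().catch(() => {
    /* still offline or server unreachable; items stay queued */
  });
});
```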
Great video. I think many people are just frustrated with the current state of JS tooling, which is constantly evolving/breaking, fragmented, often slow and not simple to use. Meanwhile most of my backends from a decade ago probably still build and run (or require minimal migrations). If JS frontend dev just required a simple "js build --release" every time, the same in every project, far fewer people would complain. It's getting better with things like Vite or Bun, but there's still quite a bit to go.
Thank you! Absolutely agree the JS fatigue is real, and I have found Vite to be a real breath of fresh air; that's the positive direction the ecosystem needs to move towards. I think I may do the next video on Vite!
Yeah, I don't get those HTML-over-the-wire people; it is not flexible, and there are reasons why we send data over the wire.
5 min into the video you do not provide any reasoning for why he is wrong, you just keep arguing that he does not know why, but if you know then explain it.
Then it also seems you are ok with someone else abstracting the parts which render JSON and stitch that together on the front end, and that is perfectly fine.
And sure, if your product gets so big that you don't want to have large teams writing custom components for each and every item you serve to users running your webpage on various clients, then it makes perfect sense to migrate to a framework that would help you do that more quickly and efficiently, but most websites out there are not that. They are not serving millions of users every hour, and what ends up happening when you visit these websites is you get over-engineered bloatware that ends up using your entire memory just to animate the scroll into view.
But that's just me
I literally provided the reasons as to why he is wrong, please re-watch carefully. I reiterated the points throughout the video as well. Almost everyone else has understood the points I raised against Andrew.
If a philosopher becomes a programmer 😂
Great video man
Thank you I'm glad you liked it! 👍
Fundamentally, the server does not know anything about the state of the UI. It cannot know it. Sometimes that's okay, because you just send the initial state of the UI and don't care about what happens next. But the moment you try doing something more complicated you will need it, and a server-side render architecture is just not right for that. If you choose it, you'll be stuck with it forever, and you'll end up making more and more workarounds to somehow make it work. Wanna have a form on a page and also a paginated list? Well, crap, you're out of luck; you now have to do partial page renders and manually patch up the client side. It can be done, but it'll be a mess. And the next programmer will have no clue what is going on. While such things just come out of the box with React etc.
Fanatic video! Great points!
Thank you sir for the kind words, I'm happy you liked it!
Freudian slip? Is it "Fanatic" or "Fantastic"? Lol
If server-side is so great then how come apps are routinely rewritten to use a frontend framework, but I've literally never heard of any effort inside any company to rewrite an existing frontend-framework-based app to be server-side only?
Hey. It was interesting to listen to, but I think you missed the point of the original video. Andrew shows that "some" projects actually can be done without frontend frameworks. People have been creating websites using plain PHP since the beginning of the web to serve millions of users, and nowadays it can also be maintainable in the long term when using modern frameworks. The computation scalability theory does not hold strong here. What your video does not mention is the added complexity when you're using a frontend framework. Build tooling, wiring this up, maintaining it, deployment etc. in the long term is very complex nowadays. More and more front-end frameworks are reinventing the wheel and adding SSR again. Unless you have an investment or free human resources to deal with it, there is no "lie" in shifting the added complexity to a later stage. 98% of projects will never go past this stage, and you're not Google, able to juggle dozens of moving pieces on your own.
Sure, I hear what you're saying, however I don't fully accept the "added complexity" part, as that was one of the points raised by Andrew. What I'm suggesting is that in actual fact one doesn't even technically need a front end framework and yet can still do a full SPA with Ajax that doesn't rely on mashing the data and UI code together on the server side. In fact you can watch my earlier video where I pretty much do that using Hono + Deno with ZERO JavaScript build tooling, zero client-side dependencies and zero frameworks, just raw JavaScript etc.
Having built many modern production sites, I don't agree with this assessment that it's complex to maintain, are *some* frameworks overly complex? Yes that part I will absolutely agree with (I'm looking at you React!). But we shouldn't tar all frameworks with the same brush as that is unfair and incorrect.
I would suggest that using Vite + Vue will be dramatically more productive than doing SSR, I know because I've done *both* commercially multiple times, and I know which one I pick every time and it's *NOT* SSR.
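The client side of that "no build tooling, no framework" approach can be as small as the sketch below; this isn't the actual code from the earlier video, just an illustration, and the endpoint, type and element id are made up:

```ts
// Fetch JSON from the server and render it with plain DOM APIs and template
// literals: no framework, no bundler, no client-side dependencies.
type Todo = { id: number; title: string };

async function showTodos(): Promise<void> {
  const res = await fetch("/api/todos");
  const todos: Todo[] = await res.json();

  const items = todos.map((t) => `<li>${t.title}</li>`).join("");
  document.querySelector("#app")!.innerHTML = `<ul>${items}</ul>`;
}

showTodos().catch(console.error);
```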
90% of the sites you use on a daily basis are built with heavy frontends. What people were creating decades ago is not relevant anymore.
If I do computation on the server side, I can use more efficient languages than JS. Rendering a page in Go, Rust, hell even in JS, can be much faster than on the client side, as the server knows its templates, for example.
Besides that, you don't have to push the whole page. Look at HTMX.
And if you "mash together" HTML and data on the server side, you are doing it wrong.
Even with more efficient languages it still doesn't solve the problem (it may buy you a little more headroom but that's it). Inverting the entire thing does, because now you're essentially doing distributed computing by offloading the computation to *each* client. That means it doesn't matter how many extra client requests you have: your computation becomes (Computation * Zero), which is always zero!
@@watthedoodle I am not a designer or webdev by trade, I run that stuff. On a huge scale.
And I have done so for 20 years.
I had customers that moved to SPAs and client side rendering. And guess what: the load difference on the servers was not measurable. But why is this so? Because rendering an HTML template or crafting JSON from the same data does not have a large computational difference.
@@tbethenerd they are *not* the same. One is a transformation of data representation involving only minor structural changes (see XML -> JSON etc), while rendering full templates involves conditional elements based on expressions, all of which can be arbitrarily nested, along with loops etc; these are not the same. For simple templates and at small scale the difference will not matter, but the principle cannot be avoided: whatever that computational load is, it *will* be multiplied by every client you add, and by scale I mean the total concurrent RPS.
@@watthedoodle this is where you start to go horribly wrong in your thinking.
Running data through a template to generate HTML is orders of magnitude less work than converting data to JSON.
The HTML template expansion can be done using no additional memory allocations.
Transforming any data to JSON requires allocating strings for every element, converting numbers into strings, deallocation after use, etc.
You could roll out 100 HTML templates in the same number of clock cycles as 1 JSON payload.
To be honest, whenever I see all those recommendations to use HTMX, they always demonstrate it through "A Simple Table Application", or some sort of 90s static developer blog.
And that is the main use case for it, and I'm not gonna discourage that, but as I say, if you need some more complex UI behaviour like seamless transitions between HTML templates, animations, or just simple UI changes that don't touch server state, HTMX can't handle that as simply as those "complex and gross" web frameworks do...
For me, HTMX's niche is always in that place where Alpine.js and Petite-Vue exist right now, but it's never gonna be an alternative for something heavier and more complex.
To summarize, I just want to say that all those "fan club of the new shiny thing" guys see HTMX as a hammer for anything right now, no matter what...
With HTMX, you can relatively easily embed a SPA for a single sub-route. (In my case, elmish). It's actually quite nice for a hybrid website.
Are you not missing the point that, in the paradigm you are critiquing, you only have to consider one copy of the application state - whereas with front-end frameworks you have to synchronise the front-end and back-end state? Also, I don't see why you can't separate the concerns (the data and view) on the back-end if that's what you want?
You don't need to "synchronise" state, HTTP is idempotent. You simply "fetch" the client-specific state that you need as and when you need it. All state is localised client side and is specific to each client instance. For your second point, yes, while you *could* potentially have that strict segregation and use internal data "APIs", you still haven't solved the computation tax issue.
@@watthedoodle I will agree to disagree. Firstly, HTTP is not necessarily idempotent, but even in the case that it is, state is not localised on the client side, since you still have a database on the server side... when you update the server side database with an idempotent HTTP method (let's say you DELETE a product listing), that change will, in many cases, also need to be reflected on the client side, hence the need for synchronisation.
@@byrongibby that DELETE request can either succeed or fail. If it fails then there's no need to update the current client state; if it succeeds then you can safely remove that data point from the client side. In either case navigating between "pages" will automatically load the latest database state anyway. Unless you want a 100% seamless realtime state view across all clients, in which case you would need to either use WebSockets and push out the latest data point upon edit/delete/new, OR use something like long polling OR server push etc. But this reload would be EXACTLY the same for SSR, the difference being that with SSR you would have to refresh the entire page, whereas with a SPA you would only re-fetch the changed data set. So actually it still wins out in terms of overhead.
@@watthedoodle I buy the argument that front end frameworks distribute the compute, making it a more easily scaled solution, but to the point you just made: if you're using something like HTMX you aren't doing full page reloads. On the question of state, it has been my experience that you have your server side database, but you also have an in-memory database on the client. The DOM/view is updated as the client database changes; that seems to be a solved problem. The issue as I see it comes from making sure the client side database and the server side database are kept consistent with each other. If you're saying that this is easy for you to do correctly that's fine, but I think a lot of people find it quite challenging.
@@byrongibby why would it be challenging? As long as the database is the SSOT (Single Source Of Truth), the client is simply the presentation of that state. The client also makes requests *to* the database in order to request mutations. The data flow is well established here; there is nothing to "synchronise", at least that's how I approach the design and data flow.
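To make the DELETE flow described a few replies up concrete, here's a minimal client-side sketch under those same assumptions (database as the single source of truth, client holding a local view of it); the endpoint, types and renderList() are hypothetical:

```ts
type Product = { id: number; name: string };

// Local client-side view of the product listing.
let products: Product[] = [];

async function deleteProduct(id: number): Promise<void> {
  const res = await fetch(`/api/products/${id}`, { method: "DELETE" });

  if (!res.ok) {
    // Failure: leave the client state untouched and surface the error.
    console.error(`Delete failed with status ${res.status}`);
    return;
  }

  // Success: drop the item locally instead of re-fetching the whole page.
  products = products.filter((p) => p.id !== id);
  renderList(products);
}

function renderList(items: Product[]): void {
  // Stand-in for whatever updates the DOM (framework or hand-rolled).
  console.log("rendering", items.length, "products");
}
```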
The claim that server side rendering of HTML doesn't scale is one of the dumbest, most pig-ignorant claims I have ever heard.
At scale and at high RPS it does not scale, sorry, that's a simple fact. SSR can mean having to re-render the entire page, and this includes all the conditional logic and looping structures etc. Perhaps on the smaller scale it's negligible, but it really, really isn't when you work at scale. And it turns out that if you invert and delegate the computation to the client side, that template computation goes to ZERO for the server side, and it's hard to beat zero.
@@watthedoodle How many are we talking about here? I need some valid numbers for that argument. Depending on the interaction it does not need to be fully server-side; it could just be for actions that only change data (yes, small JS libs can be used for interactivity), and at that point I would argue the lifting is done more on the database side, which is another topic in itself. On the other side, most large applications rely on microservices to run their API, and this is scalable. But most of the SSR applications mentioned do their scaling the same way, using pods, replicas, load-balancing, etc. Why can't we make the same argument? Either way, people seem to forget the DevOps side of things: there is no ZERO computation, you are always doing it somewhere, it does not REDUCE the amount of computation in any way. The example shown is of course on a small scale where everything is in one place, which should not be done in a large-scale application (still, how large are we talking?). Either way both are complicated, and I agree it should not be simplified.
@@zulfiqrysaadputra Because using client side rendering you wouldn't even **need** to scale the server side, that's entirely the point of CPU delegation.
@@watthedoodle it's not ZERO. You still need to "render" JSON. Do you have any numbers that prove this "simple fact"?
@@witek4212 only for a very few selected API calls, which would *only* send back the exact requested data, and the read-only data requests can be aggressively cached using Redis (rough sketch of that below). When doing SSR you need to pull the data AND also run the HTML rendering computation on top, not only for every page but also on every page change/update, which has an objectively higher computational requirement than just pulling the data (on a per-request basis).
TL;DR
SSR = Data fetching + whole page HTML rendering
vs
CSR = lazily fetching only the selected data when requested, which can return cached and partial data.
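A minimal sketch of the "aggressively cache the read-only data requests" idea, assuming Hono and ioredis purely for illustration; the key name, TTL and loadProducts() are all made up:

```ts
import { Hono } from "hono";
import Redis from "ioredis";

const app = new Hono();
const redis = new Redis(); // assumes a local Redis instance

// Stand-in for the real database query.
const loadProducts = async () => [{ id: 1, name: "Widget" }];

app.get("/api/products", async (c) => {
  const cached = await redis.get("products:list");
  if (cached) {
    // Served straight from the cache: no DB hit and no template work.
    return c.json(JSON.parse(cached));
  }

  const products = await loadProducts();
  await redis.set("products:list", JSON.stringify(products), "EX", 60); // cache for 60s
  return c.json(products);
});

export default app;
```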
Nothing can match the component ecosystem of JS frameworks. That alone is enough to not want to switch to server side rendering.
Soon someone will post a video to argue that is a great idea and we should all be just using that. Give me a break, where do those people come from?
Yes, good points. Let him try to create a table like an Excel sheet on the server.
"Single page applications are still the best option" - I'm afraid I have to disagree. I also disagree with Andrew, mind you. What you are describing is modern horizontal architecture. This separates the delivery of data from the delivery of the UI. In this case, HTML.
Literally nothing prevents a multipage application from being effective. SPAs are only required if you need the whole interface to stay within a single view, e.g. if you are attempting to provide a desktop-application type of experience.
If you are providing a multipage experience, please STOP using SPA frameworks. It's too much complexity for too little return. I have found in practice that multipage implementations using jQuery are faster to develop and provide a better experience than React or Angular in 80-90% of cases. Which is easy to understand when you realize that 80-90% of the use cases in the wild are designed as multipage and then jammed into a SPA.
Single page applications are peak web dev.
jQuery, or more specifically shitty jQuery, has cost the business I work for about $2,000,000 in developer pay fixing bugs over 10 years, and it's only 12 unique pages, with the home page, product listing, search and product page being the most complex.
Some members on my team were tasked with "updating" the existing product page built in 6000 lines of jQuery spaghetti. They had 3 months. It took 9 months until the defect list grew larger than the actual tasks, then 16 months until the project was scrapped and they were both fired. They handballed it to me, I rebuilt the page in 4 months using Angular. It's been live for 9 months without any critical issues after release.
Sure, you could make a nice enough webpage/site with HTML and jQuery, but I've never actually seen it in the wild. There are no standards for its use, so different developers do whatever they feel like at the time: some use the revealing module pattern if you're lucky, some global objects, most just free functions in a massive file, generally just hacking exponentially to make things work. setTimeouts everywhere to deal with race conditions, functionality that overwrites other functionality, events that are never unbound, except occasionally you will see `$el.off().on()`. It's hard to debug, so average developers just hack on whatever they need to get a story complete.
Modular, componentised code that follows single responsibility as much as possible is always going to be the best solution, but more importantly, having standardised practices for average developers to follow and documentation they can read is the biggest benefit of modern JS frameworks.
@@ivan.jeremic which is why everyone is so unhappy with them and the quality of most apps is terrible / rapidly declining?
SPAs are "Peak" in the same way that server-side MVC frameworks were "Peak" 3-Tier. Both are bad ideas that caught on before the industry could actually evaluate the efficacy. By the time anyone realized they were bad ideas, everyone was doing them.
I need a FE framework that brings food to the table!!!
You're wrong, but I respect your opinion.
That is the old way of doing it, but he is also not relying on anyone else... npm, etc., and not many can deal with that BS! I've never had an application that wasn't complex, so I prefer frameworks like Angular or Blazor.
Just pick a framework and stick with it. Raw HTML, CSS and JS is meh.
Right, I choose htmx 😂
Spaghetti code? 😂 Have you ever used a full stack framework template engine like Jinja2? Every component can be its own HTML file that can extend, include or append to a base HTML template (with iterations and conditionals). Spaghetti code is any front end framework boilerplate.
Yes, I have used Jinja2 as well as many other template engines; front end frameworks also use templates, in the form of composable render functions. So, with respect, your premise here simply doesn't hold.