The Drawback of Client Side Rendering

  • Published: 27 Dec 2024

Comments • 450

  • @willmakk
    @willmakk 4 years ago +737

    Your metaphors are next level.

    • @keith6293
      @keith6293 4 years ago +32

      here's your big mac.

    • @arturfil
      @arturfil 4 years ago

      Aaaah I see what you did there...

    • @dickheadrecs
      @dickheadrecs 3 years ago

      🎈 🏠 🎈

    • @alexcc316
      @alexcc316 2 years ago

      "you are not a karen"

  • @navneethsubramanya.8465
    @navneethsubramanya.8465 4 years ago +1351

    Ben's got 99 problems, but a girlfriend ain't one.

    • @phantomKE
      @phantomKE 4 years ago +40

      Bruh! He says it with a straight face

    • @navneethsubramanya.8465
      @navneethsubramanya.8465 4 years ago +20

      @@phantomKE I know right? He just leveled up with these jokes!

    • @abdulazizs824
      @abdulazizs824 4 years ago +2

      🤣🤣

    • @blipojones2114
      @blipojones2114 4 years ago +16

      jokes aside, he's not a bad looking dude at all. Getting jacked wouldn't hurt tho, as long as he doesn't convert to a life coach and talk about it non-stop like that other coder youtuber John Sonmez.

    • @malvoliosf
      @malvoliosf 4 years ago +8

      He has a girlfriend in Canada.

  • @programming2249
    @programming2249 4 years ago +453

    I avoid client-side rendering in order to save CPU cycles for cryptocurrency mining.

    • @r0ckinfirepower
      @r0ckinfirepower 4 years ago +5

      hahaha

    • @TechdubberStudios
      @TechdubberStudios 4 years ago +7

      hilarious comment! but crypto mining is an inefficient form of revenue on the client's computer; see The Pirate Bay (TPB) experiment.

    • @ezshroom
      @ezshroom 4 years ago +4

      @@TechdubberStudios It may pay less than ads, but it's many times better. I support websites that responsibly use cryptomining, and I block ads. Please, don't say that ads are better. They have never been any good to anybody's web browsing experience.
      Oh, and you can use cryptomining along with Arc, another earning method that does not involve ads.
      I'm done with Google's creepy trackers. Cryptocurrency mining is the future.

    • @TechdubberStudios
      @TechdubberStudios 4 years ago +3

      @@ezshroom I am genuinely 100% with you on the crypto movement. I hate ads. Always have hated them. But there are at least two or three big corporations that come to mind that were built on the ads business model; with crypto mining... I can't find one. And browser crypto mining is not exactly a new technology. I really want it to replace ads. I really do. I hate the pop-ups, spying and tracking that's going on. And the first corporation that comes to mind, when considering who should adopt the crypto model, would be Netflix, because the users stay on Netflix and binge-watch for hours and hours!

    • @TechdubberStudios
      @TechdubberStudios 4 years ago +2

      @@ezshroom also, do you happen to know any website/forum/subreddit focusing on browser-based mining? I would really like to join and dig in more into this subject.

  • @shaftsbury120
    @shaftsbury120 4 years ago +19

    Fantastic job explaining this! As always, the hilarious dry humor and "next level" metaphors help drive home points and keep things entertaining. Really helped clear up a bunch of stuff and get me pointed in the right direction. Many thanks!

  • @evans8245
    @evans8245 4 years ago +67

    solutions:
    0) pre-rendering with parcel or webpack
    1) server side rendering
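
    A minimal sketch of option 1, assuming Express plus react-dom/server; the App component, tags and port below are placeholders, not anything from the video:

      // Minimal server-side rendering sketch (App, titles and the port are placeholders).
      const express = require("express");
      const React = require("react");
      const { renderToString } = require("react-dom/server");

      // Placeholder component standing in for the real app.
      const App = ({ path }) =>
        React.createElement("h1", null, `Rendered on the server for ${path}`);

      const app = express();

      app.get("*", (req, res) => {
        // The markup is produced on the server, so crawlers see real HTML and meta tags.
        const markup = renderToString(React.createElement(App, { path: req.path }));
        res.send(`<!DOCTYPE html>
      <html>
        <head><meta property="og:title" content="Example title for ${req.path}" /></head>
        <body>
          <div id="root">${markup}</div>
          <script src="/bundle.js"></script>
        </body>
      </html>`);
      });

      app.listen(3000);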

    • @archmad
      @archmad 4 years ago +3

      your solutions are not client side rendering. he mentioned it.

  • @Silver_Knee
    @Silver_Knee 3 years ago +8

    I avoided serverside rendering a meta tag by registering a sub-domain, doing the serverside-rendering there and making my app only compatible with a set number of user-agents. Brilliant!

  • @diegogimbernat9253
    @diegogimbernat9253 4 years ago +238

    I love the tint on your glasses, it's serial killer-ish, where can i get a pair like those?

    • @bawad
      @bawad  4 years ago +372

      a package arrives at your door after the 3rd kill

    • @fev4
      @fev4 4 years ago +23

      @@bawad respect

    • @-Jakob-
      @-Jakob- 4 years ago +2

      They are the left-behinds after each kill. That's the way you get it.

    • @johnnamtae9610
      @johnnamtae9610 4 years ago

      @@bawad quick scope no scopes?

    • @flamendless
      @flamendless 3 years ago +1

      Those tints are wiped off blood from killing

  • @stephenyin3509
    @stephenyin3509 4 years ago +49

    Love the joke about girlfriend and client side rendering at the beginning

  • @PaulSebastianM
    @PaulSebastianM 4 years ago +129

    For your sake and ours, I hope you DON'T get a girlfriend too soon.

  • @BrotWurst
    @BrotWurst 1 year ago

    7:20 Would it be possible to use the same URL but check the header value on, for example, the nginx server?
    Like: is this user agent a bot (Twitter, FB, etc.)? => proxy to your slim API that returns only the metadata response,
    and if it's a real user (Mac, Windows, Chrome, Firefox user agent, etc.) => proxy to your real page / default response / SSR page.
    Maybe I'm forgetting something; I don't know if this could work.

  • @ufufu001
    @ufufu001 4 years ago +495

    the girlfriend problem might be solved if you stop walking around wearing asexual flag shirts

    • @SayWhat6187
      @SayWhat6187 4 years ago +18

      hahaha

    • @williamboshi1855
      @williamboshi1855 4 years ago +32

      lmao, good catch, respect

    • @travistrue2008
      @travistrue2008 4 years ago +12

      He's just playing hard to get. Karen gets it.

    • @johnyepthomi892
      @johnyepthomi892 4 years ago +1

      But with this if he ever gets one, she will be the right one. Lol

    • @ragnarok7976
      @ragnarok7976 3 years ago +1

      He does check a lot of aesthetic boxes from the virgin meme... Though I probably do too 😆

  • @stuartgreen5217
    @stuartgreen5217 4 years ago +62

    I guess u never heard of prerender.io
    been using it for years

    • @angry_moose94
      @angry_moose94 4 years ago +5

      Lol, just killed the whole argument. Never heard of it before. Just goes to show that tech is exponential. Wonder if it will cause the cosmic crash eventually.

    • @ayushkhanduri2384
      @ayushkhanduri2384 4 years ago +2

      yep yep yep , you just commented before me

    • @rodrigoabselcid
      @rodrigoabselcid 4 years ago +1

      ​@@ayushkhanduri2384 same case, I just searched prerender before I add a comment about it to check if it's already mentioned and here it was.

    • @Saurabhandsonu1994
      @Saurabhandsonu1994 4 years ago +1

      🤯

  • @gelismissuriyeli4440
    @gelismissuriyeli4440 2 years ago

    This was the best explanation video I've seen on the matter... Kudos to you Mister...

  • @arpowers
    @arpowers 4 years ago +30

    I love how Ben roasts Angular devs. I thought of that carrot farmer line off and on all day and cracked up every time.

  • @TimeoutMegagameplays
    @TimeoutMegagameplays 4 years ago +61

    Solution: NextJS, Angular Universal, Nuxt, etc.

    • @jpsimons
      @jpsimons 4 years ago +2

      Also check out the create-exact-app npm package (that's exact, not react). Like NextJS but with an Express-forward design: full control at the server-side level of what's going on.

    • @brandon.duffany
      @brandon.duffany 4 years ago +3

      @@jpsimons Just FYI, next.js also gives you full server side control. You can just run next as a library within an express server. In my experience, it's super ergonomic while preserving the state-of-the-art benefits of next (code splitting, automatic static optimization, incremental static generation, etc.). Having said that, I have not yet checked out create-exact-app, and am not sure how it differs from nextjs.

    • @angshu7589
      @angshu7589 4 years ago +3

      Why do I not like the sound of Angular Universal?

    • @milorad3232
      @milorad3232 4 years ago +3

      @@angshu7589 because you are not a carrot farmer. Although the color of your profile picture kinda resembles a carrot :D

    • @TimeoutMegagameplays
      @TimeoutMegagameplays 4 years ago

      @Adithya R Svelte Sapper is still in early development. I love Svelte, but Sapper is still far away from production-ready

  • @johnnietirado6131
    @johnnietirado6131 4 years ago +1

    Had this issue with a MeteorJS website running ReactJS on the client side. What we did was create a crawler (I think there is an npm project for this) that would go to every page, render it, and save the result in our DB. When a non-human (Google) accessed the site, it would be served this rendered HTML, making it SEO friendly. Basically our server would use the User-Agent to decide what type of content got served. Hope this helps.
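
    A rough sketch of the serving half of that approach, assuming the crawler has already stored rendered HTML keyed by path; getRenderedPage, the bot regex and the port are hypothetical:

      // Sketch of serving pre-rendered HTML to bots by User-Agent (names are hypothetical).
      const express = require("express");
      const path = require("path");
      const app = express();

      const BOT_UA = /facebookexternalhit|twitterbot|googlebot|linkedinbot|slackbot/i;

      // Stand-in for looking up HTML that the crawler rendered and saved earlier.
      async function getRenderedPage(pagePath) {
        // e.g. const doc = await db.collection("prerendered").findOne({ path: pagePath });
        // return doc && doc.html;
        return null;
      }

      app.use(async (req, res, next) => {
        if (BOT_UA.test(req.headers["user-agent"] || "")) {
          const html = await getRenderedPage(req.path);
          if (html) return res.send(html); // bots get the stored, SEO-friendly HTML
        }
        next(); // humans fall through to the normal client-side rendered app
      });

      app.use(express.static("build"));
      app.get("*", (_req, res) => res.sendFile(path.resolve("build", "index.html")));

      app.listen(3000);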

    • @rajatrao5632
      @rajatrao5632 2 years ago

      Can you please explain what you mean by 'render it and save it in the DB'? Do you mean render the DOM elements, attach them to the HTML, store that somewhere and put the link in the DB? Or what exactly are you storing in the DB? If that's the case, wouldn't it be too much for dynamic pages like /rob/photos, /sam/photos and so on to be stored in the DB, or am I missing something?

  • @laenprogrammation
    @laenprogrammation 4 years ago +3

    There is a workaround: just add conditional meta tags in the small server that builds your page. You can still use client-side rendering for everything except the meta tags.

  • @Rssks
    @Rssks 4 years ago

    2:05 I do it this way:
    The server serves a response for parsers (meta, og, schema, JSON-LD and plain HTML content), and then the JS comes along, structures it up and takes over routing from that point, so when you navigate you don't actually "refresh".

  • @waynevanson277
    @waynevanson277 4 years ago +2

    A solution to your problem could be to build a single-page application, with each endpoint for the app being pre-rendered.
    It's basically Jamstack. Once a user loads one page, the others do not need to be loaded.
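
    A small sketch of that pre-rendering idea, assuming React and a fixed route list; the App component, routes and output folder are made up:

      // Build-time pre-rendering sketch: write one HTML file per route (Jamstack-style).
      const fs = require("fs");
      const path = require("path");
      const React = require("react");
      const { renderToString } = require("react-dom/server");

      // Hypothetical stand-ins for the real app and its routes.
      const App = ({ route }) => React.createElement("div", null, `Page: ${route}`);
      const routes = ["/", "/about", "/pricing"];

      for (const route of routes) {
        const body = renderToString(React.createElement(App, { route }));
        const html = `<!DOCTYPE html><html><head><meta property="og:title" content="My site ${route}" /></head><body><div id="root">${body}</div><script src="/bundle.js"></script></body></html>`;
        const outDir = path.join("dist", route);
        fs.mkdirSync(outDir, { recursive: true });
        fs.writeFileSync(path.join(outDir, "index.html"), html); // served statically by the host/CDN
      }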

  • @josephlardner-burke9400
    @josephlardner-burke9400 4 years ago

    Why wouldn’t you just serve this from the server on a different port? As in, send the result as a response when someone requests that route?

  • @vorname1485
    @vorname1485 4 years ago +1

    Why do you want another domain for that? The server that delivers the app's index.html could also deliver the meta response instead, based on the user agent. One problem in both setups (separate service or combined) is that the bots might check whether the content they see is different from what a regular user gets. Whether any of them actually does that, I don't know, but I would check that possibility.

  • @cedric_lfbr
    @cedric_lfbr 4 years ago +2

    I had to do that once, I used a Lambda function since it was hosted on AWS, and the function intercepts the CloudFront distribution request and updates the HTML if the request comes from a robot, adding the OpenGraph tags.
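
    A rough sketch of how such an edge function might look, simplified to a viewer-request Lambda@Edge handler that answers bots with a tiny OG-only page rather than editing the origin HTML; the bot regex and tag values are placeholders:

      // Rough Lambda@Edge (viewer-request) sketch: answer known bots with a tiny OG-only page.
      // The bot regex and tag values are placeholders, not the commenter's actual setup.
      const BOT_UA = /facebookexternalhit|twitterbot|linkedinbot|slackbot/i;

      exports.handler = async (event) => {
        const request = event.Records[0].cf.request;
        const uaHeader = request.headers["user-agent"];
        const ua = uaHeader && uaHeader[0] ? uaHeader[0].value : "";

        if (BOT_UA.test(ua)) {
          // Generate the response at the edge instead of forwarding to the origin.
          return {
            status: "200",
            statusDescription: "OK",
            headers: { "content-type": [{ key: "Content-Type", value: "text/html" }] },
            body: `<!DOCTYPE html><html><head><meta property="og:title" content="Example title for ${request.uri}" /><meta property="og:description" content="Example description" /></head><body></body></html>`,
          };
        }
        return request; // humans continue to CloudFront / the origin as usual
      };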

  • @PaulSebastianM
    @PaulSebastianM 4 years ago +1

    If you manage the web server, you could use the web server's router to do the same exact hack you described without the need for a different subdomain, just a route that checks the user-agent of the client and returns different HTML based on it.

  • @DK-ox7ze
    @DK-ox7ze 3 years ago

    Nice post. However, I am curious why these FB bots can't read a client-rendered app. I am sure these bots ultimately render the app in a web engine/browser, because they need to generate a preview of the rendered content. The static HTML will also most likely load some resources like CSS and JS over the network (unless the bot expects inlined CSS/JS), so it's not going to be an instant preview either way. So maybe the limitation is not technical but functional, in that the bot is probably not willing to wait too long for the page to load (which is the case with most client-rendered apps)?

  • @jimchapman4579
    @jimchapman4579 1 year ago +1

    Great video! I use Laravel on the server side to serve up everything: static HTML pages, React apps, or a combo of both. It's easy to embed a React app within a .blade template file. Meanwhile Laravel takes care of everything else, like API services, user registration and authentication, etc. Best of both worlds.

  • @kierangill4967
    @kierangill4967 4 years ago +4

    Sapper + svelte gives you the best of both worlds

  • @samsonbrody6308
    @samsonbrody6308 3 years ago

    Great video. Trying to wrap my head around server side rendering and this video definitely helped

  • @shriharis1498
    @shriharis1498 4 years ago +4

    Sorry I'm new here, and I've noticed that Ben clearly hates Angular.
    Can someone give a quick background please???

  • @tonylion2680
    @tonylion2680 4 years ago

    I've been watching your videos and yes, the quality of the content is always awesome. New subscriber!

  • @maasteeve
    @maasteeve 4 years ago

    The problem I see with the magic link is that when a user just copies the URL instead of pressing a share button to get the special share.* link, there will be no meta tags; only when they use the special share.* link are the meta tags there. And I think most users will probably just copy the URL they are on instead of looking for a share button.

  • @nileriversoftware4070
    @nileriversoftware4070 4 years ago

    You ever tried NextJS? It does SSR for the initial request (b/c it could be a bot), but CSR when you click links.

  • @Jamiered18
    @Jamiered18 4 years ago

    I did this in a similar way, except the share-link page didn't check any user-agent headers. It just featured a JavaScript redirect to the normal URL.
    Bots don't run JavaScript, so they don't redirect and get the meta tags they need. Users do run JS and get redirected to the main SPA. And, for good measure, a fallback clickable link shows on the share page in case the redirect is blocked.
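
    A sketch of such a share page, assuming Express; the route, the lookup and the target URL are made up:

      // Sketch of a share page that needs no user-agent sniffing (route and URLs are made up).
      const express = require("express");
      const app = express();

      app.get("/share/:id", (req, res) => {
        // Hypothetical lookup of whatever the preview needs.
        const page = { title: `Item ${req.params.id}`, image: "https://example.com/og.png" };
        const target = `https://example.com/items/${req.params.id}`;

        res.send(`<!DOCTYPE html>
      <html>
        <head>
          <meta property="og:title" content="${page.title}" />
          <meta property="og:image" content="${page.image}" />
          <script>window.location.replace(${JSON.stringify(target)});</script>
        </head>
        <body>
          <!-- Bots don't run the script, so they only read the meta tags above. -->
          <a href="${target}">Continue to the page</a>
        </body>
      </html>`);
      });

      app.listen(3000);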

  • @alldecentnamestaken
    @alldecentnamestaken 4 years ago +3

    "It's like I spent a bunch of time building a house and now I want that house to fly." LMAO

  • @Manivelarino
    @Manivelarino 4 years ago +2

    I just put variables like %PAGE_NAME% and %PAGE_IMAGE% in my meta tags and replace them while serving the page with Express. It doesn't work during client-side routing, but it works for link previews.
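
    A minimal sketch of that placeholder swap, assuming Express and a CRA-style build folder; the paths and the meta lookup are assumptions:

      // Sketch of the %PLACEHOLDER% swap (paths and the meta lookup are assumptions).
      const fs = require("fs");
      const express = require("express");
      const app = express();

      // build/index.html contains e.g. <meta property="og:title" content="%PAGE_NAME%" />
      const template = fs.readFileSync("build/index.html", "utf8");

      app.use(express.static("build", { index: false })); // JS/CSS assets untouched

      app.get("*", (req, res) => {
        const meta = { name: `My app ${req.path}`, image: "https://example.com/og.png" };
        res.send(
          template
            .replace(/%PAGE_NAME%/g, meta.name)
            .replace(/%PAGE_IMAGE%/g, meta.image)
        );
      });

      app.listen(3000);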

  • @DubstepRS
    @DubstepRS 4 years ago +1

    The preview still won't work when users copy paste the link directly from the browser url bar

  • @JohnEricOrolfo
    @JohnEricOrolfo 3 years ago

    my solution for this is:
    1. set a Lambda as the index
    2. on the index Lambda, run a headless tool (like headless Chrome) to render the page if the request comes from a bot; otherwise just serve the raw JS

  • @arthurbruel5545
    @arthurbruel5545 4 years ago

    If the only thing that needs to change is the meta tags (not the rendered bits), you can also modify the html before returning it to the client, inserting the relevant meta tags.
    It will probably lead to performance problems, but you could also perform an if condition on the referrer of the request to determine if you should perform such modifications.

  • @gilliangoud
    @gilliangoud 4 years ago

    kinda curious what the increased load would be if you'd do this on the normal domain already -> check the user agent... Would eliminate cdn usage tho... so maybe not foolproof.

  • @archmad
    @archmad 4 years ago

    I had similar issue. good thing you found a better solution.

  • @ryszard3756
    @ryszard3756 4 years ago

    Hello Ben, Have You thought about differences between react-snap and NextJS/Gatsby from SEO perspective? I mean is there reason to use NextJS instead just react-snap to get better results in search engines? Does NextJS/Gatsby do something extra to perform better in SEO? Regards

  • @0dyss3us51
    @0dyss3us51 4 years ago +1

    You are hilarious and informative my dude haha, relatable. And damn dude, the length of your link

  • @rishabhrathod888
    @rishabhrathod888 4 years ago

    For some weird cases like mine, where only /particularRoute needs to work like SSR, what I actually tried was hosting a Gatsby project under a particular route of the CRA project, and it worked; I just needed to handle a few re-routing cases.

  • @ytlongbeach
    @ytlongbeach 4 years ago +1

    Of course -- I followed and understood all the way to the end. This is because I'm an unemployed ex-Tech Lead [who has never worked at a FANG company], and a thousandaire.

  • @AndreiNedelus
    @AndreiNedelus 4 years ago +2

    Hey, I saw Wes Bos in one of his videos use cloud functions to generate the preview, and Puppeteer I guess to take a screenshot of the URL

  • @dawid_dahl
    @dawid_dahl 4 years ago +1

    This channel is slowly becoming one of my favorites on YouTube! 😄

  • @rickyu1978
    @rickyu1978 4 years ago

    Looking for exactly this solution. Was thinking of using aws api gateway to check the routes and then redirect based on headers. Social og links are super important for shareability..

  • @sebastienringrose5440
    @sebastienringrose5440 4 years ago

    Yo Ben, just saw this video so my comment may be irrelevant by now but for a simpler solution could you just wrap your existing create-react-app codebase in an express app then serve the CRA to actual users and generate html for FaceBook bots etc all from one codebase? I imagine your express app would be super small so wouldn't be adding much weight to your existing codebase. You could then build your API into that express app too and treat it as a monorepo.
    I do see that this would completely negate the benefits of serving static sites via gh pages or Netlify but if you've got an active API anyways is it a large extra load for that API to serve your static assets as well? If you're using Lambda functions then fine, you win.
    Interested to hear your thoughts. Cheers :)

    • @bawad
      @bawad  4 years ago

      I could do it that way, but I only needed it for one specific thing, so I wanted to try keeping the benefits of using a CDN

    • @sebastienringrose5440
      @sebastienringrose5440 4 years ago

      Ben Awad yeah fair. Maybe worth considering if you want to make it more static/ssr’ed in future but by then you might consider the full next.js recode that you discussed in the vid.

  • @NLNTZ
    @NLNTZ 4 years ago +3

    I think your points on pre-rendering are slightly off. Not mentioning tools like Gatsby, Scully, or Gridsome is a miss as they can be used to render those dynamic routes you say cannot be pre-rendered. It's worth mentioning that JAMStack options are becoming really incredible for developers and give you the same benefits as server side rendering out of the box.

  • @vutesaqu
    @vutesaqu 4 years ago

    Could you use something like Pug to generate an HTML file like the normal one that just contains the meta tag links, and just send that instead of the normal blank HTML (instead of having to differ by client)?

  • @salshouts
    @salshouts 4 years ago

    This helped me a lot! I am working on a project and my backend was almost finished. I was using create-react-app with router but switched over to next.js! Thanks a lot

  • @victorbjorklund
    @victorbjorklund 4 years ago

    Nice solution. The only downside I guess would be that Google wants you to show the same content to their bot as to the user. But it probably doesn't matter if you don't want to index those pages.

  • @squashtomato
    @squashtomato 3 years ago +1

    I think you could easily do this in .NET Core. In the Startup class, in the routing configuration, you could filter each route with the correct meta tags. You could make this an extension method and bing bang bosh, neat tidy job done

  • @Sid91
    @Sid91 4 years ago

    I wonder how you'll feel about Blazor. ASP.NET Core released Blazor in order to do everything in C# and replace JavaScript entirely (I just started learning about it, but that sounds like what they're doing; could be wrong)

  • @samislam2746
    @samislam2746 4 years ago

    You can write normal HTML with some normal CSS just to make the preview template; imagine that this is the loading page, and only once the client-side rendering is ready do you show its content on top of that temporary base HTML template.
    I've never done client-side rendering before, and I've never used React, but I think this should work. Why are you saying that it's not possible?

  • @tunyaa
    @tunyaa 4 years ago

    netlify has a free experimental feature called pre-rendering, for me, it works with Facebook, it parses the right meta tags automatically with pictures also. My content comes from a backend via graphql and apollo. meta is being set with react helmet, the page is handled by react-router, and it's a create react app project. Hope this helps. You can also do prerendering very very easily with react-snap package, but you need to rebuild when data changes. (PS. Thanks for your work, I really like your videos)

  • @jvcmarc
    @jvcmarc 4 years ago

    this is actually the first time I understood the difference between client side and server side rendering

  • @zBrain0
    @zBrain0 2 years ago

    This video is 2 years old, and I don't know if it existed when you made it, but today there is a tool called Puppeteer that uses a headless Chrome browser to actually pre-render your HTML, which you can feed to bots while still feeding the client-side rendered version to humans

  • @BribedStudios
    @BribedStudios 4 years ago +7

    Nice GatsbyJS colorway on that shirt 🤙

  • @mihaijiplea8371
    @mihaijiplea8371 4 years ago

    I think one issue is that most link-preview code nowadays is kind of stuck in a pre-single-page-application world. Apart from making a curl request to the website and getting the HTML, the preview feature should also render it and stop assuming everybody is using server-side rendering. This does sound like it gets more into the "build a Google-style scraper" situation, but if we're talking about a client-side rendered website trying to create a preview for another client-side rendered website, all the rendering is happening on the client, so the server doesn't take much load at all.

  • @nilanjanmitra7459
    @nilanjanmitra7459 4 years ago

    I use EJS and it allows for variables to be passed before sending the HTML to the client, so that can allow you to change the values in the meta tags.
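
    A minimal EJS sketch of that idea; the view name and the data passed to it are made up:

      // Minimal EJS sketch (the view name and data are made up).
      // views/index.ejs would contain, among the rest of the page:
      //   <meta property="og:title" content="<%= title %>" />
      //   <meta property="og:image" content="<%= image %>" />
      const express = require("express");
      const app = express();

      app.set("view engine", "ejs");

      app.get("/posts/:id", (req, res) => {
        // The variables are filled in on the server before the HTML is sent.
        res.render("index", {
          title: `Post ${req.params.id}`,
          image: "https://example.com/og.png",
        });
      });

      app.listen(3000);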

  • @LiEnby
    @LiEnby 3 years ago

    There are benefits to client side rendering?
    Btw, rather than checking the user agent, just include a client-side redirect: location = "whatever.com".
    Because the bots don't run JavaScript they won't be redirected, but users will

  • @iamrohandatta
    @iamrohandatta 4 years ago

    You can use the header trick as you mentioned, and then simply use something like puppeteer to load the page on the server itself and then send the rendered HTML page to the client. If the header says it's from a normal user, then don't go to puppeteer, just return your usual index.html file.
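
    A rough sketch of that flow with Express and Puppeteer (no caching shown); the bot regex and the port are assumptions:

      // Sketch of on-demand rendering with Puppeteer for bot user agents (no caching shown).
      const express = require("express");
      const path = require("path");
      const puppeteer = require("puppeteer");
      const app = express();

      const BOT_UA = /facebookexternalhit|twitterbot|googlebot|slackbot/i;

      app.use(async (req, res, next) => {
        if (!BOT_UA.test(req.headers["user-agent"] || "")) return next();

        // Let headless Chrome execute the SPA, then hand the finished HTML to the bot.
        const browser = await puppeteer.launch();
        const page = await browser.newPage();
        await page.goto(`http://localhost:3000${req.originalUrl}`, { waitUntil: "networkidle0" });
        const html = await page.content();
        await browser.close();
        res.send(html);
      });

      // Normal users just get the usual static index.html and client-side bundle.
      app.use(express.static("build"));
      app.get("*", (_req, res) => res.sendFile(path.resolve("build", "index.html")));

      app.listen(3000);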

    • @iamrohandatta
      @iamrohandatta 4 years ago

      I got this idea from here: ruclips.net/video/lhZOFUY1weo/видео.html

  • @maxcantube
    @maxcantube 4 years ago +1

    Going through a similar issue myself. My static site is hosted on S3 / CloudFront, orchestrated by terraform. My plan is to use CloudFront Origin Response triggers to trigger a lambda function to add the correct open graph tags to the response. I think this is the lightest weight option.

    • @wanjohi
      @wanjohi 2 years ago

      Did you do it? I'm exploring some of these ideas :)

  • @nehtals
    @nehtals 4 years ago

    Would the Twitter/FB crawler follow a redirect? Or would it just grab the metadata and come back? In the latter case the user-agent check wouldn't be required. Also, your users are going to grab the URL from the browser bar and probably not use the share link you made special for them

  • @roshanican504
    @roshanican504 3 years ago

    You make my day better

  • @dncube
    @dncube 4 years ago +4

    Hey Ben,
    I'm starting a new react web app for something like delivery.com and doordash.com where SEO is a major thing.
    Like rich structured snippets of stars etc in google results.
    Is SSR the only option?

    • @temirzhanyussupov6997
      @temirzhanyussupov6997 4 years ago

      If your content could be pre-generated (for example, blog), one of the options would be to use Gatsby.js

    • @bawad
      @bawad  4 years ago

      yeah

    • @dncube
      @dncube 4 years ago

      In that case Gatsby or NextJS or any other you recommend?
      I don't know if Gatsby will do but their web.dev score hits 💯 so was thinking that but in a dilemma

    • @bawad
      @bawad  4 years ago +2

      Next.js

    • @MrFckingninja
      @MrFckingninja 4 years ago

      @@bawad do you know an easy way to migrate from create react app to next.js? I've found problems with REDUX, svg images and react router active route (NavLink). Great video btw ;)))

  • @frankyb702
    @frankyb702 4 years ago +1

    Can't this be solved by a reverse proxy like NGINX, just caching the SSR-rendered page but only serving it to bots?
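
    For illustration, a tiny Node proxy (using the http-proxy package) sketches the same User-Agent split the comment describes for NGINX; the upstream ports and bot list are made up:

      // Node stand-in for the nginx idea: proxy bots to an SSR upstream, humans to the SPA.
      // Upstream ports and the bot list are made up; nginx can do the same with map + proxy_pass.
      const http = require("http");
      const httpProxy = require("http-proxy"); // npm package: http-proxy

      const proxy = httpProxy.createProxyServer({});
      const BOT_UA = /facebookexternalhit|twitterbot|googlebot|slackbot/i;

      http
        .createServer((req, res) => {
          const target = BOT_UA.test(req.headers["user-agent"] || "")
            ? "http://localhost:4000" // SSR / pre-rendered pages for bots
            : "http://localhost:3000"; // static SPA for humans
          proxy.web(req, res, { target }, () => {
            res.writeHead(502);
            res.end("Bad gateway");
          });
        })
        .listen(8080);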

    • @ryanleaf2740
      @ryanleaf2740 4 years ago

      That's what I was thinking. Maybe something like this to handle the routing internally to an SSR-generated page: www.nginx.com/resources/wiki/modules/user_agent/
      or this: serverfault.com/questions/775463/nginx-redirect-based-on-user-agent
      I'm sure there are other reverse proxies that support this, but this doesn't seem to be the most portable solution.
      Edit: added an example of a similar implementation from ServerFault

  • @Colonel1954Dz
    @Colonel1954Dz 4 years ago

    Well, a simpler but similar way would be to always server-side generate the index.html but only change the meta data, while letting index.js generate the actual content.
    This wouldn't hurt human users and is pretty light on the server (only the entry-point HTML meta tags need to be generated depending on the link; the rest of the rendering and navigation is done client side)

  • @mortezatourani7772
    @mortezatourani7772 4 years ago

    I liked the idea; however, I think it is still good to have SSR for all users, or maybe for SEO as well.
    Would you share your ideas on uFrontends (micro-frontends), too? Are you preparing some sort of tutorial on that?

  • @EPIC-w6k
    @EPIC-w6k 4 years ago +2

    Woah! My self esteem skyrocketed because I managed to keep up with you until the end :D Aside from that, your content is top notch, keep it coming man.

  • @alexsilny5748
    @alexsilny5748 4 years ago

    I had the same issue last week and also was thinking about moving to nextjs, but having a separate domain and server makes a lot more sense.

    • @happysloth91
      @happysloth91 4 years ago

      well Next js or any other SSR solutions doesn't mean you're gonna use one server for the backend and the front-end.

  • @DevinRhode2
    @DevinRhode2 4 years ago

    The pre-built html approach could work. If someone tries to visit a url while something is still building, you could serve up a special "page is still being built" page... one benefit here could be that, in 20 years when a website is taken down, you have all this pre-built static html, and it's very cheap (probably free) to host that.

    • @DevinRhode2
      @DevinRhode2 4 years ago

      Or when there's another sea-change with technology, MAYBE it's easier to migrate to something new, but I'm not sure, would depend on the specific situation.

  • @YoungGrizzly
    @YoungGrizzly 4 years ago

    I'm not a fan of server side unless it's to generate an HTML-based email. It just seems so cluttered when doing work. Even the emails I generate on the server side are dreadful to work with, but bearable since there aren't many.

  • @flamehiro
    @flamehiro 4 years ago

    o-o anything wrong with just using twig/blade and just use vue compiled version when you want shiz to update in real time?

  • @asiraky
    @asiraky 4 years ago

    I use client side rendering, and am able to return different HTML per page from the server. This is only a problem if you have a single, static, index.html.

  • @brendanhart1717
    @brendanhart1717 4 years ago

    I wonder how this solution for link previews would affect SEO, would Google consider this cloaking since one version of the page is displayed to bots and another to humans?

    • @bawad
      @bawad  4 years ago

      developers.google.com/search/docs/guides/dynamic-rendering

    • @brendanhart1717
      @brendanhart1717 4 years ago

      @@bawad thanks Ben. I guess my concern was that Google might flag the difference between redirecting to the original URL for a human and not for a bot, more than it being an issue with dynamic rendering in general :)

  • @leno__jeno
    @leno__jeno 4 years ago

    Doesn't Netlify have the option to pre render single page applications for bots? (Originally for a better SEO)
    I'm not sure about this because I have never used this feature before, but shouldn't it also be able to solve this problem?

  • @davidturner7192
    @davidturner7192 4 years ago

    If you decided to leave yourself open to refactoring your current solution to a server-side solution, will the performance hit be noticeable to the end-user?

    • @georgeokello8620
      @georgeokello8620 2 years ago

      With SSR, no. The data management and computation happen at the server level. The component's responsibility at the client level is dynamic UI rendering, not managing API calls. Also, client devices do not have to be burdened with excessively large loads of JavaScript that they must allocate memory and CPU to execute, which adds latency, especially on mobile devices. From an SSR perspective, components that contain the fetched data are present and readily available to client devices without spending computational resources on executing large JS bundles.

  • @gunjanrajtiwari1435
    @gunjanrajtiwari1435 4 years ago

    how to go back and forward if we are doing client side rendering?

  • @deepdivedevs100
    @deepdivedevs100 4 years ago

    I think we can use Helmet with CRA too, by generating the meta data on the server side

  • @dtpietrzak
    @dtpietrzak 3 years ago

    I made my "yarn deploy" script change React's build/index.html to build/index.php; then it throws in a little PHP script that grabs some info from the good ole' MongoDB and echoes the meta tags into the <head>, only when that URL is directly accessed. (You can get the URL params in PHP.) It's only a few lines of code. The hardest part was figuring out how to configure Nginx's reverse proxy to render PHP. O_O

  • @russellabraham9208
    @russellabraham9208 4 years ago

    Isn't this what client side routing or search queries are for? On document load before the window. Try some JSON data from thread, clone or populate a document fragment with your html and finally render the fragment using the query parameter and its prefix/suffix..

  • @cauebahia
    @cauebahia 4 years ago

    It works! I do exactly that with my react web SPAs. I use firebase and cloud functions to detect user agents and serve SSR version on the fly to robots and CSR version to users. This is also important to SEO indexing, cause some robots won't run any JS and expect html-only responses. Really enjoy your videos.
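
    A sketch of the shape such a function might take; renderApp and the bot regex are hypothetical stand-ins, and a Firebase Hosting rewrite is assumed to route requests to the function:

      // Sketch of the Firebase approach; renderApp is a hypothetical SSR helper and a
      // Hosting rewrite is assumed to point all routes at this function.
      const functions = require("firebase-functions");
      const path = require("path");

      const BOT_UA = /facebookexternalhit|twitterbot|googlebot|slackbot/i;

      // Stand-in for real server-side rendering of the React app.
      async function renderApp(url) {
        return `<!DOCTYPE html><html><head><meta property="og:title" content="Example title for ${url}" /></head><body><div id="root">server-rendered markup</div></body></html>`;
      }

      exports.app = functions.https.onRequest(async (req, res) => {
        if (BOT_UA.test(req.headers["user-agent"] || "")) {
          res.set("Cache-Control", "public, max-age=300");
          res.send(await renderApp(req.originalUrl)); // robots get an HTML-only response
        } else {
          res.sendFile(path.join(__dirname, "public", "index.html")); // users get the CSR shell
        }
      });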

    • @leisiyox
      @leisiyox 4 years ago

      What about some prerender.io ?

    • @cauebahia
      @cauebahia 4 years ago

      @@leisiyox I thought about using it, but never tried it. Don't know how well it works. It would also cost more than my current firebase cloud function solution.

    • @leisiyox
      @leisiyox 4 years ago

      ​@@cauebahia what are the conditions that you recommend using firebase?
      I thought about using it, but I'm seeking guidance

    • @cauebahia
      @cauebahia 4 years ago

      @@leisiyox I like that they integrate lots of services in a single solution. When you create a firebase project, you instantly have access to file storage, hosting, database, authentication, and some other stuff that makes it really easy. I also like that Firestore has real-time listeners for your data. Really good for a client side rendered app. Also really like their documentation and the fact that you can easily access other Google Cloud services and API. There are many videos online about it. Check it out.

  • @CarlosMartinezTech
    @CarlosMartinezTech 4 years ago

    I like how you explain, well done. Thank you for the quality content.

  • @kirasmith1147
    @kirasmith1147 4 years ago

    I just used a Node.js Express server to host the compiled create-react-app; this way you can modify the page and add meta tags if needed before serving it. Sort of a mix of server- and client-side rendering, as he said.

  • @HappyCheeryChap
    @HappyCheeryChap 4 years ago

    Detecting the user agent and serving different content is called "dynamic rendering": developers.google.com/search/docs/guides/dynamic-rendering
    I've actually been wondering about this lately... whether to use Next.js or set up dynamic rendering so that only bots get the SSR version.
    But I don't get why you're talking about adding the "share." subdomain to serve different content? Why not just use dynamic rendering on the regular URLs?

  • @ekaansh
    @ekaansh 4 years ago

    has there been a followup to this? video/repo?

  • @uskro
    @uskro 4 years ago

    Facebook's user agent is there for the facebook app browser as well.

  • @alirezvani9149
    @alirezvani9149 3 years ago

    Now you can make static pages for your dynamic, frequently updated pages with Next.js. How it works is that it looks at the requested page: if it was generated at build time, it sends it back, and if it wasn't, it builds it on the fly (at run time) and adds it to the built pages for the next request. Pretty amazing and game changing!

  • @RoiTrigerman
    @RoiTrigerman 4 years ago

    Your solution is nice, but people copy the URL from the browser and paste it where they wanna share it, and that won't work (the link will not have the "share" subdomain).
    Or am I missing something?
    Anyway, I use Angular, and it has server-side rendering, so I think I will just use that. You can also just switch to Angular... I know you like it

  • @pedroserapio8075
    @pedroserapio8075 4 years ago

    Gatsby also solves the React single-page problem, since we can generate all the individual HTML, CSS, and JS pages.

  • @r0ckinfirepower
    @r0ckinfirepower 4 years ago

    I wonder if doing something like this create-react-app.dev/docs/title-and-meta-tags#generating-dynamic-meta-tags-on-the-server would work for this use case and maybe solve this issue of having to check if the request is coming from a human or a bot.

  • @mbaneshi
    @mbaneshi 4 years ago

    How about mixing the Django template system with an SPA framework like Vue? What do you think?

  • @DevinRhode2
    @DevinRhode2 4 years ago

    I'm sure someone has already pointed out, but if someone copies the url from the browser, they won't have the `share.` prefix. Probably need a global UA interceptor that results in pages being rendered server-side.
    Year 2030: Browsers introduce a feature to request server-rendered pages by pretending to be a facebook/twitter bot.

  • @DrkGrntt
    @DrkGrntt 4 years ago

    At the end you said "In an ideal world, you would have picked the ideal tool to use in the first place." What would you consider the ideal tool in your case?

  • @SKCodesForFun
    @SKCodesForFun 4 years ago

    Guys I am not a Web developer so excuse me if I say something stupid, but why can't you just use the javascript to render a meta tag based on what route is invoked? Do link previews only parse the html without running any of the scripts on the page?

    • @adrianobonano
      @adrianobonano 4 years ago

      Because of the way those websites scrape your content: they are not fully rendering the page the way your browser does, just reading what the server responds with, i.e. the plain HTML of an SPA.

  • @dremiq6670
    @dremiq6670 2 years ago

    3:17 thanks for explaining this here lol, i was like why wouldn't you be able to just stick the meta tags in the requested html

  • @LuXxenatorX
    @LuXxenatorX 4 years ago

    hey, can you talk about heroku, hasura and the likes? all this cloud tooling seems like the way to go to become a one full stack pony show, but I would love your opinion on it, thanks!

  • @dovh49
    @dovh49 4 years ago

    Another option would be to have a CDN, and when you get content updates, just render that single page to the CDN. Then you aren't having to re-render all the time. It would get more complex than that of course, since you would need to re-render everything if you make a style change to your website. And it would be public, unless you pay for something to make it private and have the CDN only return a partial page for bots.
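
    A sketch of that update-time render, assuming S3 as the CDN origin; the bucket, the component and the trigger are made up:

      // Sketch of re-rendering one page and pushing it to the CDN's origin bucket whenever
      // its content changes (the bucket, component and trigger are made up).
      const AWS = require("aws-sdk");
      const React = require("react");
      const { renderToString } = require("react-dom/server");

      const s3 = new AWS.S3();
      const Post = ({ post }) => React.createElement("article", null, post.body);

      async function publishPost(post) {
        const body = renderToString(React.createElement(Post, { post }));
        const html = `<!DOCTYPE html><html><head><meta property="og:title" content="${post.title}" /></head><body><div id="root">${body}</div></body></html>`;

        // Only this one page is re-rendered and re-uploaded; the CDN keeps serving it statically.
        await s3
          .putObject({
            Bucket: "my-site-origin",
            Key: `posts/${post.id}/index.html`,
            Body: html,
            ContentType: "text/html",
          })
          .promise();
      }

      // Call publishPost(...) from whatever handles content updates (CMS webhook, admin API, etc.).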

  • @SmujMaiku
    @SmujMaiku 4 years ago

    I had this problem once, but my focus was on crawlers. I ended up using some PHP to "render" the important bits like titles, descriptions and links. Then the JavaScript would remove those elements and do the single-page-app business. It was back in carrot-farmer code days, but I'm sure happy coders can accomplish this just as well.