Great video. Thanks for introducing me to Deku.
I think there is a lot of information left out here. I'm gonna outline my opinions below even if they don't matter:
1. JSON is not for computers, it's for humans. Troubleshooting binary requests isn't easy.
2. GZIP compression can greatly reduce the bytes over the wire. JSON with lots of repeating keys benefits greatly from gzip (see the sketch after this comment).
3. A lot of the time the database is your actual bottleneck and serialization doesn't matter for performance. In fact, 1,600 requests per second is more than most companies/services will ever see.
4. Having the document define the schema allows for flexible data. You can't just send arbitrary data with a binary protocol. This has pros/cons.
For a public-facing web service that only receives a couple thousand requests per second, I would pick JSON over any other structured binary protocol almost every time. Yes, there is a cost to simplicity, but there is also a cost to complexity. Make it easy for both internal (engineers) and external (users) folks to troubleshoot. Just use JSON unless you have identified that serialization is the bottleneck in your endpoint. For most endpoints/services, I would wager serialization is not the bottleneck.
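A minimal sketch of point 2 above, in Rust, assuming the flate2 crate (my own illustration, not from the video): the repeated key names deflate to almost nothing.

```rust
use std::io::Write;
use flate2::{write::GzEncoder, Compression};

fn main() -> std::io::Result<()> {
    // 100 records that all repeat the same keys, like a typical API response.
    let records: Vec<String> = (0..100)
        .map(|i| format!(r#"{{"user_id":{i},"user_name":"user{i}","active":true}}"#))
        .collect();
    let json = format!("[{}]", records.join(","));

    let mut enc = GzEncoder::new(Vec::new(), Compression::default());
    enc.write_all(json.as_bytes())?;
    let compressed = enc.finish()?;

    // Expect a large reduction, since gzip's dictionary eats the repeats.
    println!("{} bytes raw, {} bytes gzipped", json.len(), compressed.len());
    Ok(())
}
```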
For things like scientific applications involving a high density of data collected in a small amount of time (rocket controls, explosion tests...), and things that are inherently binary (VNC), JSON is not a good choice.
For anything involving a human, though, it's good to use JSON.
To your 3rd point: it was even greater, 1,600 requests per millisecond, which would be 1,600,000 requests per second.
on point
Comments like this keep my hope in humanity alive
Why gzip when we can Brotli?
JSON is like the Python of data formats: slow but convenient to use
I think this is the perfect description
We want to know more about JSON alternatives
@@worstfellow Yes, but definitely not XML
@@tattipodapatti Yes, even non-programmers can understand JSON easily. The government of India is using it on websites like the income tax portal, where users can download their data as JSON
@@codyking9491 Yes, we already have too many JavaScript frameworks, we don't want too many data formats 😅
These videos are getting better and better both in quality and entertainment factor :)
That's what I like to hear
@@ThePrimeagen too bad you cannot 'hear' that...
unless you made a speech to text app in RuSt... :)
edit: text to speech
@@vaisakh_km if he had then it would have been blazingly fast
Hey, did you like the video? I really enjoy making this kind of content. That's fun. Do you think I should do something with promises? (MAKE A COMMENT, do not respond, YouTube has the worst notifications)
I think you should learn french
BLAZINGLY SLOW. Thanks for another great technical video! I hadn't actually thought about it until now, but this explains perfectly why we use protobuf so much in embedded work
Absolutely. By the way, I haven't responded to your discord message. I'm on a flight to meet my boss right now, and then I'll be able to update you
@@ThePrimeagen Thanks a ton. Have a safe flight and hope the meeting goes well
@@ThePrimeagen you have a boss? thought ur the boss
blazingly slow is the new blazingly fast
@@dickheadrecs I just heard that sentence in Fireship's voice, thanks for making my day
It would be nice to see a video comparing sending JSON/XML/Apache Avro/Protobuf over the wire with Rust/Go/JS. Great video, as always!
Absolutely must happen
For sure. I am also curious why “deku” instead of something like Avro or Protobuf. This approach doesn’t seem to take forward/backward compatibility into account, unless I missed something?
I can't stop loving this content. JSON is something I use daily but never really think about. A deeper than usual deep dive, yet still accessible. Thank you my good man
ty ty ty
Binary formats are difficult to maintain and scale past a certain point. As ThePrimeagen notes, they're also not easy to version; you'll have to change your parser to read data correctly, or know where to reserve unused space to leave room for future improvements without breaking the protocol (see the version-byte sketch below).
But they're really good for real-time systems that need to deliver a lot of data in a small amount of time. For example, game servers, where feedback should be fast and responsive both on the client and across other clients with different network conditions at 60fps, while also dealing with other systems like rendering.
Protobuf then?
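A minimal sketch of that version-byte trick, in plain Rust with an invented frame layout (nothing from the video): the first byte names the protocol version, so an old parser rejects newer frames cleanly instead of misreading them.

```rust
const PROTO_V1: u8 = 1;
const PROTO_V2: u8 = 2;

// Returns (x, optional y). v2 appended the y field; a parser that only
// knows v1 rejects v2 frames instead of misreading the extra bytes.
fn parse_frame(frame: &[u8]) -> Result<(u16, Option<u16>), String> {
    match frame {
        // v1 layout: [version, x_hi, x_lo] (big-endian, i.e. network order)
        [PROTO_V1, rest @ ..] if rest.len() == 2 => {
            Ok((u16::from_be_bytes([rest[0], rest[1]]), None))
        }
        // v2 layout: [version, x_hi, x_lo, y_hi, y_lo]
        [PROTO_V2, rest @ ..] if rest.len() == 4 => {
            let x = u16::from_be_bytes([rest[0], rest[1]]);
            let y = u16::from_be_bytes([rest[2], rest[3]]);
            Ok((x, Some(y)))
        }
        [v, ..] => Err(format!("unsupported version or malformed frame (first byte {v})")),
        [] => Err("empty frame".into()),
    }
}

fn main() {
    println!("{:?}", parse_frame(&[1, 0x00, 0x2A]));             // v1: x = 42
    println!("{:?}", parse_frame(&[2, 0x00, 0x2A, 0x00, 0x07])); // v2: x = 42, y = 7
    println!("{:?}", parse_frame(&[9, 0xFF]));                   // future version: clean error
}
```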
It kinda landed right at the best moment, as I'm trying to improve the performance of our microservices that are transferring millions of JSON messages, and I can see the real cost of JSON serialization/deserialization growing super fast. I was studying gRPC and just learned about protobuf here, thanks again for sharing this kind of content!!!
GRPC is wonderful. And reading the proto files is so much better than working with swagger
There is actually a successor to GRPC & Protobuf called Cap’n Proto.
Afaik it's made by the same person, but they are no longer working for Google. It's faster because Cap'n Proto's RPC can "travel back in time" (send an RPC using the output of another call before that call has even finished), and it works directly on the in-memory representation rather than adding extra encoding & decoding steps.
You of course keep the benefit of having a separate protocol definition that can be used across multiple languages.
@@Ether_Void “before the call is even finished” sounds cool, but also scary; this rings so many alarms in my head as a security analyst
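For anyone wondering what "travel back in time" looks like mechanically, here's a toy model of promise pipelining in Rust. Connection and Promise are hypothetical stand-ins, not the real capnp-rpc API; the point is that a dependent call can reference the promised result of an earlier call and ride along in the same round trip.

```rust
struct Connection {
    queue: Vec<String>, // calls buffered until flush()
}

#[derive(Clone, Copy)]
struct Promise(usize); // a handle to the result of queued call #n

impl Connection {
    fn new() -> Self {
        Self { queue: Vec::new() }
    }

    // Queue a call and hand back a promise for its future result.
    fn call(&mut self, desc: String) -> Promise {
        self.queue.push(desc);
        Promise(self.queue.len() - 1)
    }

    // Without pipelining you'd flush after every call (one round trip
    // each); with it, dependent calls cross the wire together.
    fn flush(&mut self) {
        println!("one round trip carrying {} calls:", self.queue.len());
        for (i, c) in self.queue.drain(..).enumerate() {
            println!("  #{i}: {c}");
        }
    }
}

fn main() {
    let mut conn = Connection::new();
    let user = conn.call("get_user(42)".to_string());
    // get_user hasn't returned yet, but its promised result can already
    // be used as an argument to the next call:
    conn.call(format!("get_friends(<result of call #{}>)", user.0));
    conn.flush();
}
```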
i love that akame wallpaper behind your transparent terminal with nvim open. nice touch.
Very cool video 😎
Would be interesting to see JSON's position challenged. I wonder how gRPC would compare, as I have never used it in the browser. But, before we hate on JSON, let’s remember that it freed us from XML
It didn’t really.
All these horrible front end “frameworks” go through linguistic and development hoops to preserve XML.
UIs are defined in some bastard munged XML+JS format that can’t be merged.
UIs can be described in JSON.
But a bunch of HTML authors from 1994 still want to do
Yeah, it "freed" us from a more structured approach that also supports comments and validation....
@@ABaumstumpf you don't need to send comments over API requests, lol. XML is fine for a lot of cases if you like it, but it makes no sense to use it for network traffic.
@@lunafoxfire "you dont need to send comments over API requests"
Who said anything about API request??
"but it makes no sense to use it for network traffic."
Then that applies exactly the same to JSON.
@@ABaumstumpf ...the whole video was about network traffic... that's what the original comment was about...
Really enjoyed this. It just seems like, for better or for worse, the ease and convenience of JSON trumps everything else.
I interned at an insurance company where new insurance data arrived as a single string about 5,000 characters long, which was parsed by slicing it at fixed intervals. It almost always caused encoding issues, because the parser assumed ASCII while all the responses were in UTF-8, so it had a billion safeguards that would always break whenever someone entered a non-ASCII character.
I'd say it doesn't get more blazingly fast than this, but it's written in the most cursed and blazingly slow C# code I've ever seen in my limited experience. Best part is they moved to C# from Java like 7 years ago but just kept this system instead of rewriting it to use XML or "jizzle".
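That fixed-interval slicing failure is easy to reproduce. A tiny sketch of it in Rust rather than C# (purely for illustration): byte offsets stop lining up with characters the moment the data isn't ASCII.

```rust
// Fixed-width records: the "name" field is supposed to live in bytes 0..8.
fn name_field(record: &str) -> Option<&str> {
    // str::get returns None instead of panicking if byte 8 would land
    // in the middle of a multi-byte UTF-8 character.
    record.get(0..8)
}

fn main() {
    let ascii = "JOHNDOE 19850101NY"; // 1 byte per character: offsets work
    let utf8  = "JÖRGDÖE 19850101NY"; // each 'Ö' is 2 bytes in UTF-8

    println!("{:?}", name_field(ascii)); // Some("JOHNDOE ")
    // The same byte offsets now cut the name short and shift every field
    // after it, which is exactly the "billion safeguards" scenario:
    println!("{:?}", name_field(utf8)); // Some("JÖRGDÖ")
}
```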
Recently I was comparing Deno and Rust. What surprised me was how much a simple JSON response weighs...
People just don't realize how expensive simplicity is
@@ThePrimeagen I'm a simple man; consequently, an expensive one as well.
@@enclave2k1 xd
I think JSON falls into the same pitfalls as document storage DBs. It is so easy to understand and implement, yet so goddamn hard to let go of even after you learn about better and BLAZINGLY FAST-er alternatives (like good ol' postgres, or protobufs in this case). This requires a change in mentality more than anything. I suppose old habits die hard.
Exactly. Simplicity is amazing. I cannot stress that enough. But it also costs a lot.
Crazy timing. My frontend team at work is frequently wondering why we're not doing JSON payloads, so I can send this! Super informative, thanks!
hah! well if your backend is dictating it, they are smert
damn! that sub request was really well placed and timed! nice one!
As someone who tried writing this kind of stuff back before JSON and even before XML, the reason we went with this stuff was for debugging purposes. At the time it was so nice to just watch the stuff on the wire, and be able to have it be human readable. However, I would argue this was mostly because there was not a de-facto binary protocol well understood by common tooling. I think it's entirely possible for us to have a nice binary encoding that our tooling (things like wireshark, Charles proxy, Chrome dev tools) could read and understand without it being fat and bloated like JSON.
This
Insert obligatory xkcd standards cartoon here ;-)
I guess it would be really beneficial for the adoption of protobufs as well, if you could just upload a .proto to devtools and see all the underlying data
The lab coat and theme when you went all sciency gave me serious Garand Thumb vibes. I love it. You guys even kind of look like each other.
I'm loving the new editing style. not too much but just the right amount of sass
Prime, these videos make us all better programmers. Thank you so much, I have been so much healthier programming and having fun doing it just because of your content, and that makes me a much better programmer
Love to hear that giga Chad
holy gigachad comment
I'm, just using "edn" (extensible data notation) by default since it's not only "blazingly" but also "extensibly" fast :D
Amazing video btw, like always!
Tytyty
This is an algorithmic message to let you know that this type of content is really really appreciated !
Love your videos dude!
What's with the dancing squirrel 💃🐿️?
🤣🤣🤣
Good one!
As a French dev, hearing you say "jason" makes me giggle in the weirdest way 😅
Your more technical videos are so great, kudos !
Will do, and I will never let JSON be said any other way
For the algo
For the thankfulness
These types of videos are super valuable and I really appreciate the way you go into the technicalities and explain these topics. Loving the format!
Absolutely loving these performance deep dives! Keep ‘em comin!
I don't know how I got here guy, but I'm absolutely glad to be here - some slick vim keystrokes, nice editing and explanations. Keep it coming good sir.
been watching a lot of your videos. I feel like your newer 'hot take' videos on blogs are fine and all, but you're a good teacher, and I'd like to see more of these informative/objective vids from you; you really shine when you do this sort of content!
having just seen Mark Rober's last Olympiad for squirrels, I can appreciate the squirrel for science...
Science squirrels are incredibly important
Deserializing JSON from the requests and from the database, then serializing the response back to JSON, is actually in the top 5 bottlenecks at the company I work at: 40ms of a 100ms request is just de-/serialization from/to JSON.
That is incredible. And if you're using a garbage collected language, don't forget the effects of garbage collection.
@@ThePrimeagen Well, we use Java so... yeah.. xD
had to pause at the intro just to mention how smooth that subscriber plug was. ggwp
edit: what a great video
I want an update of this video, this time featuring JDSL and Tom the genius
It's run by business, not engineers; jsôn allows for a more commodity/fungible dev team. It takes a certain amount of scale before the tradeoff is worth it. Same for things like reactive programming, etc.
The quality of your videos, in both content and presentation, is strong enough that it's deterring me from taking on that kind of load for a S&Gs channel of my own! I'd love to see more stuff 100% (i'd do the twitch stuff but it's a bit of a time commitment and i completely suck at multi-tasking so i can't really do it in the background)
I really enjoy these comparison videos, especially with your humor.
insanely high quality video. thank you for all the information!
I'd just like to interject for a moment. What you're referring to as Json,
is in fact, JSML, or as I've recently taken to calling it, Tom is a genius.
Love the DEEP DIVE 🔥🔥🔥🔥🔥
Your production quality has gone up and the humor is on point. Today I decided to subscribe! I can officially say that I like this channel now :) great content my dude!
I think another benefit of JSON (though it may not be as relevant in this example as in, say, a web application) over any binary format is that it is immediately readable to any user. I can't count the number of scripts or alternative clients I've been able to write just by inspecting the JSON data a website sends. A universal, plain-text format like JSON frees the potential of how software is used from just the developer and places some choice in the hands of the user as well.
Another thing, I've found that I really like the TSV (tab-separated values) format for storing any tabular data. It's so simple, effortless to parse, and much more compact compared to JSON or even CSV (the schema is usually just the first line, or can even be omitted). I'm not sure how it would work as a data transfer format, and though I don't see it commonly used, I doubt there would be any significant problems.
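TSV really is about as simple as parsing gets. A minimal sketch in Rust, with toy data of my own, just to show the shape of it:

```rust
fn main() {
    // First line is the schema, every other line is a record.
    let tsv = "id\tname\tscore\n1\tAlice\t9.5\n2\tBob\t7.25";

    let mut lines = tsv.lines();
    let header: Vec<&str> = lines.next().expect("empty input").split('\t').collect();

    for line in lines {
        // No quoting or escaping rules to worry about, unlike CSV:
        for (key, value) in header.iter().zip(line.split('\t')) {
            print!("{key}={value}  ");
        }
        println!();
    }
}
```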
One of the easiest subscribing of my life, top notch quality
ty ty ty ty
Hey, I clicked that weird red button that says SUBSCRIBE on it and indeed, there were multiple AJAX calls that were using JSON for BOTH request parameters and response data. Crazy!
Weird. This Json thing might just catch on after all
I loved this video! Thank you Primeagen
This is one of the best videos on programming I've ever seen. I feel enlightened and empowered
This was a really cool deep-dive, but also very entertaining and straight to the point. Looking forward to more stuff like this. :)
I think I could have done a better job and gone a bit slower on binary representation. But it's such a hard balance to not be long-winded while still being entertaining.
So thank you for the note!
There's a Handmade Seattle talk from 2021 called "context is everything" that shows you can get huge speedups if you can tighten your schema, and if you have full control over your tools that do the parsing (down to using native code).
JSON imposes a tax in comparison to raw bytes, but you can still get big gains if you can tighten down your schema.
Thinking JSON/not-JSON in black and white will cost you orders of magnitude. You should consider whether you actually need the flexibility of wide-open generic constraints on either the parsing or the generation side.
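In Rust terms, "tightening the schema" is roughly the difference between these two parses. A sketch assuming serde (with the derive feature) and serde_json, not the tooling from the talk:

```rust
use serde::Deserialize;

// The tight schema: the parser writes fields straight into this struct.
#[derive(Deserialize, Debug)]
struct Player {
    id: u32,
    x: f32,
    y: f32,
}

fn main() -> Result<(), serde_json::Error> {
    let raw = r#"{"id":7,"x":1.5,"y":-3.0}"#;

    // Wide open: builds a tree of heap-allocated Value nodes that can
    // hold anything, and you pay for that generality on every message.
    let generic: serde_json::Value = serde_json::from_str(raw)?;

    // Tight: no intermediate tree, just a flat 12-byte struct, and the
    // field names are checked while the bytes stream past.
    let typed: Player = serde_json::from_str(raw)?;

    println!("{generic} vs {typed:?}");
    Ok(())
}
```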
Amazing video as always, please keep them up!
For more information on that topic I recommend chapter 4 of Designing Data-Intensive Applications. Really good video :D
Great video. Super interesting to see! The “XML is worse” part got me in the kidneys. I have to work with shitty XML every day as a messaging schema between our app and other companies' integrations (LEGACYYY). Slow and terrible to work with. The worst of both worlds.
love it when the scienceagen comes in to do some blazingly fast research for us all
Love these style videos. Keep it up!
Thank you. I feel like it's a fun break from the VIM content
Your jokes in this type of developer/programming videos is funny and unique and I would love to see more of em keep up the good work.
Loving this series! This is great content covering topics that aren't too easy to find, most of YT is for junior devs.
A computer scientist ~~~NEEDS~~~ a lab coat.
Anyone else currently looking at a JSON parser and your head constantly screaming "Jéson!"?
This was very interesting and something I hadn't really given second thought to when I'd use JSON
Great video, some interesting points mentioned here. Looking forward to seeing more
Very informative and comedic video, thanks :) Keep up the good work TheScienceagen
Excellent work, Dr. Prime!
Dude this guy is on another level. Love your videos.
MY GOD, this was so satisfying to watch and listen to
0:08 made me laugh way too hard, thank you Mr. Prime
I like this format! Could be even a bit deeper / more technical and longer :). Anyway, super cool format!
I absolutely vote for more videos like this, but maybe some more involvement in the creation of the code. Loving it! SO much JSML o_O
Yes!!! Gold content!!! I have never heard any other dev talking about serialization encodings!!!!
Exactly what I was looking for. Thank you!
Really loving this, Prime.
As a front-end dev - I feel personally attacked by Prime's videos but I do appreciate them all the same!
Love this form of content Prime! 👍 You make harder concepts (for me anyway) as digestible as possible! So invaluable.
This is awesome! More of these deep dives please!!
What about gzip, brotli, etc? I don't worry too much about json size overhead as long as the client supports compression (especially brotli).
I think you fell asleep during the video
parsing json is slow
Love these videos! Keep 'em coming!
Trying!
A few extra bytes to be human readable is totally worth it. The fact that you inspect the JSON sent in the dev tools is testament to that. I would say the overhead of the encoding/decoding is imperceptible for most use cases.
The particular example you gave is just wrong. Dev tools could've just as easily transparently converted to/from an efficient binary format.
Surely a human-readable format is useful. But nowhere near as useful as people think it is. Images are not human-readable; neither are docx, PDFs, or mobi. Plus, if readability were the concern, a subset of YAML (which is less complicated to parse) is much more readable than JSON. How is unparsed, unindented JSON readable anyway? And if you have to parse JSON to display it in a "human readable" format anyway, why not use a binary format, or a subset of YAML?
@@gnikdroy because the tooling already exists for JSON
@@FunOrange42 Yes, and your answer is the correct one. JSON is not used because it is "human friendly". It is used because the tooling required for it was already there. Javascript always understood JSON. And that is all it took for it to become popular
I was gonna say, you're gonna get people arguing with you instantly in this thread.
I agree with you Michael but what can you do.
A day is coming when quantum computers will make it so that the difference is actually negligible.
@@gnikdroy by “subset of YAML”, I assume you mean basically just the keys + values?
Because ‘less complicated to parse’ is absolutely not true for full YAML vs JSON… if you look at the spec alone, this is quite clear (YAML spec is even longer than XML…, and ~10x longer than JSON). Because of this complexity, also different YAML parsers may sometimes give entirely different interpretations & parsings for the same YAML string.
And I would argue that even the subset of the most basic functionality of YAML is still more complex to parse than JSON, since tracking indentation to denote "start object"/"end object" is already more complex than simple delimiter characters... and inevitably someone will use a tab when someone else used spaces. And someone may use 3 spaces to indent while another uses 4. And so on. It's very easy to create broken YAML, and hard to verify, because it's very easy to create YAML which is semantically broken but syntactically correct. YAML can't detect that you meant for a line to have different indentation, while JSON can detect very easily that you didn't add a "}"
Unindented JSON may be unreadable, but unindented YAML is semantically broken. Unindented + single-line JSON may be EXTREMELY unreadable, but unindented + single-line YAML is syntactically broken
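"Semantically broken but syntactically correct" is easy to demonstrate. A small sketch in Rust, assuming the serde_yaml and serde_json crates (the snippets are my own):

```rust
fn main() {
    // One indentation level apart, both are valid YAML, and they
    // describe two different structures. No parser can flag this.
    let a = "parent:\n  child: 1\nsibling: 2";   // sibling at the top level
    let b = "parent:\n  child: 1\n  sibling: 2"; // sibling inside parent

    let va: serde_yaml::Value = serde_yaml::from_str(a).unwrap();
    let vb: serde_yaml::Value = serde_yaml::from_str(b).unwrap();
    assert_ne!(va, vb); // both parse fine, the meanings differ

    // The JSON version of the same mistake is a missing brace, which
    // fails loudly at parse time instead of silently changing meaning:
    let bad = r#"{"parent": {"child": 1, "sibling": 2}"#; // missing final }
    assert!(serde_json::from_str::<serde_json::Value>(bad).is_err());
}
```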
IT FEELS LIKE IM PRONONCING IT WITH MY FRENCH ACCENT
Thank you, Mr. ThePrimeagen!
Awesome stuff. Keep making videos like these.
I really love this sort of content from you
Damn, when that wild Thomas Dolby appeared I couldn't help myself from laughing
this was really beneficial and interesting. thanks for posting it. i learned a lot
Hi Man! So it's been a couple of weeks since you started to appear in my recommendations; for some reason YouTube is suggesting me all your VIM videos. Now, I don't give a damn about VIM, and I didn't find the thumbnail interesting, so I decided to ignore it, but today it suggested me this one... so I watched it... and now I've watched almost all of your videos :D
This is very good and entertaining content, keep up the good work (I'm still not gonna watch the VIM videos btw)
"This data transfer specification is slower than writing non-interchangeable data structures in my preferred language" yeah of course, and also of course it's less space-efficient (and slower to parse) than all the various binary formats (but see also gz). JSON is human-readable, available in every language, and good enough for most use-cases (except at FAANG scale); no one ever accused JSON parsers of being particularly speedy, but also most systems don't have JSON parsing as a bottleneck. I ran into this at a previous company, they wanted to switch to protobuf or something similar to save some bytes, and I had to call out that they had some terrible queries in common paths that took over 5 seconds to complete, and needed to focus on real problems. And about the last quip: if we got rid of JSON today, some idiot with a popular blog would say it's time for XML to make a comeback, and then we'd really be in deep shit.
This is the best type of video. Make more plz.
JSML
Thank you for this video, Mr. Primeagen. I feel smarter now.
I like this. Have my algorithmic signal. Thanks prime.
W video, more more more. Too many videos out there aimed at beginners, very few aimed at intermediates (other than theo and fireship). Love the content. Love the streams. Good job dad.
Yoda voice: Jason. You seek Jason… for some reason that’s what plays in my head.
I love this style of video! The live stuff is neat as well but I find these are much easier to focus on
but my eyes can read json blazingly fast
I’m making a VR MMO, and this video was exactly what I was looking for. I need to share player state as fast as possible.
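For a player-state packet, the Deku approach from the video looks roughly like this. A sketch only: the struct layout is invented for illustration, and the API shape matches recent deku 0.x versions:

```rust
use deku::prelude::*;

// Hypothetical packet layout, invented for illustration.
#[derive(Debug, PartialEq, DekuRead, DekuWrite)]
struct PlayerState {
    id: u16,
    x: f32,
    y: f32,
    z: f32,
    #[deku(bits = 1)]
    grounded: u8, // a single bit...
    #[deku(bits = 7)]
    health: u8, // ...and 0..=127 packed into the same byte
}

fn main() -> Result<(), DekuError> {
    let state = PlayerState { id: 7, x: 1.0, y: 2.0, z: 3.0, grounded: 1, health: 100 };

    let bytes = state.to_bytes()?; // 15 bytes on the wire;
    // the same struct as JSON is several times that, before you even
    // pay for parsing it back out of text on the other side.
    let (_rest, decoded) = PlayerState::from_bytes((&bytes, 0))?;
    assert_eq!(state, decoded);
    Ok(())
}
```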
As an embedded C engineer i love efficient code. I think JavaScript is the evidence of the decline of our civilization.
Facts
Looking at the comments, it seems most people consume messages without checking what they're ingesting (some even say that's the safe way), have never heard of network endianness, don't realize a few bits at the beginning of the frame can specify version, size, and any other desirable property, don't know what RPC is, think computers should communicate in human, and the list goes on... so I think you are right.
I'm with you man, I have a mature embedded ARM Linux project where I recently spent some time optimizing code that the GCC profiler marked as high usage.
There is some JSON in the project, and this video makes me want to rethink it so the device runs cooler. Other people don't find this beneficial, but I don't see the benefit of heating the atmosphere
@@andersoncunha7079 Who cares? 99% of devs are probably working stuff like web dev. Yeah we want the router and the actual webserver itself to be internally optimized to the femtosecond. But the website? Bro people are loading sites on their phones. If it takes a couple seconds, who cares? What matters is if it's maintainable. JSON is maintainable. And before you say "but you can break it with version updates" yeah and it just causes an image not to show on a website. It doesn't crash an airplane into the ground. Different applications bro.
The fact that there are a million different Javascript frameworks invented every day shows that the productivity boost and portability benefits that Javascript gives you are real.
... Protocol Buffers have entered the chat.
Love this kinda content Prime
Love this type of video, I wasn't aware of Deku before watching. I'd be keen to see a breakdown of Protobuf vs Deku
We used Twitch's Twirp framework for Golang at a previous company I worked at for exactly this reason
your technical videos are so good!
Loved it, thanks for this type of videos
I'm saying it the french way from now on. Thank you for making my life richer!
Would be interesting to see a comparison between Deku and MsgPack, since MsgPack sits somewhere in between binary and JSON
I just love your content; it's smart, useful, in-depth and quite entertaining!
Keep them coming !!!
Let's make "blazingly" a synonym for ThePrimeagen