JSON is like the Python of data formats: slow but convenient to use
I think this is the perfect description
We want to know more about JSON alternatives
@@worstfellow Yes, but definitely not XML
@@tattipodapatti yes, even non-programmers can understand JSON easily. The government of India uses it on websites like the income tax portal, where users download their records as JSON
@@codyking9491 yes, we already have too many JavaScript frameworks; we don't want too many data formats 😅
Great video. Thanks for introducing me to Deku.
I think there is a lot of information left out here. I'm gonna outline my opinions below even if they don't matter:
1. JSON is not for computers, it's for humans. Troubleshooting binary requests isn't easy.
2. GZIP compression can greatly reduce the bytes over the wire. JSON with lots of repeating keys can greatly benefit from gzip.
3. A lot of the time the database is your actual bottleneck and serialization doesn't matter for performance. In fact, 1,600 requests per second is more than most companies/services will ever see.
4. Having the document define the schema allows for flexible data. You can't just send arbitrary data with a binary protocol. This has pros/cons.
For a public-facing web service that only receives a couple thousand requests per second, I would pick JSON over any other structured binary protocol almost every time. Yes, there is a cost to simplicity, but there is also a cost to complexity. Make it easy for internal people (engineers) and external people (users) to troubleshoot. Just use JSON unless you have identified that serialization is the bottleneck in your endpoint. For most endpoints/services, I would wager serialization is not the bottleneck.
For things like scientific applications involving a high density of data collected in a small amount of time (rocket controls, explosion tests...) and things that are inherently binary (VNC), JSON is not a good choice.
For anything involving humans, man, it's good to use JSON
To your 3rd point: it was even greater, 1,600 requests per millisecond, which would be 1,600,000 requests per second
on point
Comments like this keep up my hope in humanity
Why gzip when we can Brotli?
Hey, did you like the video? I really enjoy making this kind of content. That's fun. Do you think I should do something with promises? (MAKE A COMMENT, Do not respond, YouTube has the worst notifications)
I think you should learn french
These videos are getting better and better both in quality and entertainment factor :)
That's what I like to hear
@@ThePrimeagen too bad you cannot 'hear' that...
unless you made a speech to text app in RuSt... :)
edit: text to speech
@@vaisakh_km if he had then it would have been blazingly fast
BLAZINGLY SLOW. Thanks for another great technical video! I hadn't actually thought about it until now, but this explains perfectly why we use protobuf so much in embedded work
Absolutely. By the way, I haven't responded to your discord message. I'm on a flight to meet my boss right now, and then I'll be able to update you
@@ThePrimeagen Thanks a ton. Have a safe flight and hope the meeting goes well
@@ThePrimeagen you have a boss? thought ur the boss
blazingly slow is the new blazingly fast
@@dickheadrecs I just heard that sentence in Fireship's voice, thanks for making my day
It would be nice to see a video comparing sending JSON/XML/Apache Avro/Protobuf over the wire with Rust/Go/JS. Great video, as always!
Absolutely must happen
For sure. I am also curious why “deku” instead of, say, Avro or Protobuf. This approach doesn't seem to take forward/backward compatibility into account, unless I missed something?
I can't stop loving this content. JSON is something I use daily but never really think about. A deeper than usual deep dive, yet still accessible. Thank you my good man
ty ty ty
Binary formats are difficult to maintain and scale at certain points. As ThePrimeagen notes, they're also not easy to version: you'll have to change your parser to read the data correctly, or know where to reserve unused space to leave room for future improvements without breaking the protocol.
But they're really good for real-time systems that need to deliver a lot of data in a small amount of time. For example, game servers, where feedback should be fast and responsive at 60fps, both on the client and across other clients with different network conditions, all while also dealing with other systems like rendering.
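A minimal sketch of that "reserve unused space" idea, assuming a hand-rolled protocol (every field name here is made up for illustration):

```rust
// Hypothetical fixed-size wire header with room to grow.
#[repr(C)]
#[derive(Debug)]
struct WireHeader {
    magic: [u8; 4],    // identifies the protocol on the wire
    version: u16,      // bumped whenever the payload layout changes
    payload_len: u32,  // lets old readers skip payloads they can't parse
    reserved: [u8; 6], // zeroed today; future fields can claim these bytes
}
```

Readers that check `version` and skip by `payload_len` keep working even when newer peers send fields they don't understand.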
Protobuf then?
i love that akame wallpaper behind your transparent terminal with nvim open. nice touch.
It kinda landed right at the best moment, as I'm trying to improve the performance of our microservices, which are transferring millions of JSON messages, and I can see the real cost of JSON serialization/deserialization growing super fast. I was studying gRPC and just learned about protobuf here, thanks again for sharing this kind of content!!!
gRPC is wonderful. And reading the .proto files is so much better than working with Swagger
There is actually a successor to gRPC & Protobuf called Cap’n Proto.
Afaik it's made by the same person, but they are no longer working for Google. It's faster because Cap'n Proto's RPC can "travel back in time" (promise pipelining: sending an RPC that uses the output of another call before that call has even finished), and it works on the data directly in memory rather than adding extra encoding & decoding steps.
You of course keep the benefit of having a separate protocol definition that can be used across multiple languages.
@@Ether_Void “before the call is even finished” sounds cool but also scary; this rings so many alarms in my head as a security analyst
"This data transfer specification is slower than writing non-interchangeable data structures in my preferred language" yeah of course, and also of course it's less space-efficient (and slower to parse) than all the various binary formats (but see also gz). JSON is human-readable, available in every language, and good enough for most use-cases (except at FAANG scale); no one ever accused JSON parsers of being particularly speedy, but also most systems don't have JSON parsing as a bottleneck. I ran into this at a previous company, they wanted to switch to protobuf or something similar to save some bytes, and I had to call out that they had some terrible queries in common paths that took over 5 seconds to complete, and needed to focus on real problems. And about the last quip: if we got rid of JSON today, some idiot with a popular blog would say it's time for XML to make a comeback, and then we'd really be in deep shit.
Recently I was comparing Deno and Rust. What surprised me was how much a simple JSON response weighs...
People just don't realize how expensive simplicity is
@@ThePrimeagen I'm a simple man; consequently, an expensive one as well.
@@enclave2k1 xd
Really enjoyed this. It just seems like, for better or for worse, the ease and convenience of JSON trumps everything else.
Love your videos dude!
What's with the dancing squirrel 💃🐿️?
🤣🤣🤣
Good one!
I interned at an insurance company where new insurance data arrived as a single string about 5,000 characters long, parsed by slicing that string at fixed intervals. It almost always caused encoding issues, because it used ASCII encoding while all the responses were in UTF-8, so it had a billion safeguards that would always break when someone entered a non-ASCII character.
I'd say it doesn't get more blazingly fast than this, but it's written in the most cursed and blazingly slow C# code I've ever seen in my limited experience. Best part is they moved to C# from Java like 7 years ago but just kept this system instead of rewriting it to use XML or "jizzle"
I think JSON falls into the same pitfalls as document storage DBs. They are so easy to understand and implement, yet so goddamn hard to let go of even after you learn about better and BLAZINGLY FAST-er alternatives (like good ol' Postgres, or protobufs in this case). This requires a change in mentality more than anything. I suppose old habits die hard.
Exactly. Simplicity is amazing. I cannot stress that enough. But it also costs a lot.
Crazy timing. My frontend team at work is frequently wondering why we're not doing JSON payload, so i can send this! Super informative, thanks!
hah! well if your backend is dictating it, they are smert
Prime, these videos make us all better programmers. Thank you so much, I have been so much healthier programming and having fun doing it just because of your content, and that makes me a much better programmer
Love to hear that giga Chad
holy gigachad comment
Very cool video 😎
Would be interesting to see JSON's position challenged. I wonder how gRPC would compare, as I have never used it in the browser. But before we hate on JSON, let's remember that it freed us from XML
It didn’t really.
All these horrible front end “frameworks” go through linguistic and development hoops to preserve XML.
UIs are defined in some bastard munged XML+JS format that can’t be merged.
UIs can be described in JSON.
But a bunch of HTML authors from 1994 still want to do
Yeah, it "freed" us from a more structured approach that also supports comments and validation....
@@ABaumstumpf you dont need to send comments over API requests, lol. XML is fine for a lot of cases if you like it, but it makes no sense to use it for network traffic.
@@lunafoxfire "you dont need to send comments over API requests"
Who said anything about API request??
"but it makes no sense to use it for network traffic."
Then that applies exactly the same to Json.
@@ABaumstumpf ...the whole video was about network traffic... ...that's what the original comment was about...
I'm loving the new editing style. not too much but just the right amount of sass
These types of videos are super valuable and I really appreciate the way you go into the technicalities and explain these topics. Loving the format!
been watching a lot of your videos, I feel like your newer 'hot take' videos on blogs are fine and all, but you're a good teacher and I'd like to see more of these more informative/objective vids from you because you're a pretty good teacher when you do this sort of content!
Your production quality has gone up and the humor is on point. Today I decided to subscribe! I can officially say that I like this channel now :) great content my dude!
Absolutely loving these performance deep dives! Keep ‘em comin!
The lab coat and theme when you went all sciency gave me serious Garand Thumb vibes. I love it. You guys even kind of look like each other.
I want an update of this video, this time featuring JDSL and Tom the genius
damn! that sub request was really well placed and timed! nice one!
Deserialization of JSON from the requests and the database, plus serialization of JSON into the response, is actually in the top 5 bottlenecks at the company I work at. 40ms of a 100ms request is just de-/serialization from/to JSON.
That is incredible. And if you're using a garbage collected language, don't forget the effects of garbage collection.
@@ThePrimeagen Well, we use Java so... yeah.. xD
It's not apples to apples though. JSON is (generally) schema-less, while the simple binary format shown has to know the structure of the data before it can even parse it. If, say, fields are added later, an outdated client might just parse garbage values, not knowing how the new data is laid out. Yes, you can version data correctly yourself, or use good tools like protobuf. And as for the size complaints, JSON is usually served gzipped, so a lot of repeated " , {} [] won't really matter. I'd like a comparison with gzipped data; text compresses very well, so it will get closer to binary.
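A quick way to check that gzip point for yourself; a minimal sketch assuming the flate2 crate (the sample payload is made up):

```rust
use flate2::{write::GzEncoder, Compression};
use std::io::Write;

fn main() -> std::io::Result<()> {
    // Repeated keys, quotes, and braces: exactly the JSON overhead gzip eats.
    let json = r#"{"id":1,"name":"squirrel","score":9000},"#.repeat(1000);

    let mut encoder = GzEncoder::new(Vec::new(), Compression::default());
    encoder.write_all(json.as_bytes())?;
    let compressed = encoder.finish()?;

    println!("raw: {} bytes, gzipped: {} bytes", json.len(), compressed.len());
    Ok(())
}
```

The compressed size lands far closer to a binary encoding than the raw text does, though the CPU cost of parsing the JSON after decompression stays the same.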
What about gzip, brotli, etc? I don't worry too much about json size overhead as long as the client supports compression (especially brotli).
I think you fell asleep during the video
parsing json is slow
I think another benefit of JSON (though it may not be as relevant in this example as in, say, a web application) over any binary format is that it is immediately readable to any user. I can't count the number of scripts or alternative clients I've been able to write just by inspecting the JSON data a website sends. A universal, plain-text format like JSON frees the potential of how software is used from just the developer and places some choice in the hands of the user as well.
Another thing, I've found that I really like the TSV (tab-separated values) format for storing any tabular data. It's so simple, effortless to parse, and much more compact compared to JSON or even CSV (the schema is usually just the first line, or can even be omitted). I'm not sure how it would work as a data transfer format, and though I don't see it commonly used, I doubt there would be any significant problems.
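Since TSV parsing really is that effortless, here is a hedged sketch of the whole format in a few lines (no library; all names invented):

```rust
// Minimal TSV reader: the first line is the header, every other line is a row.
// No quoting and no escaping; that simplicity is the whole appeal.
fn parse_tsv(input: &str) -> (Vec<&str>, Vec<Vec<&str>>) {
    let mut lines = input.lines();
    let header: Vec<&str> = lines.next().unwrap_or("").split('\t').collect();
    let rows: Vec<Vec<&str>> = lines.map(|l| l.split('\t').collect()).collect();
    (header, rows)
}

fn main() {
    let (header, rows) = parse_tsv("id\tname\n1\tsquirrel\n2\tdeku");
    assert_eq!(header, ["id", "name"]);
    assert_eq!(rows[1], ["2", "deku"]);
}
```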
As someone who tried writing this kind of stuff back before JSON and even before XML, the reason we went with this stuff was for debugging purposes. At the time it was so nice to just watch the stuff on the wire and have it be human-readable. However, I would argue this was mostly because there was no de facto binary protocol well understood by common tooling. I think it's entirely possible for us to have a nice binary encoding that our tooling (things like Wireshark, Charles Proxy, Chrome dev tools) could read and understand without it being fat and bloated like JSON.
This
Insert obligatory xkcd standards cartoon here ;-)
I guess it would be really beneficial for the adoption of protobufs as well, if you could just upload a .proto to devtools and see all the underlying data
This is an algorithmic message to let you know that this type of content is really really appreciated !
I don't know how I got here guy, but I'm absolutely glad to be here - some slick vim keystrokes, nice editing and explanations. Keep it coming good sir.
0:24 it's used by Tom The Genius Jay-Diesel
Maybe try with gson? Memory-wise it'll use roughly the same amount of memory but speed-wise, I think it'll definitely do better + the readability + the embedded schema
This was a really cool deep-dive, but also very entertaining and straight to the point. Looking forward to more stuff like this. :)
I think I could have done a better job and gone a bit slower on the binary representation. But it's such a hard balance to not be long-winded, but be entertaining.
So thank you for the note!
I'd just like to interject for a moment. What you're referring to as Json,
is in fact, JSML, or as I've recently taken to calling it, Tom is a genius.
I'm, just using "edn" (extensible data notation) by default since it's not only "blazingly" but also "extensibly" fast :D
Amazing video btw, like always!
Tytyty
had to pause at the intro just to mention how smooth that subscriber plug was. ggwp
edit: what a great video
Loving this series! This is great content covering topics that aren't too easy to find, most of YT is for junior devs.
Your jokes in this type of developer/programming video are funny and unique, and I would love to see more of them. Keep up the good work.
Very informative and comedic video, thanks :) Keep up the good work TheScienceagen
Love this form of content Prime! 👍 You make harder concepts (for me anyway) as digestible as possible! So invaluable.
insanely high quality video. thank you for all the information!
Amazing video as always, please keep them up!
I really enjoy these comparison videos, especially with your humor.
There's a Handmade Seattle talk from 2021 called "context is everything" that shows you can get huge speedups if you can tighten your schema, and if you have full control over your tools that do the parsing (down to using native code).
JSON imposes a tax in comparison to raw bytes, but you can still get big gains if you can tighten down your schema.
Thinking JSON/not-JSON in black and white will cost you orders of magnitude. You should consider whether you actually need the flexibility of wide-open generic constraints on either the parsing or the generation side.
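One hedged way to see the "tighten your schema" idea, assuming serde and serde_json (the talk goes much further, down to native parsers):

```rust
use serde::Deserialize;

#[derive(Deserialize)]
struct Reading {
    sensor: u32,
    value: f64,
}

fn main() -> Result<(), serde_json::Error> {
    let input = r#"{"sensor":7,"value":3.14}"#;

    // Tight schema: deserialize straight into a typed struct.
    let typed: Reading = serde_json::from_str(input)?;

    // Wide open: build a generic tree, paying for flexibility you may not need.
    let generic: serde_json::Value = serde_json::from_str(input)?;

    println!("{} vs {}", typed.value, generic["value"]);
    Ok(())
}
```

Same wire format, but the typed path skips the intermediate tree of heap-allocated maps and vectors, which is where much of the "JSON tax" actually lives.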
Love these style videos. Keep it up!
Thank you. I feel like it's a fun break from the VIM content
Having just seen Mark Rober's last Olympiad for squirrels, I can appreciate the squirrel for science...
Science squirrels are incredibly important
0:08 made me laugh way too hard, thank you Mr. Prime
"The schema is within the JSON" can be a feature sometimes (maybe?). I'm wondering if there are formats where the schema can appear in the header, and then the data is just packed binary.
Dude this guy is on another level. Love your videos.
I just love your content, it's smart, useful, in dept and quite entertaining!
This is one of the best videos on programming I've ever seen. I feel enlightened and empowered
Great video, some interesting points mentioned here. Looking forward to seeing more
For more information on that topic I recommend chapter 4 of Designing Data-Intensive Applications. Really good video :D
I loved this video! Thank you Primeagen
Hi man! So it's been a couple of weeks since you started to appear in my recommendations. For some reason YouTube was suggesting me all your VIM videos; now, I don't give a damn about VIM and I didn't find the thumbnail interesting, so I decided to ignore them. But today it suggested this one... so I watched it... and now I've watched almost all of your videos :D
This is very good and entertaining content, keep up the good work (I'm still not gonna watch the VIM videos btw)
This is fantastic. Would you be able to share the testing setup you had to compare JSON and DEKU?
Human readable, dynamic formats are not binary. I am shook. I agree that people often use JSON where they don't really need a human readable format, but they have their place. I would actually be interested if there are human readable formats that are much faster to parse/generate.
Also, a 10x difference is surprisingly low. Like, we are comparing a format that needs to be parsed to something that can basically get flushed straight to memory. JSON seems pretty amazingly fast for what it does
It was the smallest message I could send. As the message grew, the performance gap got closer to 40 to 100x. And remember, I'm also fully parsing the message. This isn't FlatBuffers, where parsing happens lazily on field access.
At that point you're talking a thousand x
@@ThePrimeagen Fair point
I love this style of video! The live stuff is neat as well but I find these are much easier to focus on
This is awesome! More of these deep dives please!!
For the algo
For the thankfulness
I like this format! Could be even a bit deeper / more technical and longer :). Anyway, super cool format!
I read somewhere that when you want to initialize JS with a lot of object data, it's faster to JSON-encode it and then let JS JSON-decode it, just because JSON parsing is faster than letting the JS parser parse all the object data when loading the page/script.
That is true, similar to how it is faster to use element.innerHTML = "...." than to programmatically add a div, set properties, etc. But it's still faster to use a binary format (done properly).
Binary protocols often use some kind of chunked data structure that supports versioning the protocol nicely. I mean, only as nicely as JSON can, in that things that are new and not supported in your version are just ignored and offsets are not hardcoded, yet the size is smaller. Some very little parsing is still needed, though...
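A hedged sketch of that chunked idea as tag-length-value (not any specific protocol): readers skip tags they don't recognize, which is what buys the JSON-like "ignore what you don't understand" behavior.

```rust
// Each chunk on the wire: [tag: u8][len: u16, big-endian][body: len bytes]
fn read_chunks(mut buf: &[u8]) {
    while buf.len() >= 3 {
        let tag = buf[0];
        let len = u16::from_be_bytes([buf[1], buf[2]]) as usize;
        if buf.len() < 3 + len {
            break; // truncated frame; stop rather than read past the end
        }
        match tag {
            1 => println!("known field: {:?}", &buf[3..3 + len]),
            _ => {} // tag from a newer protocol version: skip it by length
        }
        buf = &buf[3 + len..];
    }
}
```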
Deku looks interesting, but it also looks like it was made for transferring objects of known structure and, mainly, known size. Like, what if an object contains an array of unknown size? Or even simpler, how do you transfer strings?
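Deku does cover this with length-prefixed fields; roughly like the sketch below (written from memory of the deku docs, so double-check the attribute names):

```rust
use deku::prelude::*;

#[derive(Debug, PartialEq, DekuRead, DekuWrite)]
struct Message {
    len: u16,               // length prefix travels over the wire first
    #[deku(count = "len")]  // then exactly `len` bytes are read into the Vec
    bytes: Vec<u8>,         // e.g. UTF-8 text, validated into a String after decode
}
```

So strings and variable-size arrays become "count field + payload", which is the same trick most binary protocols use.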
Love these videos! Keep 'em coming!
Trying!
your technical videos are so good!
I don't even work with Rust or NodeJS, but your videos are pretty fun. Keep going!
Anyone else currently looking at a JSON parser and your head constantly screaming "Jéson!"?
yah yahh, prime is on a roll!
This is the best tech content on the internet if you ask me. Would be delighted if you pumped up even more of these.
Plus, the streams where you write the code for these experiments are so much fun :)
Can you make a video about CBOR, the binary JSON alternative?
Love this type of video, I wasn't aware of Deku before watching. I'd be keen to see a breakdown of Protobuf vs Deku
We used Twitch's Twirp framework for Golang at a previous company I worked at for exactly this reason
Exactly what I was looking for. Thank you!
I really love this sort of content from you
MY GOD, this was so satisfying to watch and listen to
9:05 Though you can use the Content-Type header for this:
for JSON it is "application/json"; if you set it to something saying it is using Deku, then people/servers would know that it is using Deku
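Something like the request below, where the vendor media type is purely hypothetical (in practice a generic binary body would often just be "application/octet-stream"):

```
POST /stats HTTP/1.1
Host: example.com
Content-Type: application/vnd.example.deku
Content-Length: 12

<raw binary payload>
```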
this was really beneficial and interesting. thanks for posting it. i learned a lot
Hey, I clicked that weird red button that says SUBSCRIBE on it and indeed, there were multiple AJAX calls that were using JSON for BOTH request parameters and response data. Crazy!
Weird. This Json thing might just catch on after all
I absolutely vote for more videos like this, but maybe some more involvement in the creation of the code. Loving it! SO much JSML o_O
One of the easiest subscribing of my life, top notch quality
ty ty ty ty
JSON maximizes readability, whereas Deku maximizes speed. Same tradeoff as between Rust and Python
As a French dev, hearing you say "jason" makes me giggle in the weirdest way 😅
Your more technical videos are so great, kudos !
Will do, and I will never let JSON be said any other way
W video, more more more. Too many videos out there aimed at beginners, very few aimed at intermediates (other than theo and fireship). Love the content. Love the streams. Good job dad.
Would be interesting to see a comparison between Deku and MsgPack, since it sits somewhere in between binary and JSON
This was very interesting, and something I hadn't really given a second thought to when I'd use JSON
Great video. Super interesting to see! The “XML is worse” part got me in the kidneys. I have to work with shitty XML every day as a messaging schema between our app and other companies' integrations (LEGACYYY). Slow and terrible to work with. The worst of both worlds.
You missed the point. Binary representations are not flexible. You need to read properties in the exact order they are written. You can't add new properties in old data. You can't query them without the schema, which will be external and won't be provided with the data. You can't read data without the schema, if there are many versions of that schema, you're in trouble. If you update the schema, you have to update the consumer, even if the modification is backward compatible.
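A tiny hedged demonstration of that "old reader, new data" failure, using a made-up two-field layout:

```rust
fn main() {
    // v1 layout: [id: u32][score: u16]
    // v2 inserts a flags byte: [id: u32][flags: u8][score: u16]
    let v2_bytes: [u8; 7] = [0, 0, 0, 7, 1, 0, 42]; // id=7, flags=1, score=42

    // A v1 reader applied to v2 data: no error, just silent garbage.
    let id = u32::from_be_bytes([v2_bytes[0], v2_bytes[1], v2_bytes[2], v2_bytes[3]]);
    let score = u16::from_be_bytes([v2_bytes[4], v2_bytes[5]]);

    println!("id={id}, score={score}"); // prints id=7, score=256, not 42
}
```

This is exactly the failure mode schema'd tools like protobuf are designed to paper over with field tags and defaults.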
Excellent work, Dr. Prime!
I loved this idea, and this is the first time I've seen how slow JSON is; I never thought about it before.
Thanks for the great ideas and videos. Please keep up the good work!
I love watching your videos; they are useful, funny, and interesting to me
Great video, please do more subjects in depth. These nuggets of knowledge help us all! Thank you.
love it when the scienceagen comes in to do some blazingly fast research for us all
I really love your work man. You're the Senior I'd aim to become ^-^
but my eyes can read json blazingly fast
As an embedded C engineer, I love efficient code. I think JavaScript is the evidence of the decline of our civilization.
Facts
Looking at the comments: it seems most people consume messages without checking what they are ingesting (some even say that is the safe way), have never heard of network endianness, didn't realize a few bits at the beginning of the frame can specify version, size, and any other desirable property, don't know what RPC is, think computers should communicate in human, and the list goes on... so I think you are right.
I'm with you, man. I have a mature embedded ARM Linux project, and I recently spent some time optimizing code that the GCC profiler marked as high usage.
There is some JSON in the project, and this video makes me want to rethink it so the device runs cooler. Other people don't find this beneficial, but I don't see the benefit of heating the atmosphere
@@andersoncunha7079 Who cares? 99% of devs are probably working on stuff like web dev. Yeah, we want the router and the actual webserver itself to be internally optimized to the femtosecond. But the website? Bro, people are loading sites on their phones. If it takes a couple seconds, who cares? What matters is whether it's maintainable. JSON is maintainable. And before you say "but you can break it with version updates": yeah, and it just causes an image not to show on a website. It doesn't crash an airplane into the ground. Different applications, bro.
The fact that there are a million different JavaScript frameworks invented every day shows that the productivity boost and portability benefits that JavaScript gives you are real.
When liking your video, I opened the network tab and inspected the ja-sean request. The request header contains a 'context', params, and a target, which is the video id. The response just contains a "responseContext", yet weighs 2,783 bytes. Although small, it is more than a simple HTTP status code. It makes me wonder how much useless network traffic "cheap" storage and transmission causes.
Great video. Does the verbosity of JSON affect the file size of requests once they are gzipped?
Yes, gzip just compresses it to hell, but it still needs to compress the keys and syntactic stuff like quotes, colons, etc.
Loved it, thanks for this type of video