I think it's also worth mentioning that JSON is easily read by humans, which the video doesn't mention and which is a very important aspect of its popularity.
It's gotta be, since you're not able to comment.
As usual, it's horses for courses. It's great to have a small, unstructured, browser-friendly option like JSON, but when the use case shifts to composing large, complex documents from multiple sources and collecting metadata, XML is worth its bulk.
@@likwidmocean Oh for sure, it's the same as using dynamic types vs static types in a language at the corporate-project level. The former is nice, but quickly loses its value when you're dealing with thousands of lines of code.
And it's also slow; but yes, your comment is 100% correct.
JSON is not just ASCII. It accepts the entire UTF-8 encoding and requires it from decoders in its spec.
Strictly speaking JSON can be parsed as ASCII, since all syntax characters are ASCII. String parsing can be done separately; the string itself is UTF-8 encoded, but escape sequences have to be handled first, and the unicode escape sequences are actually UTF-16 encoded. So it's kind of a messy format in that sense.
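To make that concrete, here is a rough sketch (not a real parser, and the function name is made up) of handling the \uXXXX escapes, including pairing up UTF-16 surrogates for characters outside the Basic Multilingual Plane:

// Only handles \uXXXX escapes; a full string parser also needs \", \\, \n, \t, etc.
function decodeUnicodeEscapes(s) {
  let out = "";
  for (let i = 0; i < s.length; i++) {
    if (s[i] === "\\" && s[i + 1] === "u") {
      const hi = parseInt(s.slice(i + 2, i + 6), 16);      // first \uXXXX unit
      if (hi >= 0xd800 && hi <= 0xdbff && s.slice(i + 6, i + 8) === "\\u") {
        const lo = parseInt(s.slice(i + 8, i + 12), 16);   // low surrogate
        out += String.fromCodePoint(0x10000 + ((hi - 0xd800) << 10) + (lo - 0xdc00));
        i += 11;                                           // consumed both escapes
      } else {
        out += String.fromCharCode(hi);                    // plain BMP character
        i += 5;
      }
    } else {
      out += s[i];
    }
  }
  return out;
}

// decodeUnicodeEscapes("\\uD83D\\uDE00") => "😀", same as JSON.parse('"\\uD83D\\uDE00"')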
@@JochemKuijpers yeah, and if you also skip parsing the keys, which can contain any UTF-8 character as well, then it could be parsed as ASCII. But what's the point if you can't parse the data? :)
They just said ASCII instead of plaintext, boohoo. Don't waggle your exotic encoding knalij on our faces, you and the people that liked this :')
He frequently gives wrong information, so that's to be expected. I wouldn't want him as my professor. He spends most of the time talking about irrelevant things and drawing poor explanations. If you showed 5:07 to anyone and asked them what is written there, you'd be happy to get a confused look. In short, this video could have been simpler, shorter and better explained.
@@MladenMijatov Whoosh!
The sound a swinging pedant makes.
From the title, I thought this video would consist of pronunciation advice
me too
Yup same
WaysToSaySame.json
😂
What's the joke about XML? It's a format that can't be read by people or machines.
And yet XML is a 'simplified' form of SGML created precisely because SGML had exactly that problem.
At least XML allows comments.
As a way to define a data structure, I find XML pretty solid. But as a format to interface with, it is indeed terrible... therefore XML2JSON!
@@lawrencedoliveiro9104 .jsonc for that, or YAML
@@lawrencedoliveiro9104 Yea... the lack of comments in json kind of ruined it for me so I prefer to use ini files since usually that's all I need.
The video really ends at 0:25
😂😂😂
I felt the same
I only clicked because the title made it sound like he was gonna insist it's pronounced J S O N
@@sammbci me too, so I feel kind of ripped off for my time...
"Children tend to be an ordered collection" is probably my favourite sentence. :-)
He missed three values: true, false, and null. This is mainly notable because he otherwise covered absolutely everything you can represent in JSON. The nice thing about it is that it can represent enough structures to be useful and can't represent anything strange or subtle, which minimizes the chance that you'll have bugs in parsing.
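For anyone following along, a tiny made-up document that uses every kind of value JSON has (object, array, string, number, true/false, null) and nothing else:

{
  "name": "Alice",
  "age": 42,
  "employed": true,
  "retired": false,
  "nickname": null,
  "children": ["Bob", "Charlie"],
  "address": { "city": "Nottingham" }
}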
"Your children are a list of strings"
Personally, I would prefer them to be pointers.
NULL pointers as long as they are not born?
Or children pointers? lol
actually, only your dog could be a pointer
@@Jet-Pack wouldn't that mean knowing in advance how many children you'll have in future?
@@KuraIthys i'd go with a linked list of structs, with one element being a pointer to a string. You can even allocate that without wasting memory since you should decide their name before they are actually born!
Only problem with that is when you free(myself) the reference to them will be lost, but still... That's a problem of whoever will have to upkeep the code :D
So Jason Derulo's full name is actually Java Script Object Notation Derulo?
Indeed.
Terrible jokes like these? are like a warm crackling fire for a geek - never, ever stop.
No, notice the *a* in J*a*son?
Yes.
JSON Derulo
JSON has exactly two problems:
1. No support for comments
2. A comma after the last item in an array or object is a syntax error
Both of these cause problems when manually editing JSON files, which I know isn't what JSON was designed for, but it would be so easy to fix.
Maybe I should file an RFC
Also the ability to drop the quotation marks in some places, like JS allows, e.g. {one:"Something"} instead of {"one":"Something"}
Hah, I feel you, brother. So many Ansible scripts have produced invalid configuration files for me just because I forgot that a trailing comma is a syntax error. The lack of comments is not so much of an issue in communication, but when writing configuration files it is nice to have them. Both are, like you said, easy fixes, but they will never happen; too many integrations would need to be updated.
There is a newer format called JSON5 that addresses all of those issues. Basically bringing JSON to ECMAScript5+ world. Identifier names without quotes, single quotes around strings, trailing commas, comments, hex numbers, ...
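For reference, the features listed above are real JSON5 features; a small made-up config in that style looks like this:

{
  // comments are allowed
  unquotedKey: 'single-quoted strings are fine',
  maxRetries: 0xFF,            // hex numbers
  servers: ['alpha', 'beta',], // trailing comma is legal
}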
1. CJSON exists
Jonathan Crowder one of the issues with a "__comment" field is when the system using this JSON document also has a schema to check it against, and it starts to complain about the unknown field...
2:20 “This value can be any other JavaScript thing” except a Function, a Symbol, a RegExp, a Date, undefined, null (except in arrays), Infinity, NaN…
null is always a valid value, not just in arrays
@@slash_me oh, my bad
I think he meant any JavaScript primitive.
@@NatoNathan Symbol and Undefined are primitive types, and Infinity and NaN are primitive values, yet none of them exist in JSON
my bad, forgot about those.
JSON is a great topic! 👏👏👏There are so many things to talk about around it:
- jsonb
- links to other JSON objects (instead of nesting everything in place)
- JSON Schema
- JSON API
- JSON and the Semantic Web
- JSON-LD
It would be great to begin a series around this :)
JSON is one of those things in the programming tool box that made me feel really powerful the first time I picked it up. Up to that point I had been taught to either write all my program data out in a csv file, exporting and parsing "by hand" or use the pre-written SQL queries I'd been given to interact with a database. Having the ability to quickly pull data in and out of my program as whole objects without having to worry about how it was being written or whether all the control breaks were happening in the correct order made me feel untouchable.
I got the same feels but with integration. Literally getting data that your app can use just like that, directly from Google's servers is a pretty powerful thing. Same goes for integration with Facebook, Twitter, Instagram, etc. I couldn't believe how easy it was the first time. Really makes you appreciate abstraction.
*The numbers, JSON, what do they mean??*
Number: a signed decimal number that may contain a fractional part and may use exponential E notation, but cannot include non-numbers such as NaN. The format makes no distinction between integer and floating-point.
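Concretely, every value on the first line below is a legal JSON number, and everything on the second line is not (per the grammar quoted above):

Legal:    0, -12, 3.14, 2.5e10, 6.02E-23
Illegal:  NaN, Infinity, 0x1F (no hex), .5 (needs a leading digit), 007 (no leading zeros)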
Lol
So Alice and Bob are, in fact, friends with JSON?
@@neminem1203 True. Adam can verify this fact.
But will they help him find the Golden Fleece?
Great introduction, but a few errors:
Javascript and json have as much in common as java and javascript. JSON is inspired by javascript. It doesn't use the same parser. A javascript interpreter probably exposes a json api too, and stores the binary representation in the same way, but they are not the same. If your javascript is using the javascript parser to parse json, you are open to attack.
In json, the key strings are also not required to be unique, so it is not the same as an associative array. Any standard-compatible parser is, however, allowed to place additional requirements, so most implementations ignore this fact.
JSON is also not lightweight. It is lighter than some other text formats, but heavier than others. It is also a lot heavier than binary formats such as IFF/RIFF.
JSON is based on javascript, it's literally in the name...
@@okie9025 While it's true that it is based on a small subset of javascript, it is such a small subset, and it makes enough changes to how you can use it, that it's basically just inspired by it. In javascript, for example, you can have comments; in json you cannot. In json the string key in an object is not required to be unique; in js/chrome everything but the last one is ignored.
@@madeso in every way (except properties having to be strings), a json item is the same as a javascript object
@@okie9025 so that's 2 exceptions for the object, no comments=3, json doesn't support functions=4, it's starting to have more things that are not javascript than things that are.
@@madeso functions are different for every language and are thus compiled in different ways
also comments are not needed in a storage file, even SQL has very limited support for comments that only work in actual SQL queries
you see a lot of JSON files in games nowadays. i guess it is easier to change the settings of things when they are defined clearly in human readable form and do not need compiling to be used.
A lot of programs (games or otherwise) will have an INI or CONF file which I prefer for settings. Not sure if JSON has any real advantage over that.
Now we know the true identity of JSON Bourne.
This is another great Computerphile video! I've got a video series on JSON if anyone wants to learn about using it via Python.
I've used json to represent a graph for a decision tree pretty recently, and I gotta say it works very well for that kind of application
since i'm pretty much a newbie at programming, i'd love to hear from you how exactly you managed to do that, would you mind?
Plz do a video on the limitations of JSON (or other human-readable formats) or JSON vs. XML!
JSON Derulo uses the same notation when composing his songs
JSON is delightful to work with. So clean, concise & easily pars..e..able? I love it.
@Kevin Oh that is just *beautiful*
I'd expect to hear the word "serialization" at least once in this video.
Let's go deeper and talk about BSON, Protobuf, CBOR, etc. then?
Lightweight is a bold statement without comparison to alternatives. Just compare JSON and CBOR datagram sizes for the same data. You'll notice that CBOR produces much lighter data sizes.
Dude chill out.
@@sebastianpopa7943 the thing is that there are a lot of people out there who will watch this and remember it like it was said in the video.
@@SergeMatveenko I see your point. But this is barely a preview/presentation of JSON. Computer scientists who'll actually work with JSON will dive deeper for more information anyway, and for a non-programmer this is the most "human" way to introduce JSON. So everyone is safe. (sorry for bad English)
This is the 'What is JSON?' video for people who don't actually know what JSON is and it's doing a really good job at that.
@@sebastianpopa7943 he's not even angry. Don't tell people to chill out if they're not angry
5:13 to 5:20: "It is used for transferring data in a way that is pretty much _agnostic_ to the programming language being used."
That was a beautiful and figurative way to use the word "agnostic"!
JSON at home: Jason
"Jason's JSON."
JSON is almost good. It should have been specified to allow trailing commas though.
The fact that there is a dispute about this is an argument against the entire concept of using anthropocentric data structures to transfer data between machines.
+ if you indent with tabs, it doesn't parse
@Ark Osf Absolutely.
Machines are infinitely better than humans at generating human-parsable code that is also machine-parsable.
The rubegoldbergian insanity is that in a properly working program, none of this human-parsable code is ever going to be parsed by a human.
It's a small miracle that cryptographic key exchanges aren't done in Roman numerals.
Tbh comments would be great as well
@@AlaaZorkane Both tabs and spaces are allowed outside of values, but need to be escaped inside strings as `\t` for tabs and `\n` for newlines.
There are of course other data exchange formats that are agnostic to the application they're used in, and fairly human readable, such as YAML. I think the reason JSON is preferred is because the way it declares data types is more intuitive to programmers, as strings must be declared in quotes, numbers are always bare, arrays are in square brackets, and so on... I've used YAML several times, and while it's nicer to look at for something like a config file, half the time I forget the syntax for everything in it.
Next video: JSON vs XML
The _Alien vs Predator_ of data interchange formats!
@@lawrencedoliveiro9104 more like Avengers vs Justice League
@@iii-ei5cv Yes, but which John Steed sidekick? Tara King? Emma Peel? Cathy Gale? Or one of the older ones?
Different tools for different jobs.
No contest. XML is complete trash.
I hadn't realized until now that computerphile's videos start with the computerphile tag and end with the closing tag.
not all... :) >Sean
No video is good without Alice and Bob!
One possible solution for representing a graph (or a tree with optional parent references) is using an indexing system, because you cannot reference parent nodes directly. Let's see a UK road map:

{ "graph": [
    { "city": "London",     "roadTo": [1, 2, 3] },
    { "city": "Bristol",    "roadTo": [0] },
    { "city": "Manchester", "roadTo": [1, 2] },
    { "city": "Nottingham", "roadTo": [4] },
    { "city": "Liverpool",  "roadTo": [1, 3] }
] }

So London(0) has a road to Bristol(1), Manchester(2) and Nottingham(3). To get from London(0) to Liverpool(4), you have to connect in Nottingham(3). graph[0] is the root node of the graph.
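A rough sketch of walking that structure in JavaScript, assuming the document above has been parsed into `data` (the function name is made up):

// Breadth-first search over the index-based "roadTo" links.
function findRoute(data, fromIdx, toIdx) {
  const nodes = data.graph;
  const queue = [[fromIdx]];           // each entry is a path of node indices
  const seen = new Set([fromIdx]);
  while (queue.length > 0) {
    const path = queue.shift();
    const last = path[path.length - 1];
    if (last === toIdx) {
      return path.map(i => nodes[i].city);
    }
    for (const next of nodes[last].roadTo) {
      if (!seen.has(next)) {
        seen.add(next);
        queue.push([...path, next]);
      }
    }
  }
  return null; // no route exists
}

// findRoute(JSON.parse(text), 0, 4) => ["London", "Nottingham", "Liverpool"]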
I can't imagine anyone watching this channel and wondering what JSON is.
and yet here i am
I just want to know who this JSON is and why he keeps breaking my stuff.
PRESS "X" TO "JSON"
JSON
JSON
JSON
JJJJJSOOOOONN
Anthony Ingram I get the reference there...
}}}}}]}}}]}}}]}}}}}}]}}]}}}}
The file has been stolen.
That it is not JaSON is the real tragedy here. What kind of computer person ignores such a pun when it's handed to them on practically a gold platter? The a isn't just convenient, it's right there in the spelled-out word. But then again, there is this English guy who took the longest-sounding letter in the English alphabet and tripled it just to name something. :-P
DJDoena XD
The biggest crime against nostalgia is that Eve wasn't listed as one of the children.
Eve is classically the eavesdropper in the scenario involving an exchange from Alice (point A) and Bob (point B). Using the name Eve in the context of this video would be unfitting.
@@smergibblegibberish Yes, Eve would be the eavesdropping sibling. ;)
solidly equipped with older gear for data recovery, properly placed at the windows.
XML + XSD + JAX-B = Heaven. I so wish JSON had something similar (JSON schemas and Java API like JAX-B)
Since JSON could be used in multiple programming languages, isn't it time JSON gets a new name?
You should have shown a pretty-printed on-screen example. I work with JSON quite a lot and those sketches were a bit all over the place.
2:45 Regarding arrays, it is possible to represent arrays as general objects, i.e. ["Alice", "Bob", "Charlie", "Derek", "Esmeralda"] could be represented as {"1": "Alice", "2": "Bob", "3": "Charlie", "4": "Derek", "5": "Esmeralda"}.
Of course, the disadvantage is that it introduces the issues objects allow, like the possibility of non-contiguous indices, 0/1-based indexing mix-ups, index transformations (string vs. integer), etc., so upon reading the JSON the object would need to be checked to confirm it really is an array and/or stripped of its non-array features.
Those are still separate data types. One is indexed while the other is a hash. They are not equivalent because the index in Arrays are integers, whereas keys in objects are strings. So you'd have to coerce the key before it could be used interchangeably (in JavaScript).
Even worse, for other languages that parse JSON, the deserialized data types are different and would need further conversion to become compatible with one another.
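A quick sketch of that cleanup in JavaScript, for the keyed-object form shown above (assumes contiguous 1-based string keys as in the example; the helper name is made up):

// Turn {"1": "Alice", "2": "Bob", ...} back into ["Alice", "Bob", ...],
// rejecting gaps and non-numeric keys along the way.
function objectToArray(obj) {
  const keys = Object.keys(obj).map(Number).sort((a, b) => a - b);
  const result = [];
  keys.forEach((k, i) => {
    if (!Number.isInteger(k) || k !== i + 1) {
      throw new Error("not a contiguous 1-based array, bad key: " + k);
    }
    result.push(obj[String(k)]);
  });
  return result;
}

// objectToArray({"1": "Alice", "2": "Bob", "3": "Charlie"}) => ["Alice", "Bob", "Charlie"]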
Sun workstation? Might as well talk about the Atari ST.
I have a Sun workstation that I still use, though it is an amd64 based one.
Considering there *is* at least one functioning Atari ST in that room...
who the hell is Jason?
didn't JSON come about as a response to XML in that you could do "var data = eval(...);" on it and let the JS parser do the work for you? obviously the security implications won the day and JSON.stringify/JSON.parse came about, but I thought it was born out of that ad-hoc convenience.
JSON as data exchange method became popular since Javascript had no built-in way to parse XML data and dumping out complex data to the browser as directly parsable JS code was simply the easiest method.
Programmers also like it for making a clear distinction between numeric values and text strings, which XML doesn't.
I don't think so. Its primary purpose is structuring objects, considering JavaScript doesn't have concept of classes. At some point people just started using it for communication since serializing and reverting it back to object is pretty straight forward and it maintains structure you are using in your code anyway.
You know you're a fossil when you mention a Sun workstation...
In this day and age, where the heck did you find tractor fed greenbar????
Lightweight JavaScript: please ignore the 350+ MB of RAM occupied by an instance of Chromium to run me.
You can run JS outside web browsers in Node.js... besides, he said JSON is lightweight, not JS (which actually is lightweight in the aforementioned Node.js).
Web browsers are really slow, bloated virtual machines. Node.js is lovely, and it's 'fast' because it's essentially compiling the code before it's run. You know what's even faster? A real language with a COMPILER. I'm sorry that BEtoN is so hard to type that everybody throws a ton of their compute resources in the bin.
@@JohnnyWednesday you can compile JS to machine code. JS is the real language and most others are frauds
@@willtheoct r/gatekeeping
@@willtheoct - said exactly like an amateur coder that only knows javascript ;)
It's called serialization. A program has some data (or code) in memory but dumping the contents of the memory wouldn't produce a usable format to store the data and then read back. You need some kind of representation for it, that is easy to read and parse. Creating such a format for every use case separately is a waste, so it's often better to use some preexisting serialization library to do it.
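A minimal example of that round trip in JavaScript, using the built-in JSON functions instead of a hand-rolled format (the data itself is made up):

const settings = { volume: 7, fullscreen: true, recentFiles: ["a.txt", "b.txt"] };

// Serialize: in-memory object -> portable text.
const text = JSON.stringify(settings, null, 2);

// ...write `text` to disk or send it over the network...

// Deserialize: text -> a fresh object with the same structure.
const restored = JSON.parse(text);
console.log(restored.recentFiles[0]); // "a.txt"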
Sun? IBM? You might be living in the wrong decade mate ;-)
He probably works on supercomputers.
I really wonder what the name of the 12th child would have been.
Lucas?
What about Jay’s son?
Anybody else get the JSON error message uploading thumbnails to YouTube with Chrome? Works fine with Edge and Firefox; just another example of YouTube's impressive enhancements since the focus on the Studio Beta fiasco.
I am looking for a journal that hosts papers from the very beginning of computer science. Can anyone help me on this matter? Thank you very much in advance.
Solid end credits joke.
I’ve always wondered- what’s the type of paper that he’s writing on? I keep seeing it in Computerphile vids
What about Jason Schemer? Seem to hear a lot about that bloke these days
Has Computerphile ever made a video about JSONP ?
How do you guys pronounce "json": json or json?
json
exactly
So many computers in one room
I'm deaf... and I need caption for this video.
It has captions
Amiga 1000 behind the guy ? 👍
JSON is about as useful as a delimited text file.
There's no advantage of any sort to using it instead of delimited or fixed-width text; that's why most standard communication software does not use JSON but delimited text for interface functions.
The reason is easy to understand: what is difficult in data communications is not the format of the file containing the data, but the nature of the data transferred.
So it doesn't matter whether the data is transferred as plain text, JSON or XML; it is a question of what data is contained in the file.
Is there any reason why you don't present the name of the guy in this particular video?
Love your videos. Can you guys do one on wireshark? Or intrusion detection?
5:25. Proof 2pac is alive
Worst curly braces in computer history.
And almost as bad explanation.
@@MladenMijatov couldn't be worse than your hairline
Yeah... next video: "How to correctly draw curly braces"
This explains how to increase sales for companies that make RAM and processors.
[X] JASON!
Thank you! I need to know what JSON is.
I feel sarcasm:)
Serge Matveenko actually no, no sarcasm.
@@FandCCD Sorry, I felt like you're referring to the fact that this thing has been around for almost two decades already.
Serge Matveenko no prob. I’m new to the scene.
@@SergeMatveenko Something that "everyone knows" is new information to a lot of people every day
i usually just write my own damn parser and call it a day. so long as you include the same header at both ends it works.
React Redux uses JSON to store state and it's awesome ;)
Any state management library for JavaScript uses JSON to store state :) That is fundamental to JavaScript and to what its building blocks are (objects).
ui_wizard still awesome ;)
Please make a video, or a series of videos about packet encodings, how data is sent in packets via a network, and how to build a packet sniffer!
Building a packet sniffer seems kinda out of scope for a computerphile video.... It's not simple in the slightest and definitely spills out of the 5-15 minute bites they normally put out.
A config file in this format is very human-readable in Notepad without a syntax highlighter, almost as clear as an INI, perhaps even better because the structure is visually nested. XML is almost impossible to read, and you have to type all the repeated attributes for every value, plus the symbols, all with the right hand. Most XML was clearly never intended to be edited by people, but is still saved to disk in a verbose text format.
No standard-defined way to store binary data though. You may send an array of ints, or a base64-encoded string or a string of hex digits, but there's no native built-in way in Javascript to encode a binary segment.
You would store the structure that the binary data represented.
@@lawrencedoliveiro9104 So how'd you go about sending, say, a PNG image over JSON?
@@sharpfang Encode the chunks.
@@lawrencedoliveiro9104 So, first I take the PNG image from disk on the sender computer, decode it, disassemble into components, send them over JSON, including the image data as arrays of decimal values, then reassemble on the other end before displaying as a notification icon?
@@sharpfang Represent the entire structure of the file as JSON structure.
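The more common workaround, though, is to base64-encode the raw bytes into an ordinary string field rather than pulling the file apart. A rough Node.js sketch (the file and field names are made up):

const fs = require("fs");

// Sender: pack the raw PNG bytes into a JSON-safe string.
const bytes = fs.readFileSync("icon.png");
const message = JSON.stringify({
  filename: "icon.png",
  dataBase64: bytes.toString("base64"),
});

// Receiver: recover the exact same bytes.
const parsed = JSON.parse(message);
const recovered = Buffer.from(parsed.dataBase64, "base64");

The trade-off is roughly a 33% size overhead compared to the raw binary.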
You called?
What's even stranger is why we are using JSON when ASN.1 solved this problem space better 35 years ago.
We use JSON because ASN.1 does not solve the problem JSON solves. ASN.1 is a schema language which we use to describe how the data will look and then allow an application to serialize or deserialize data according to that schema, usually by compiling the ASN.1 schema into code in that language. JSON on the other hand is simply a data representation format.
The reason this distinction matters is that sometimes you want a fixed schema, with rules on how data is to be represented and bounds on the values, and other times you specifically need it to be very free-form. If you are not sure what kind of data might be fed into your program but need to represent it in some way, JSON or MessagePack might be for you; on the other hand, if you will always have the same set of fields and data types in those fields, then Protocol Buffers or ASN.1 might be for you. As someone who uses both schema'd and schemaless data formats in software I write, I can honestly tell you both are useful in their own regard.
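To make the schema'd side concrete without leaving JSON land, this is roughly what a fixed two-field record looks like in JSON Schema (the fields are made up; JSON Schema is a separate spec layered on top of JSON):

{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "properties": {
    "name": { "type": "string" },
    "age":  { "type": "integer", "minimum": 0 }
  },
  "required": ["name", "age"],
  "additionalProperties": false
}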
Every time I read json, my mind "the numbers, json. what do they mean!?"
"the numbers json"?
@@xybersurfer I corrected the sentances. Anyways, it's from CoD black ops. But He says Mason instead of json..
@@RocketLR oh, i see. then it makes sense that i don't recognize it
JSON?
JSON..?
JSON!?
JSON!
Gotta love how we have up to 7 kinds of the Universal Serial Bus used at once, but the JavaScript Object Notation is a standard that works pretty much globally.
can we talk about why typeof [ ] == typeof { }
They are both objects in JavaScript, so it's true. All values in JavaScript are. There are some primitive types like `number` and `string`, which values of those types resolve to when using `typeof`, but they act as objects as well.
All values have a prototype, which is inherited from Object's prototype.
Another thing to notice is the use of == (loose equality) rather than === (strict equality). Loose equality converts types beforehand, so 1 == '1'. Strict equality compares both the type _and_ the value, without type conversion, and therefore 1 !== '1'.
If you want to check whether a value is an array and not a plain `object`, use Array.isArray.
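A few lines to try in a console (plain JavaScript, only the sample values are made up):

typeof {};            // "object"
typeof [];            // "object" (arrays are objects too, hence the question)
Array.isArray([]);    // true
Array.isArray({});    // false

1 == "1";             // true,  loose equality coerces the string
1 === "1";            // false, strict equality compares type and value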
JSON is great for computing in general in my view, but is no replacement for RDBMSs #DataIntegrity #WhyWaitToWriteToDB
Dr. B here even hinted at it if you caught it, when he remarked that it'd be interesting to see a graph like structure with a bunch of relationships to one another. That's where Relational Databases shine, foreign keys and all that.
Plus the fact that in some NoSQL DB architectures you can wait to write data, not having to at the instant the operation is handled, which can lead to trouble with data integrity... all for fast benchmarks... which of course JSON offers with its universal and small "packets".
What? No.
do you still use a tractor feed printer, or are you just using up all the tractor feed paper you had left over from 30 years ago?
1:01 IBM doesn’t make PCs any more. I think you meant “Microsoft-compatible PC”. After all, Microsoft has been the arbiter of “compatibility” ever since Microsoft Flight Simulator became the original benchmark for this back in the mid-1980s.
Eh, as a longtime Linux user I'd rather we didn't define PCs in terms of Microsoft products...
PS. Nowadays Apple computers are also almost-PCs in my book, just locked down. They have identical architecture and they can also run Windows (with Boot Camp), so your definition crumbles. What differentiates them from PCs is their locked-down nature (less than in the past), since interchangeability is a defining characteristic of PCs.
Sorry for being pedantic :D
@@jasondoe2596 That’s the reality of it, though.
Nowadays we are seeing other types of PCs, ones that owe nothing to Microsoft, start to become more popular, the obvious example being the Raspberry π family.
SEAN!
No, it's Jason
Text encoding is an inefficient compromise. I don't want to make things easier for amateurs; I understand and can handle endianness perfectly fine. Web browsers nowadays are just very inefficient virtual machines. If we were to design a system that could give us youtube/facebook today, it'd be pure binary, and it'd run full speed on a fraction of the hardware you need today. A screen with layout, text and images shouldn't be maxing out all the cores of a modern desktop system. Webpages already are applications running in a virtual machine and barely anybody is using a text editor to make them anymore, so why are we wasting epic amounts of computational power on something that should have been implemented like java/flash from the very beginning?
Quite. Text formats should only be used where they are expected to be read and written by humans. Having written both a parser for a toy programming language while at uni and a binary loader for Windows BMP on a just-for-fun project, I can say it's a lot more work writing the parser than the loader. Using text formats when the files are never going to be human-written is just masochism.
Because a lot of people have issues running a random executable on their own machine. I'll happily sacrifice some performance to gain security.
In related news, I block all online java/flash content.
Don't go JSON waterfalls
How do you read an element in JSON? Say I want to know the name of Alice's third Kid. How would I find him if there are no indices?
For any such indexing requirements, JSON on its own won't suffice. It's only a way to describe relationships and the like. You would need a NoSQL engine working on top of this data, which then creates indexes and so on.
You can JSON.parse() and then index it like a normal JS object
Depends on what you mean by "third". The 3rd that exists in the data? the third that was born? The 3rd best liked?
You can easily traverse a tree, directly look up the n'th instance or search by key.
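For the original question, a tiny example (the field names are guesses based on the structure sketched in the video):

const doc = JSON.parse('{"name": "Alice", "children": ["Bob", "Charlie", "Derek"]}');
// Arrays keep their order, so "the third kid as listed in the data" is just index 2:
console.log(doc.children[2]); // "Derek"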
Compared to written material on the web, that was quite a poor explanation.
Anyone who didn't already know what JSON is wouldn't get it from this.
And anyone who does know what it is will watch it itching to rush in and add their own comments or better examples.
KISS, love it, too easy
So how would I reference an object that I already defined before? E.g. the second parent of a child.
JSON is a purely declarative language; you cannot put code in it. One thing you can do is assign a unique id to each node, and then reference the node via its id. Later you build an index where you can address objects by id, but you can only do that from code; it cannot be part of the data. For example: { "id": 1, "children": [ { "id": 2 }, { "id": 3 }, { "id": 4, "parent": 1 } ] }. That's how PDF does it as well.
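A quick sketch of building that id index in JavaScript after parsing (the function name is made up; it assumes the id/children/parent fields from the example above):

// Walk the tree once and collect every node under its "id".
function buildIndex(node, index = {}) {
  index[node.id] = node;
  for (const child of node.children || []) {
    buildIndex(child, index);
  }
  return index;
}

const root = { id: 1, children: [{ id: 2 }, { id: 3 }, { id: 4, parent: 1 }] };
const byId = buildIndex(root);
byId[4].parent;        // 1
byId[byId[4].parent];  // the root node again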
How did this have anything to do with JSON?
It's Computer Phil not Computerphile!!!
JSON not JSON
How does it handle data types? Is "value": "42" the string 42 or the number 42?
Just like in JavaScript: quotes mean a string, no quotes mean a number.
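A two-line check of that in a JS console:

JSON.parse('{"value": "42"}').value;  // "42"  (a string)
JSON.parse('{"value": 42}').value;    // 42    (a number)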
I love JSON for embedded, because you have very limited memory in embedded devices.
Thanks for the video!
This video presents JSON as a data format that is not bound to any software. That's not true; it's just a very commonly accepted data format, *but you still have to have software that accepts it*. In that respect it's no different from any other data format.
The hardest part of programming is to draw curly brackets correctly
The children that Steve "generate" are in alphabetical order by name. I seriously doubt that is accidental.
i think he was just making up a name as he went down the alphabet