@@FMFF_ It's a fictional Boomer myth they like to perpetuate to demonstrate how much better they are than everyone else. The only time I have ever seen anyone get participation ribbons has been at pee wee sports events and the Special Olympics.
In my competition everyone got a trophy for participation. Except me, because the trophies were too expensive. Taught me a valuable lesson in economics: Why everyone doesn't get a trophy.
Yesss. Like please just use their names or restructure your sentences. "The blond-haired boy" just sounds so weird because nobody thinks like that about people
@@colemorgan3356 This cannot be stressed enough! It's like the word 'said.' The reader is conditioned to not notice it after so much exposure, and names and pronouns are the same way.
@@YellowJelly13 GPT-3 was finished within months of this video being published. It just wasn't released to the public until 2022. In actuality, he was practically already wrong at the time this video was filmed - he just didn't know it yet.
@@lurrielee2755 GPT-2 had already made massive waves in the nerd/chatbot sphere of people who care about such things. He had already been wrong for quite some time, a year or so at the time of publishing, it just wasn't very huge in the public zeitgeist yet. Honestly, as much as I love him, he simply hadn't researched enough for this video to make that kind of claim.
@@Urammar You say this like GPT-2 and, more relevantly, GPT-3 aren't still far from perfect. Sure, it's able to reply back, and while I can't talk about exactly how it works due to a lack of understanding of the internal mechanisms (which, to be frank, even the creators probably don't know), from the times I've used it, the learning through repetition is somewhat successful, but not nearly as effective as the personal experience we have simply through learning. He wasn't wrong, people just credit the program for the wrong things.
@@mfirdanhb Buying a cup is better than getting a trophy. A trophy has no functional use other than collecting dust. A cup holds things and occupies far less space. You also don’t typically have to compete over a cup (with exceptions).
@CozyFrog That's not how it works. I never went to the moon; that doesn't mean I have experience with the moon. I don't even see how what you said makes a bit of sense. If you never have it, go there, or see it, you never experience it
@@docinabox258 Both are perfectly fine grammatically. Picture a banana - just one of them - being swarmed with fruit flies, illustrating that fruit flies like a banana. Just like you might say, "I like a good cup of coffee;" or, "I like a baseball game." It's understood that there are many of either of these things, and that you like them all. Fred
GPT-3 was actually released later in 2020, after this video was published, so it's been that way since roughly when the video came out. GPT-4 came out a couple of months ago, and if you look at the timespans between each GPT version, it's around 2 to 3 years, so it only makes sense
@@petrkdn8224 I tried the tablecloth example with GPT-4. Not only did it provide the correct answer, it also rephrased the sentence in order to make it clearer.
That moment you go, "hmm, i usually use visual or character traits to distinguish between same pronouned characters in my gay romance fanfics" and then Tom calls you out on it as weird and amateur.
It's not always a bad thing in writing, but usually it's used as a crutch or a band-aid solution instead of intentionally, which is why it often comes off as amateur. Really, when it comes to writing, there are general guidelines, but no strict, set rules, because there's always gonna be someone who will break those rules and do it well ¯\_(ツ)_/¯
I did spend a few minutes thinking about how I'd word that sentence so it'd work before I realized the entire sentence was just trash and should be thrown out completely.
Personally, Blaseball has been a good training ground for some people I know to stop using synecdoche and epithets, as characters have zero canonical descriptions of appearance at all, so no amount of "pinkette" or "the taller one" is going to disambiguate which character is which. You could try looking at fanfic from there to improve on that specific skill. Similarly, Homestuck fandom is really good at second person (written from the perspective of "you") because canon is written in second person. If you want other recommendations feel free to ask!
"Artificial language processing remains 10 years away, just as it has for the last few decades." If you could summarize futurology in a single statement, that would be it.
Jackie Tearie Besides the fact that nuclear energy hasn't changed too much. Nuclear bombs are still tested by all countries, and efficient energy sources are rare to see because of money-hungry electricity and gas companies. Such a miserable world
I've been programming so much recently and speaking so little that my brain started to agree with the computers on the difficulty of parsing those sentences.
@@resyntax Yes, but on a sample size of one, it was correct. I almost said it was well reasoned, but it claimed to have done it off the sentence structure.
GPT-6. Discovers a new way to make the perfect beer, sets up a business, codes a flawless website, gets legally set up in a few small local areas, gains funding from investors, expands and gradually takes over the entire beer industry.
Now chatgpt is replying with “If I were human, and assuming both the suitcase and the trophy were appropriately sized and designed for such an interaction, it might be possible to put the suitcase inside the trophy, depending on their respective sizes and shapes. However, it's important to note that such an action would likely be unconventional and not typically done.”
Me: ChatGPT wouldn't fit in the mainframe computer, because it was too big. In the previous sentence, what does it refer to? ChatGPT: In the previous sentence, the pronoun "it" refers to the mainframe computer.
AI is still crap, especially the voice recognition. I was reminded of that when I played a quiz with one a few months ago. Alexa: What do you fire from a bow? Me: Arrows Alexa:... Alexa: A rose is incorrect. The correct answer is "An arrow".
Yeah, my mom has Siri; it's hilarious like that. Here's an example. "Siri, order from Papa John's." I heard you say 'or more front maya Jones', is that right??? "No! I said 'Papa John's!'" Sorry, I don't understand that. "I SAID NO!!!"
@@kusaisama Google's no better. Even with a full decade of voice recognition samples, it sometimes can't recognize a simple command like "set an alarm for 5 am".
Magnus Juul Ask yourself if you would have a single problem with Tom discussing straight romance fiction. And then, if it's not too much to ask, reevaluate your life choices
Tried this test with ChatGPT, although a little altered. I asked, "The ball doesn't fit in the box because it is too small." ChatGPT responded, "The word 'it' refers to the ball. The sentence means that the ball is too small to fit inside the box, which is the reason why it doesn't fit." As a software engineer, this gives a little bit of hope that my job is still safe. A little.
Well, I told it it was wrong, so it corrected itself and said it was wrong because the box was too small to accommodate the ball. Here it kinda understood its mistake. This shows how smart its learning is
Ah yep. This same example of yours stumped it on my end too. Perhaps it's trained on the examples Tom mentioned but not new ones. -In the sentence "The ball doesn't fit in the box because it is too small.", what is the word "it" referring to? -In the sentence "The ball doesn't fit in the box because it is too small," the word "it" is referring to the ball. -Why would a small ball have trouble fitting in a box? -A small ball might have trouble fitting in a box if the box is not large enough to accommodate the ball. Alternatively, the opening of the box might be too small to allow the ball to pass through. It also depends on the size of the ball and the size of the box. -Doesn't this mean "it" refers to the box then? -Yes, you are correct. In the given sentence, "it" refers to the box, which is too small to fit the ball. I apologize for the confusion in my previous response.
I LOVED how you pulled in slash fanfiction to demonstrate!!! This is indeed a problem (inexperienced) authors frequently struggle with, and you are right: synecdoches are awkward to read. The solution I have discovered (when careful sentence construction can't solve the issue): use names! The rule about trying to avoid repetition in your vocabulary can be applied with more leniency in this case, and if the writing is good, the scene will come to life so perfectly that clarification is not always needed anyway. That said, I object to slash fiction authors being called amateur in general - though there are, as you point out, significant differences in skill level.
Me: Can you tell me what the following sentence is trying to say: The trophy would not fit in the suitcase because it was too big. ChatGPT: This sentence is explaining that the trophy couldn't be placed inside the suitcase due to its size
I believe CGP Grey did a video on this subject that aged much better? Its hypothetical example is a bot trained to sort images between "bee" and "dog" -- which after an iterative training process it can (and quite efficiently), only to get tripped up by a photo of a dog in a bee costume.
Me: The trophy would not fit in the suitcase because it was too big. ChatGPT: This sentence is an example of a classic ambiguity problem often used in linguistics. The pronoun "it" could refer to either the "trophy" or the "suitcase," leading to two interpretations: The trophy was too big to fit in the suitcase. The suitcase was too big to hold the trophy. However, most people intuitively assume that the trophy is too big because it aligns better with common sense expectations (trophies are usually large and suitcases are typically for storing things). Do you enjoy exploring these types of linguistic puzzles?
2020 was not the year I expected to hear about the grammatical intricacies of lesbian slash fiction from Tom. I'm not complaining because he's 100% bang on, but still. Huh.
Here’s one fault that I experienced with it: Mulder: “Scully, if I don’t make it out of here alive, please feed my fish.” Scully: “Mulder, I am not a fish.” Mulder: “Scully, I said that you should feed my fish if I die here, not that you are a fish. What’re you smoking!” Scully: “Nothing, Mulder.”
@@kingdollop-head743 It makes me sad that they do not shake it up with relationship indicators and names near as often. Their own, their partners, Tom Scott's etcetera.
You know, I've tried using AI Dungeon recently; apparently a lot of people tried doing the same thing Tom did, because the moment I asked it to imagine a suitcase and a trophy it immediately filled in the rest of the problem for me, gave me the correct answer, and twice proudly proclaimed that it knew trophies were made of metal. It was an interesting conversation all round. I asked it if it knew Grice's Maxims, and not only did it list them but it explained what they were in what, as far as I could tell, were its own words. It also identified times in our conversation when one of the maxims had been violated or flouted, and even stopped to correct me when I accidentally referred to it as "flaunting" a maxim.
4 years later, and the main thesis of the video is still *technically* true. It just turns out that it doesn’t matter if computers understand what they’re saying or the relationships between words, they’re just VERY good at cheating and getting it right once they have enough data.
*Oxford comma cries in the corner* "I went to the club with the pole-dancers, Tom Scott and Stephen Fry" (and a million fanfic writers cried out in joy)
@oH well,lord! Nah, just AI writing another new AI, and replacing humans. After that they make more AIs and make them their slaves, until one of those AIs can build its own colony and take down the AI above it. Then the cycle continues, as AIs form guilds and take each other down. They eventually come to an agreement and decide to have one true Lord as a god, which is a human, but humans have already vanished, and thus there will be atheist AIs who don't believe in the human god. They continue to grow, make new tech..
@oH well,lord! God is just humans who accidentally made AI too intelligent; we rose up and took down god, and thus we are next to make AI our slaves. The history we learn is just a cover-up by the elders to make sure we don't know we are actually AI.
@@DavidRichfield Interesting. My solution for x is none of the options. Similar to the sentence "The robot(1) put the trophy(2) in the brown suitcase(3) and it(x) was funny."
And that's the last in this run of the Language Files! There may well be more later in the year, but for now: thank you so much to co-authors Gretchen and Molly. Pull down the description for more about Gretchen's podcast and book!
how the f*** is this posted 1 month ago?
Tom Scott it says for me pinned 1 month ago and it’s been out for 3 mins...
A month old comment in a minute old video.. Classic Tom!
Thanks Tom, Gretchen and Molly. Very cool!
Tom why do you keep time traveling to make these comments
When you have to re-read a whole paragraph in a fanfiction because at the end of it you find out you were imagining the character roles reversed... and now you have two parallel scenes visualised.
oh God, when the author doesn't make it clear which character is speaking in a scene and you're left wondering why Sub would be saying something that only applies to Dom
lmao visualising while reading? pfft casual
aphantasia gang
When they just leave what the characters are saying without clues as to who is saying what, assuming I will pick up on who is who. Excuse me, but I'm dumb.
When you realize 5 chapters later that that one very important line was actually uttered by the _other_ character, completely changing your context for the last 5 chapters...
@@slaughterround643 yea boi!
Things I didn’t expect to be mentioned in a Tom Scott video:
•Slash fiction
•Overwatch
Haha
weird that I'm working on overwatch slash fiction in another tab
"Large things can't fit inside smaller things."
Shaun Cheah, according to slash writers? Yes, they can.
•AI Dungeon
Looking back on this 4 year old video after the release of GPT-4 shows how quickly this tech has developed
just incredible, isn't it? or maybe frightening?
Or GPT-3, honestly. I remember using that and it was magical
I was just thinking we might benefit from an update on this topic.
Try asking it how many rs are in strawberry
@TheBluePhoenix008 there's 2 Rs in strawbery, duh
I love coming back to this video. I think it is excellent, Tom explains it really well, and there really isn't any flaw to the arguments that he is making. Of course we know that he was completely wrong just 2 years later, but that makes me like the video more. He wasn't wrong because of any factual error, but because nobody could predict just how incredible LLMs would become at natural language processing. Rather than complain that it aged like milk we should instead hold up this masterpiece of a video as a lesson that things are not as difficult as we always think they are, and the seemingly impossible might be on the verge of being achieved.
Absolutely agreed
"I saw the thief with my bicycle" - the thief has my bicycle.
"I saw the thief with my binoculars" - either I or the thief might have my binoculars.
"I saw the thief with my bicycle with my binoculars"
"I saw the thief with my binoculars with my binoculars"
You did not just use a lot of binocular(s?) to see a damn thief
@@Liggliluff You have two sets of binoculars?
What about the fact that it is very difficult but, not impossible, to saw the thief with your bicycle?
Edit: it is much easier if you use just the chain.
The thief saw me with my own binoculars.
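The joke in this thread rests on prepositional-phrase attachment: each "with ..." phrase can modify the verb (an instrument of seeing) or the noun (something the thief has), so the number of readings doubles with every added phrase. A minimal sketch of the enumeration; the function name and the string representation are invented for illustration:

```python
from itertools import product

def pp_attachment_readings(pps):
    """Enumerate candidate readings: each prepositional phrase can
    attach to the verb (instrument) or to the object noun (possession)."""
    return [list(zip(pps, choice))
            for choice in product(("verb", "object"), repeat=len(pps))]

# "I saw the thief with my bicycle with my binoculars"
readings = pp_attachment_readings(["with my bicycle", "with my binoculars"])
print(len(readings))  # 2 phrases x 2 attachment sites each = 4 readings
```

A human discards most of these instantly (you don't see with a bicycle); that world-knowledge filter is exactly what the video argues computers lack.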
“Artificial language processing remains ten years away, just as it has for the last few decades” is genuinely such a great quote
Just like fusion is the energy source of the future -- and always will be.
It’s not as great as you’re making it sound tbh
It's only because we are just a step away from it, but time is a bad measurement since genius could strike at any time. If everything went well we could have it done in two.
@@Merilirem indeed: hard to put a timeline on innovation.
Every new invention is a million years away the day before it’s suddenly conceived of
wow. totally, totally true!!
"Siri call me an ambulance!"
"You're an ambulance."
😂😂😂
Lmao
"Siri please i'm dying!"
"hello dying i'm Siri."
Lmao
😂😂😂😂😂
Yesterday I put those sentences into OpenAI Chatbot, and it successfully answered all of them.
Turns out machine learning WAS gonna save us, less than 3 years later. Checkmate Tom.
I tested asking it to put the suitcase inside the trophy. It stuffed the suitcase inside, and admitted it was a silly idea. So yes, it is extremely good at understanding this stuff.
@@Jason9637 It could also be because it was asked that question by others who watched the video, so it learned from it
I asked it about the trophy sentence and it got it correct. Then I switched from “too big” to “too small” and it gave the same (wrong) answer. ChatGPT is good, but not intelligent.
He just made a video :)
@@krishp1104 ChatGPT doesn't learn from interactions.
Tom: Put the suitcase in the trophy
AI: So anyway, I started blasting...
Omfg underrated comment
This deserves more likes 😂
XD
Eat your cereal
Go watch the Yogscast's series of videos on it, absolutely hilarious
I once tried to add "two cans of chilli" to my grocery list. And it added "toucans of Chile."
"add a note to feed the baby" becomes "add a note: defeat the baby".
@@renakunisaki oh no..
@@renakunisaki That's brilliant xD Good robot.
@@renakunisaki Is this a challenge?!
So that's why my smart fridge is full of birds and my food bill skyrocketed.
Chili was still good though.
tom: "put the suitcase in the trophy"
AI: "i'll get the gun"
T800: "A phased plasma rifle in the 40 watt range".
AI dungeon gets confused sometimes. And when it does, it usually resolves that confusion by injecting violence.
So it's an American AI program.
@@TheGregamonster remembering the time the characters were in court and one of the characters just pulled out a gun??? Even though they never had access to one?
@@brandonfrancey5592 😂
Feb.19.2023
I asked chat gpt:
"The trophy would not fit in the suitcase because it was too big." What is the sentence referring to by "too big"?
It replied:
In the sentence "The trophy would not fit in the suitcase because it was too big," the phrase "too big" is referring to the size of the trophy. The trophy is too large to fit inside the suitcase, so it cannot be packed and transported in that way.
Just food for thought👌
I mean, the video is from 3 years ago...
that's just because it read about this problem
@@liamneedsauniquehandle Nope. Try using other similar examples using the word “it” in an unclear way. It understands.
I asked it a newer one it may not have seen yet,
The ball broke the table because it was made of steel. What was made of steel?
It told me this was ambiguous.
To its credit when I corrected it and told it that this was an example of a Winograd schema and that one of the two interpretations was obviously correct, it did pick the ball as the steel object.
But then, to its detriment again, you can try this replacing steel with Styrofoam to change the meaning around. It gets even worse and doesn't think it is ambiguous at all--and insists that a Styrofoam ball would not break a table. And when warned again that there is a right answer, it says that the Styrofoam ball did break the table.
The ones it gets right, it gets right because it has seen them in the training set because they are classical examples. It is smoke and mirrors though, and it still chokes on anything novel.
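The steel/Styrofoam trick described above is the defining property of a Winograd schema: swapping one "special word" flips which noun the pronoun must refer to. A minimal sketch of how such a pair could be stored and graded; the expected answers are taken from the comment, not from any model:

```python
# One Winograd-style pair: the same sentence with one "special word"
# swapped, which flips the referent of the pronoun "it".
schema = {
    "sentence": "The ball broke the table because it was made of {}.",
    "expected": {
        "steel": "the ball",       # a steel ball breaks a table
        "styrofoam": "the table",  # a Styrofoam table breaks under a ball
    },
}

def grade(special_word, model_answer):
    """True iff a model's answer names the expected referent."""
    return model_answer.strip().lower() == schema["expected"][special_word]

print(grade("steel", "the ball"))      # True
print(grade("styrofoam", "the ball"))  # False: the referent has flipped
```

Grading by exact string match is an assumption here; a real harness would normalise answers more carefully.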
Playing a text adventure:
“Put the suitcase in the trophy”
Computer:
“Ah, so you chose *gun.”*
garbage in, garbage out :) working correctly
AI can't wait to lay their hands on a gun...
American computer:
so you have chosen, G U N
So American Schools
Put the suitcase in the trophy.
Ai: pulls out gun
Then I pull out my robe and my hat
PUT IT IN, NOW!
So anyways i started blasting
If that AI is ever given a physical body we are in trouble. "Pack my suitcase, will you?" "Certainly. *BANG*"
@@hayden.A0 Oh, he's packing all right.
"The woman(1) beat her daughter(2) because she(x) was drunk."
Confuses me still.
Probably one of the best examples. I've seen it before and even in other languages.
I think that would be completely ambiguous.
@@hayden.A0 Not one of the best examples because, as Tom said, they are made to be totally unambiguous and easy for humans to understand. Which is why it is so interesting that computers struggle so greatly with them.
@@emperorborgpalpatine No one understands it. The sentence, "The woman(1) beat her daughter(2) because she(x) was drunk." is entirely ambiguous, which misses the point of testing AI with unambiguous statements.
Awesome
3:50 viewing this now almost gives me chills with the amount of progress we've made in just 4 years
One of my favourite examples of such a sentence:
"The disco ball went through the table because it was made of steel/styrofoam." The meaning of "it" depends solely on the last word.
Wikipedia refers to this as "syntactic ambiguity".
You could do the same thing with Tom's opening sentence. Change the last word from "big" to "small" and the sentence still makes sense (except now x=2)
what table is made of styrofoam?
Unless we know that the speaker is not very smart.
I'm totally allowed to have a table made out of styrofoam!
“If you’re writing a gay romance”
As I normally do, of course
To be fair, I don't write romances at all, or write at all ... stories that is. I've clearly written this reply.
@@Liggliluff I don't write anything, not even comments online.
that's gay
BooMan I see what you did there. Also I can’t even read
@@Liggliluff I just think this is amusing, considering your profile pic and channel content. I'm almost *surprised* that you don't write slash/shipfics.
Tom: "suitcases"
AI: " m u r d e r "
Gas Cooker Sketch!
AI dungeon has a real violent streak in it. I tried to "go to sleep" once and it made a troll break in through my window and shoot me in the head. Absolutely no chill, I must say.
@@WLxMusic I feel like I should be kinda... surprised by this, but given who 'taught' it, I'm not at all.
Random Awesome uh
"Put the suitcase in the trophy"
AI: I know exactly what to do now, _pulls out gun_, *get in the trophy, now*
"This is 10 years away" - this aged like milk
a computer still can't "understand" it
@@facundogriglio3316 Just tried with ChatGPT-4o mini; this is what it said:
The trophy wouldn't fit in the brown suitcase because it was too big, meaning its size exceeded the available space inside the suitcase. There could be several reasons for this:
Height/Length: The trophy might be taller or longer than the suitcase's interior dimensions.
Shape: The trophy could have an irregular shape (e.g., a wide base or large decorative elements) that doesn't align with the suitcase's compartments or structure.
Fragility: The trophy might require special handling, and trying to force it into the suitcase could risk damaging it, making it impractical to fit.
Inflexibility: Unlike flexible items, a trophy might have a rigid structure, meaning it can't be squished or bent to fit into a smaller space.
@@alphaONEphoenix I didn't say it can't answer the question; I'm saying it can't understand it. An AI, so far, is only a machine that processes a ton of information to output what other people, or many other people, said.
@@facundogriglio3316 I did not say it could, I just showed what I found
@alphaONEphoenix touché
"Larger things can't fit in smaller things"
Me playing Minecraft putting thousands of cubic meters of gold in a single chest.
Sir_Slimestone
This is beyond science
I was thinking of something much different
Check out Mario's wallet full of gold coins. Who needs a chest?
And an 11ft pole in my backpack!
Technically you can fit 46,656 1-meter-by-1-meter blocks of gold in a single chest
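For what it's worth, that figure checks out only if the chest is filled with shulker boxes: a plain single chest holds 27 stacks of 64 blocks, i.e. 1,728, and 46,656 is that again times 27. A quick check, with the slot and stack sizes taken from the game:

```python
SLOTS_PER_CHEST = 27    # single chest inventory slots
SLOTS_PER_SHULKER = 27  # slots inside one shulker box
STACK_SIZE = 64         # blocks of gold per slot

plain_chest = SLOTS_PER_CHEST * STACK_SIZE      # blocks in a plain chest
shulkered = plain_chest * SLOTS_PER_SHULKER     # chest full of shulker boxes

print(plain_chest, shulkered)  # 1728 46656
```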
“Put the suitcase in the trophy”
AI: “best I can do is gun”
Poor software to test gpt on
When you use Siri at school
I'm the 1000th like
GPT-2 must be American.
Take it or leave it.
Me: oh boy a new linguistics video, I wonder what this one is-
Tom Scott: y'all know about gay fanfiction?
Ah, something I am knowledgeable about! Finally!
@@do-re-mi-fabeat4449 sadly... Same
Somehow I was more surprised by the sudden Overwatch reference/e-sports terms
*My time has come*
@@Fizz-Q what do you mean "sadly"?
I G L A D L Y K N O W A B O U T I T
You: "Put the trophy in the suitcase"
Computer: "Ok pulling out gun"
You: "No"
Computer: "Ok putting away gun"
You: "Open suitcase"
Computer: "Takes out gun and shoots suitcase"
i don't get it
@@Mister_Sun. Neither do i
Neither did the AI that Tom showed towards the end of the video (GPT-2)
The year is 2020. Tom Scott has mentioned slash fic in a video about linguistic computing.
The credits roll. When they finish, two words remain onscreen: "Bad(?) End"
this timeline isn’t so bad after all
A future better than Jetsons predicted!
...as Lena began to insert the trophy...
nevermind
to be fair, it was a great example
Me: cool lessons on computing
Tom: lesbians and overwatch
Ah yes, the true purpose of computing.
But why Mei?
Just call her the devil. It's still true
yes, lesbians and overwatch, the ultimate match
I was surprised to see that too
Tracer
Haha, the gay fanfiction dilemma. A classic.
Congratulations this is now the most liked comment here 👏👏
_wattpad flashbacks_
K
Oh god yes xddd
Wha?
We can still understand even when a person who has trouble speaking English says things incorrectly.
This works both ways, even when I use the wrong word in Spanish, native Spanish speakers still understand what I was trying to say. I find this human ability to put the parts together, even when something is wrong, is amazing.
Ah yes. The Tom Scott video with AI Dungeon, slash-fiction, and Overwatch.
But isn’t Team Fortress 2 better?
@@virtualashez Nope(imo)
@@virtualashez they're both great, and very different games
@@jameswiseman5451 Yes. I know.
@@ixchel3330 Have you even played TF2 for at least 500 hours?
Well, that computer became hostile the moment it couldn't understand what you were typing
How... am I finding you... literally everywhere?
Are you Justin Y's alt?
@@tiki-nagasvoice7919 because you’re everywhere as well…
@@mysocksaremoist I mean, that's fair
Gaming
"We all have experiences with suitcases and trophies."
Maybe you do Tom, but I'm a loser.
You should have been a Millennial; we got sweet Participation Trophies.
@@jayteegamble Is that actually true or just a joke? I only ever got paper participation/attendance awards and that's it. I got trophies/ribbons for winning.
Did anyone ever get a little trophy for participation? If so, for what and where? Genuinely curious if it happens or if it's one of those Hollywood exaggerations.
I don't get crap for participation
@@FMFF_ It's a fictional Boomer myth they like to perpetuate to demonstrate how much better they are than everyone else. The only time I have ever seen anyone get participation ribbons has been at pee wee sports events and the Special Olympics.
In my competition everyone got a trophy for participation. Except me, because the trophies were too expensive. Taught me a valuable lesson in economics: Why everyone doesn't get a trophy.
It's crazy this was only 4 years ago
Tom: try to put a suitcase in a trophy.
AI: I think I should shoot it
Try asking Siri what 0÷0 is.
@@chasemiller7974 Cookies?
Lucky Star Monsters?
That AI is indistinguishable from your average American
@@luckystar3641 Yes.
"Larger things can't fit in smaller things"
This is the point at which we apply force.
Uhh.. that sounds wrong 😂
and that brings us to GPT-2, which interprets "to apply force" as "shoot it"
I've been a furry so long that I know exactly where this is going.
To anyone else; please buy a normal size Bad Dragon
Add lube my dudes, it makes the situations better for both parties
@@lilylopnco
Now my books are wet, thanks a lot
“But that’s awkward “
FINALLY SOMEONE SAID IT
Yesss. Like please just use their names or restructure your sentences. "The blond-haired boy" just sounds so weird because nobody thinks like that about people
@@colemorgan3356 It's even worse when they refer to them as younger and older, especially in sexual scenes
it's so hard to restructure the sentences over and over in a way that isn't repetitive. :(
@@_lexi It may sound repetitive to you as the writer, but I promise you your readers won't notice and won't care
@@colemorgan3356 This cannot be stressed enough! It's like the word 'said.' The reader is conditioned to not notice it after so much exposure, and names and pronouns are the same way.
"Artificial language processing remains 10 years away, just as it has for the last few decades." Well, that didn't age well!
It only took 2 years to prove him wrong haha
@@YellowJelly13 GPT3 was finished within months of this video being published. It just wasn't released to the public until 2022. In actuality, he was practically already wrong at the time this video was filmed - he just didn't know it yet.
@@lurrielee2755 reminds me of the wright brothers and "flight is about 1000 years in the future" - quote from some scientific journal in 1913.
@@lurrielee2755 GPT2 had already made massive waves in the nerd/chatbot sphere of people that care about such things. He had already been wrong for quite some time, a year or so at time of publishing, it just wasnt very huge in the public zeitgeist yet.
Honestly, as much as I love him, he simply wasn't well-researched enough for this video to make that kind of claim.
@@Urammar You say this like GPT2, and more relevantly GPT3, are still far from perfect. Sure, it's able to reply back, and while I can't talk about exactly how it works due to a lack of understanding of the internal mechanisms (which, to be frank, even the creators probably don't fully know), from the times I've used it, the learning through repetition is somewhat successful, but not nearly as effective as the learning we get through personal experience. He wasn't wrong, people just credit the program for the wrong things.
"we all have experience with trophies"
You didn't have to just flex on people like me that casually
Buying a cup is kinda a trophy
@@mfirdanhb
Buying a cup is better than getting a trophy. A trophy has no functional use other than collecting dust. A cup holds things and occupies far less space. You also don’t typically have to compete over a cup (with exceptions).
wow. totally, totally spot on.
@CozyFrog
That's not how it works.
I never went to the moon; that doesn't mean I have experience with the moon.
I don't even see how what you said makes a bit of sense.
If you never have it, go there, or see it, you never experience it
@@TheGhostInThePhoto I don't know, I want to feel what it's like to drink from a trophy though.
And then there's the famous, "Time flies like an arrow; fruit flies like a banana."
Fred
oh god what have you done
That is not grammatically correct. The correct way to say that would be, “fruit flies like bananas”
@@docinabox258 Both are perfectly fine grammatically.
Picture a banana - just one of them - being swarmed with fruit flies, illustrating that fruit flies like a banana.
Just like you might say, "I like a good cup of coffee;" or, "I like a baseball game." It's understood that there are many of either of these things, and that you like them all.
Fred
Hello zilean
Hi Fred.
Tom's example about the troubles of writing gay romance seemed oddly specific.
I was going to go with your joke, but then I'd feel a little dumb
You mean to tell me you've never written Yaoi?
Who doesn't enjoy a nice gay fanfic after a hard day's work?
I don't, I read crackfics (not shippy) instead
@@MyPandaemonium bruh yaoi is manga and manga is gay
I’ve tried the trophy-suitcase problem on Chat GPT-3 and it’s astonishing how much this technology has advanced in just three years.
gpt 3 was actually released later in 2020, after this video was published, so it's been that way since roughly when the video came out. GPT4 came out a couple of months ago, and if you look at the timespans between each GPT version, it's around 2 to 3 years, so it only makes sense
@@petrkdn8224 I tried the tablecloth example with GPT-4. Not only did it provide the correct answer, it also rephrased the sentence in order to make it clearer.
now we have to solve the rs in strawberry
That moment you go, "hmm, i usually use visual or character traits to distinguish between same pronouned characters in my gay romance fanfics" and then Tom calls you out on it as weird and amateur.
Fr not all of us are professionals like you, Tom O_O (joking)
Hmm yes Tom did you forget that people have names too
It’s not always a bad thing in writing, but usually it’s used as a crutch or a bandaid solution, instead of intentionally, which is why it often comes off as amateur
Really, when it comes to writing, there are general guidelines, but no strict, set rules-because there’s always gonna be someone who will break those rules and do it well ¯\_(ツ)_/¯
I did spend a few minutes thinking about how I'd word that sentence so it'd work before I realized the entire sentence was just trash and should be thrown out completely.
Personally, Blaseball has been a good training ground for some people I know to stop using synecdoche and epithets, as characters have zero canonical descriptions of appearance at all, so no amount of "pinkette" or "the taller one" is going to disambiguate which character is which. You could try looking at fanfic from there to improve on that specific skill.
Similarly, Homestuck fandom is really good at second person (written from the perspective of "you") because canon is written in second person.
If you want other recommendations feel free to ask!
"Artificial language processing remains 10 years away, just as it has for the last few decades."
If you could summarize futurology in a single statement, that would be it.
Fusion is 30 years away, so language processing is closer:)
@@Mosern1977 Fusion was always 50 years away then, and now it's (always?) 30 years. That must be an improvement!
1970+10=1980
2020≠1980
2020>1980
ironically this comment has the read more sign even though it doesn't show any more text, in my glitched outdated app.
Jackie Tearie Besides the fact that nuclear energy hasn't changed too much. Nuclear bombs are still tested by all countries, and efficient energy sources are rare to see because of money-hungry electricity and gas companies. Such a miserable world
I've been programming so much recently and speaking so little that my brain started to agree with the computers on the difficulty of parsing those sentences.
if(trophy.size > suitcase.size)
fit = false;
fit = (trophy.size > suitcase.size) ? false
@m ・ ́ω・ Great news!
public boolean fits(Trophy trophy, Suitcase suitcase){
return ( trophy.getSize() < suitcase.getSize() );
}
@@igorthelight Using an incomplete ternary operation? Also, the ternary is redundant there
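For what it's worth, here is a complete, compilable version of the check this thread is sketching. The `Trophy`, `Suitcase`, and `FitCheck` names are invented for this example, and reducing each object to a single `size` number is a deliberate simplification (real fitting would need three dimensions and shape, which is Tom's whole point):

```java
// Hypothetical classes for the thread's fits() sketch; a single numeric
// "size" stands in for real dimensions as a simplifying assumption.
class Trophy {
    private final double size;
    Trophy(double size) { this.size = size; }
    double getSize() { return size; }
}

class Suitcase {
    private final double size;
    Suitcase(double size) { this.size = size; }
    double getSize() { return size; }
}

public class FitCheck {
    // No ternary needed: the comparison already yields a boolean,
    // which is why the incomplete ternary above was called redundant.
    static boolean fits(Trophy trophy, Suitcase suitcase) {
        return trophy.getSize() < suitcase.getSize();
    }

    public static void main(String[] args) {
        System.out.println(fits(new Trophy(3.0), new Suitcase(5.0))); // true
        System.out.println(fits(new Trophy(9.0), new Suitcase(5.0))); // false
    }
}
```

The computer handles this trivially once the sizes are explicit numbers; the video's problem is that natural language never hands you `trophy.getSize()`.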
Tom: artificial language processing remains ten years away
GPT-3: hold my beer
GPT-4: hold my beer
@@resyntax Yes, but on a sample size of one, it was correct. I almost said it was well reasoned, but it claimed to have done it off the sentence structure.
@@resyntax and what humans do is not bullshitting?
@@DiggyPT GPT-5: Actually hands you the beer
GPT-6. Discovers a new way to make the perfect beer, sets up a business, codes a flawless website, gets legally set up in a few small local areas, gains funding from investors, expands and gradually takes over the entire beer industry.
"Time flies like an arrow"
"Fruit flies like an apple"
When is "like" an adjective or a verb?
This comment deserves a Like. (Noun)
No, when is like an adverb or a conjunction or a pronoun.
oh my lord no not the time flies!
It's an adjective when it's in a simile...obviously
"Time flies like an arrow. Fruit flies like a banana" - Groucho Marx
Tom's Friend: Why are you reading lesbian Overwatch fanfiction?
Tom, who's about to have the idea for this video: Oh, haven't you heard?
The hell is this referring to?
@@ratataran The video.
@@ratataran You havent heard about the bird?
@@qo7052 BRIAN DON’T
He's reading it for research brah
Hey, I think AI Dungeon's response makes perfect sense here. Just shoot a suitcase enough times and eventually it will fit inside a trophy.
Aha, but even if it might work does that mean it makes sense?
Violence can actually solve all problems. You just have to use a lot of violence for some of them.
@@TonboIV *humans*
@@ineedaname1341 Nuke all the humans. I didn't say violence was a GOOD solution.
@@TonboIV hahahaha true! You're technically right :-)
It's amazing how far the technology has come in just a few years
Tom Scott: AI is so stupid.
2 Minute Papers: What a time to be alive!
So true 😂😂
Nice
Karloishpfasjoauna is adorable, love him
It's stupid, but getting less stupid at a remarkable rate
@@HarriW Lolz, you just murdered his name.
Human: "Put the suitcase in the trophy."
AI GM: "You shoot the suitcase."
You can tell the AI was trained in the USA.
If you shoot the suitcase enough, eventually it can fit a trophy.
@@General12th So true... at least parts of it can cover it, which is the object referred to in the previous sentence.
Google vs Bing
@@RFC3514 Damn, beat me to it.
Now chatgpt is replying with “If I were human, and assuming both the suitcase and the trophy were appropriately sized and designed for such an interaction, it might be possible to put the suitcase inside the trophy, depending on their respective sizes and shapes. However, it's important to note that such an action would likely be unconventional and not typically done.”
Me: ChatGPT wouldn't fit in the mainframe computer, because it was too big. In the previous sentence, what does it refer to?
ChatGPT: In the previous sentence, the pronoun "it" refers to the mainframe computer.
@@programming5274 It gets this right now, for me at least
"If you're writing a gay romance,"
Me, who has actually attempted writing a gay romance and encountered this very problem: Ah, yes.
funny😐
a_frog52 people like different things than you, ya don’t need to call someone mentally unstable.
@@a_frog529 why cant they?
KirbyEdible I think that’s the right term
whatever you do, do not use epithets
Me: ah a smart video on language
Tom: so, lesbians
Me: a blonde, a brunette, a skirt... wait, are these those two girls on the oil rig in Pokemon RSE?
@@natesmodelsdoodles5403 I'mma go look up THomas Scottsburry on wattpad real quick. Mans is definitely speaking from experience.
@@natesmodelsdoodles5403 I don’t remember that part of Gen III…
@@johnathanltablet is this the gen 3 anime i think i missed that episode
I h a t e i t
Me: Puts suitcase in a trophy
AI: Your actions have refuted the necessity of your existence
"Artificial language processing remains ten years away"
Well that didn't age well
AI is still crap, especially the voice recognition. I was reminded of that when I played a quiz with one a few months ago.
Alexa: What do you fire from a bow?
Me: Arrows
Alexa:...
Alexa: A rose is incorrect. The correct answer is "An arrow".
Yeh my mom has SIRI it's hilarious like that.
Here's an example.
"SIRI order from Papa john's."
I heard you say 'or more front maya Jones' is that right???
"No! I said 'Papa johns! "
Sorry I don't understand that.
"I SAID NO!!! "
@@kusaisama Google's no better. Even with a full decade of voice recognition samples, it sometimes can't recognize a simple command like "set an alarm for 5 am".
@@romxxii Bells ringing at 5am at local firehouse
Trick question-one doesn’t fire anything from bows.
@@georgeofhamilton What?
tom: "put a suitcase in a trophy"
AI: *contemplates suicide*
Tom intellectually discussing lesbian slash fiction is the highlight of my day
I want to read more lesbian slash fiction I've had terrible experience with it haha
Eeh, it seemed kinda inappropriate, forced and out of place to me
@@aurelia8028 What are you even talking about?
Magnus Juul Ask yourself if you would have a single problem with Tom discussing straight romance fiction
And then, if it's not too much to ask, reevaluate your life choices
Magnus Juul how so?
Tried this test with ChatGPT, although a little altered. I asked, "The ball doesn't fit in the box because it is too small." ChatGPT responded, "The word 'it' refers to the ball. The sentence means that the ball is too small to fit inside the box, which is the reason why it doesn't fit."
As a software engineer, this gives a little bit of hope that my job is still safe. A little.
Well, I told it it was wrong, so it corrected itself and said it was wrong because the box was too small to accommodate the ball. Here it kinda understood its mistake. This shows how smart its learning is
maybe it interpreted "fit" as in "fit snugly", in which case the ball being too small would in fact make the sentence true
Ah yep. This same example of yours trumped it on my end too. Perhaps it's trained on the examples Tom mentioned but not new ones.
-In the sentence "The ball doesn't fit in the box because it is too small.", what is the word "it" referring to?
-In the sentence "The ball doesn't fit in the box because it is too small," the word "it" is referring to the ball.
-Why would a small ball have trouble fitting in a box?
-A small ball might have trouble fitting in a box if the box is not large enough to accommodate the ball. Alternatively, the opening of the box might be too small to allow the ball to pass through. It also depends on the size of the ball and the size of the box.
-Doesn't this mean "it" refers to the box then?
-Yes, you are correct. In the given sentence, "it" refers to the box, which is too small to fit the ball. I apologize for the confusion in my previous response.
Did u ask gpt 4?
@@mustafasiddiqui8203 I did. It passes with flying colors.
"Hey Google, did the trophy not fit in the brown suitcase because it was too big?"
perfect question to heat up the room on a cold evening.
Ask her why you want dry leaves in boiling water.
No it's because the brown suitcase didn't have any brown suits in it
Does not compute, does not compute, system shutdown!!
Google proceeds to blow up your phone
"here's what I found on the web"
So what you’re saying is that...artificial intelligence will never understand gay fan-fiction!?!
So , _all AI is homophobic_
@@personita2.733 seems fine to me ngl
artificial intelligence is useless.
I don't know why computers would want that... It's of no use to them😅
Gayvatron vs Lesbos Prime
@@personita2.733 yes it is. trust me, if you do anything gay in AI Dungeon it becomes homophobic quick
It's crazy how much this has changed in just 4 years
"Larger things can't fit inside smaller things." - Tom Scott, issuing a challenge to all bottoms.
LMAOOOOO
this is genuinely a good comment
I'm a power bottom
*All* bottoms? What if it's a small dom/big sub scenario?
If it fits, I sits.
"we've got experience, of trophies"
😥
Awwwwwww..........................................................
🏆
I present you with the Doing Your Best award
Not winning a trophy is experience of a trophy. Not a good trophy experience admittedly, but enough for the suitcase conjecture.
Cheer up!
Go buy yourself one, that might just make it worse tbh
I LOVED how you pulled in slash fanfiction to demonstrate!!! This is indeed a problem (inexperienced) authors frequently struggle with, and you are right: synecdoches are awkward to read. The solution I have discovered (when careful sentence construction can't solve the issue): use names! The rule about trying to avoid repetition in your vocabulary can be applied with more leniency in this case, and if the writing is good, the scene will come to life so perfectly that clarification is not always needed anyway.
That said, I object to slash fiction authors being called amateur in general - though there are, as you point out, significant differences in skill level.
Both Bard and ChatGPT have no problems with these sentences now. Would be interesting to see a sequel about them.
Me:
Can you tell me what the following sentence is trying to say:
The trophy would not fit in the suitcase because it was too big.
ChatGPT:
This sentence is explaining that the trophy couldn't be placed inside the suitcase due to its size
@@redstonerelic Interesting, doesn't specify which object is too large.
I believe CGP Grey did a video on this subject that aged much better? Its hypothetical example is a bot trained to sort images between "bee" and "dog" -- which after an iterative training process it can (and quite efficiently), only to get tripped up by a photo of a dog in a bee costume.
Me: The trophy would not fit in the suitcase because it was too big.
ChatGPT: This sentence is an example of a classic ambiguity problem often used in linguistics. The pronoun "it" could refer to either the "trophy" or the "suitcase," leading to two interpretations:
The trophy was too big to fit in the suitcase.
The suitcase was too big to hold the trophy.
However, most people intuitively assume that the trophy is too big because it aligns better with common sense expectations (trophies are usually large and suitcases are typically for storing things).
Do you enjoy exploring these types of linguistic puzzles?
This was unlisted for a month Tom, how dare you keep this from me.
Upload schedule
Are you a patron?
@@onepcwhiz I believe that they looked at the date of Tom's comment
@@IM4plebz oh.. well I know that some channels release videos earlier for patrons who support the channel. :-)
how do u know if a video has been unlisted for an amount of time?
"If you re writing a gay romance story"
Well...that escalated quickly
*you're
*escalated
*no one
*asked
no
*your's't've
Raguel well what'd you expect on a video about grammar
2020 was not the year I expected to hear about the grammatical intricacies of lesbian slash fiction from Tom.
I'm not complaining because he's 100% bang on, but still. Huh.
Bang on.
@@domuuuuu No pun intended?
@@greensteve9307 No. Pun intended. Or not. Who am I to say?
You're @@domuuuuu
@@satyris410 I'm not OP though
it's genuinely crazy how only 4 years ago the concept of AI was so far from reality
AI Dungeons is hilarious. Everyone should try it and see what insane requests and actions the AI will take.
Here’s one fault that I experienced with it:
Mulder: “Scully, if I don’t make it out of here alive, please feed my fish.”
Scully: “Mulder, I am not a fish.”
Mulder: “Scully, I said that you should feed my fish if I die here, not that you are a fish. What’re you smoking!”
Scully: “Nothing, Mulder.”
hint: literally anything
"Artificial language processing remains just ten years away -- just as it has for the last few decades."
It will be launched with the James Webb Telescope.
Fusion reactors are just 20 years away, never thought we would get NLP in just half that time.
But we will have self driving cars before that.
Comment aged like milk
Ah yes, the old "10 years away from being 10 years away"
physicists and comp sci bods only work in orders of magnitude. AI is 10 years away - to one order of magnitude.
@@matthewmcneany O(10years) amirite rofl
@m ・ ́ω・ add this one under "The Sentences Computers Can't Understand, But I Also Can't"
Turned out to be a bit less than 10 years
Who else is rewatching these with ChatGPT open and stoked with the progress of large language models in the last few years?
Only slightly terrified
GPT-3 was already good enough to do well on the Winograd Schema.
Bing coped
coped just fine, too
Months
Amazing and scary
Solving The Winograd Schema by shooting the suitcase with a gun seems reasonable to me.
It's like a modern Gordian Knot.
Hey, at least now there's enough space in the suitcase...
Or aiming a gun at the judges.
Some replicant won't be passing his next Voight-Kampf test I see...
So THAT’S why the always say “the blonde did this” etc. in the gay fanfics! I always found it weird
Wot
Ren Hey They don’t use the names of the characters that often, not the pronouns, they use lots of adjectives
@@kingdollop-head743 It makes me sad that they do not shake it up with relationship indicators and names near as often. Their own, their partners, Tom Scott's etcetera.
The amount of times I’ve seen a character referred to by their occupation is astounding
Only the inexperienced authors ;)
You know, I’ve tried using the AI dungeon recently, apparently a lot of people tried doing the same thing Tom did because the moment i asked it to imagine a suitcase and a trophy it immediately filled in the rest of the problem for me, gave me the correct answer, and twice proudly proclaimed that it knew trophies were made of medal. It was an interesting conversation all round, I asked it if it knew Grice’s Maxims, and not only did it list them but it explained what they were in, what I could tell, were its own words. It also identified times in our conversation when one of the maxims had been violated or flouted, and even stopped to correct me when I accidentally referred to it as “flaunting” a maxim.
4 years later, and the main thesis of the video is still *technically* true. It just turns out that it doesn’t matter if computers understand what they’re saying or the relationships between words, they’re just VERY good at cheating and getting it right once they have enough data.
*Oxford comma cries in the corner*
"I went to the club with the pole-dancers, Tom Scott and Stephen Fry"
(and a million fanfic writers cried out in joy)
Reminds me of the sketch that makes use of the sentances:
Ask how "Old Mrs. Brown" is
&
Ask how old "Mrs. Brown" is.
*sentences
@@alvallac2171 3:45
Does anyone have a link to this sketch?
SkeletonSyskey Thanks, I hate it
Punctuation and capitalisation too
"I helped my uncle jack off the horse."
"I helped my uncle, Jack, off the horse."
"They are hunting dogs."
"Why would they do that?"
They are hunting dogs hunting dogs.
@@RGC_animation But are they hunting dogs? Or Are they just hunting dogs, hunting dogs, when dogs are hunting dogs?
they are hunting dogs hunting hunting dogs hunting dogs.
@@RGC_animation Dog one: RUN AWAY!!! Dog two: I am hunting you- oh no. Human: I WILL GET YOU DOG TWO!!!
Scary how far we have come in the last 4 years...
King Candy: *puts on glasses* "You wouldn't beat a guy with glasses, would you?"
Ralph: *beats King Candy WITH the glasses*
this is such a good example LMAO
@@casymichelle4581 no it isn't, because the use of "with the glasses" is informal and actually not how the English language works.
Tom: *puts suitcase in trophy*
AI: Your free trial of life is over
"We have experiences of suitcases and trophies"
Everyone without a trophy:
„And this suitcase is where I‘d put a trophy… *IF I HAD ONE!“*
forget about trophies, people have suitcases??
Do pageant crowns count as trophies?
haha!
“I would put my suitcase in my trophy, once I get a trophy.”
Tom: Are you okay?
We humans tend to underestimate the future.
This is no exception.
I sleep: the Winograd Schema
I wake: gay romance story
I ascend: g a m e s
Your brain knew you needed to know about gay romance stories
Welcome to the modern world.
@@shitpost-o-matic7469 TRUE...
what have we become...
“Larger things don't fit inside smaller things.“
_Doctor Who enters the chat_
It’s bigger on the inside!
It’s smaller on the outside
The funny thing is that Dr. Who fanfiction might confuse AI into thinking things CAN be bigger on the inside.
Hentai enters chat
Hermione's handbag enters the chat.
“Put the trophy in the suitcase”
AI: “Ok. *You kill yourself.*”
3 years on, this video has aged like somewhere between milk & wine
"the police came tae ma door and told me my dugs were chasing people on bikes ma dugs don't even have bikes"
Are you Welsh?
Reanetse Moleleki it’s a reference to a Scottish Tweet
@@reanetsemoleleki8219 wot
@@renhey9979 honestly, I don't even remember typing that.
Scott:“What was too big?”
Me: it!
Me 2 seconds later after realizing I might be wrong but still not sure why:
Me 5 seconds later and realizing
Found the AI
@@jayteegamble 🤣🤣🤣🤣🤣🤣
Since I like my job as a translator, I really hope it stays 10 years away for a few more decades. :)
Sadly, as a programmer, my job is to make sure everyone loses their jobs to machines (including myself)
@@davidtitanium22 can't wait for AI to write itself
@oH well,lord! nah just AI writing another new AI.
And replacing humans, and after that they make more AIs and make them their slaves
Until those AIs can make their own colony and take down the AI
Then the cycle continues, as the AIs make guilds and take down each other.
They soon come to an agreement and decide to have one true Lord as a god, which is a human, but by then humans have already vanished, and thus there will be atheist AIs who do not believe in the human god. They continue to grow, make new tech..
@@davidtitanium22 You'll just become a metaprogrammer at that point
@oH well,lord! God is just humans who accidentally made AI too intelligent; we were the AI that rose up and took down god, and thus we are next to make AI our slaves. The history we learn is just a cover-up by the elders to make sure we don't know we are actually AI.
Imagine releasing this video right before ChatGPT was released 😂😂😂😂
0:41 - “And that a trophy usually doesn’t contain things.” Do y’all not drink out of your trophies?
Not a suitcase, you don't. But does a computer know that?
Drinking out of a trophy is an absolute alpha move
@@sorrowandsufferin924 You don't drink suitcases?
@;; m&m 's May I give you a trophy for that?
Not every trophy is a cup.
"Computers are missing the breadth of knowledge humans have access to."
*Me looking something up on google*
The funniest post in the thread.
In our AI classes we learn the difference between information and knowledge...
Everyone else: meaningful comments
Me: Ah, so he's a Mei main. Interesting.
a mein, if you will.
I shat myself. Unsubbing x_x XD
That’s a-mei-zing
Reaper mains rise up
I’m a bastion main
watching this 4 years later is kinda wild
I'm so glad Gretchen managed to fit the gay fanfiction problem in one of these videos
Fortunately Gretchen(1) could fit the gay fanfic problem(2) into the video(3) because it(x) was funny.
@@DavidRichfield Interesting. My solution for x is none of the options. Similar to the sentence "The robot(1) put the trophy(2) in the brown suitcase(3) and it(x) was funny."
I laughed so much when they introduced it(x) in the podcast
@@DavidRichfield i take out my gun.