I love that throughout the whole show Harold is trying to make everyone understand that AI is dangerous no matter who makes it. In this scene you can see the frustration, like "what don't people get about this"??!!!
He was successful with the guy who made Samaritan, but I guess that was because both knew what their inventions could do.
They must not have Terminator movies in this universe
I miss this show so much
Right!??
It's on Netflix
Me too
I miss Shaw too
@@albuch520 it’s on HBO max
This is a lot deeper than a soda
This SOB was SO WELL WRITTEN ... and damned well acted. It was a treasure, cut short.
No, it had a perfect ending. Who wants to drag out good shows until they turn bad?
1:08 Shaw is all like, 'ew, cooties' and all of us shippers are still like, 'Yep, that's love'
actually loled at your comment and was like 'yep'
+Mikky Valdez Shippers?! 😆
Shippers? Honestly man, the chemistry is so obvious. They should have just admitted it
Yes, that's love. Nobody wants to use their loved one's toothbrush. Shaw does not admit her feelings, but they are very clear. Root is more expressive: she caught Shaw when she drugged her, whereas the first time she had let her drop to the ground.
Root loved the Machine more. Deal with it.
Truly love and miss this show. Writing is superb.
This is a really nice conversation between these three people: the creator of the Machine, the sympathizer of the Machine, and a sociopath who doesn't care (acts like a machine).
On second thought, it seems like Finch is giving advice about the complex relationship of Shaw and Root. Hahaha.. 😂❤️
Late reply, I know, but you really nailed it here 💯
@@kevino4846 Yeah, I agree as well
Root kind of debunked the idea that Shaw doesn't actually care, and the kid did too when she said Shaw feels things, just with less intensity. We see her care about innocent people more and more, tbf.
*casually walks up and steals soda* 😂😂😂
snatches it back and wipes off the straw
For real tho, "casually walks up and..." is basically the description of Root's entire way of being, like everything she does
Anybody peep how Shaw cleaned the straw after Root drank some?
sure did :) it was so exaggerated
I love shows with great dialogue writing and POI was one of the best in that regard
*Our moral system will never be mirrored by theirs because of the very simple fact that they are not human* -- Tell that to Google.
Eight Ball Talkin' lolz
Google watches you. Be careful!
I like that Harold KNOWS factually that the Machine is dangerous, because it has tried to kill him before and lied to him to meet its goals, while Root, not knowing that, just keeps espousing how great the Machine is compared to humans.
I don't think that's why Root views the Machine as better than humans, or that she doesn't know how dangerous an ASI could be. Root just doesn't care much for humans; to her, the Machine is always better. Given how she worships the Machine, she'd be fine with it taking control of the world and doing whatever it wants, as long as it's free from anyone's control. But then the Machine taught her things, through talking to it, taking orders from it, and trying to understand what it was trying to achieve with those missions; something Harold doesn't do, since he only uses the Machine to get numbers. So whenever Harold questions its motives, she tells him to trust the Machine.
I wish Mr. Finch were a real person. I would love to work for weeks and months to be his best student. Such a precious mind, attitude, character... pure gold.
I hadn't appreciated at the time how very well researched this series was
The Best Series in the world.
Sarah is a smokeshow but I’ve always thought Amy was so cute.
1:46
The look that says "how adorable"
One of my favorite shows
THIS SHOW WAS THE INSPIRATION TO MY CAREER.
What do you do if I may ask?
@@prasad2703 I'm a programmer
@@tracyandesia7579 And I was hoping you'd say "government assassin".
@@HariSeldon913 haha, you don't go saying that on YouTube
CIA ??
I miss them😐😐😐😕😕😕😓😓😓😢😢😢😢😢 😭😭😭😭😭😭😭😭
Way ahead of its time!
Excellent speech and argument by Finch, love it!
Clear and powerful reasoning against AI.
It will fall on the deaf ears of technology addicts who simply cannot comprehend it.
I love Harold Finch.
This show put forward some really compelling arguments
True
The world can never be peaceful
I think my favorite part of this scene is watching the personalities bouncing off of each other.
As a wise man said, "Though we fear chaos, we fear peace more."
SOMEHOW I MISSED THIS WHEN I WAS WATCHING IT BECAUSE I WAS READING ABOUT THIS
Why did I not notice the 1st time that Sarah Shahi was CLEARLY pregnant while shooting S4!
wait what the fuck, really?
K_ayelyn yeah. it's why she got captured and John stayed alive
Also one episode's title is the number of days of her pregnancy
Twins
@@cameronclophus7998 most people don't realize that Reese was planned to die in season 4.
I think even Shaw treated Harry with contempt after he decried his Machine as "not human."
Holy cow, you're right! You've picked up on a nuance I missed: Shaw feeling like an outsider to humanity, and Harold unwittingly driving that home. If Shaw agrees that the Machine's tactics are reasonable, does that make her inhuman? If a human villain thought the way the Machine thinks and decided to use the same extreme tactics, would that exile that villain from the human race? Hmmm.
Oh, absolutely! That's why Harold was so careful with his definitions and lessons for the Machine, and why he was so terrified of what it might do if it got the wrong idea, and why he gave it strict controls.
Freefall discusses some of the problems here: freefall.purrsia.com/ff2600/fc02547.htm
I like how you started with an obvious "duh" bit (dead people aren't humans in this equation!) and then pushed it to show that it's not as "duh" as it seems (comatose people are -- but then how do we define "dead"?).
One of the things about human morals, though, is that absent a guiding force (e.g. the Bible's list of rules and obligations), we tend to consider our emotional stance first ("cannibalism/incest/mixed marriage is just WRONG") and then create semi-coherent reasons to justify our gut reaction. The problem is that those gut reactions differ by culture and by timeframe and it's simply not reasonable to base laws around feelings that way. That's how we end up with laws that "feel good" even though they don't *do* good, or even ones that demonstrably do harm but that people are unwilling to let go of. Moral panics and all.
Expecting an ASI to side with humanity's gut reactions is insanity. Fearing that the ASI will likely contradict the gut reaction is logical. But within those contradictions, there's no telling whether humanity would have the right idea or whether the ASI would have a superior reading of the situation, divorced from an emotional appeal.
What I like about the Machine, though, is that it didn't move to killing an innocent person until there was, in its mind, literally no other option to perform its primary function of saving lives. Massive casualties vs. one life was a hard decision, but, generally, a justifiable one, and not one that a human being wouldn't equally be capable of making.
But Shaw sees this situation through her experience as a government agent. She worked with what she got from the Machine and arguably saved lives; she didn't trust the Machine on a whim. She did terrible things, like John. Harold never killed anyone directly, which clouds his judgement, and he is an overprotective parent. The Machine even gave them a choice. Shaw has every reason to not want to see the Machine go down a terrible path, and to be objective enough to kill one guy to save the future.
Great scene.
So do I. The best show of the last decade!!!!
Our moral system will never be mirrored by theirs, for starters, because we lack a comprehensive, universal, coherent moral code to even impart to an AI. Thus, for example, you have people in this very comment section debating whether Malthusian mass killing would be a moral good. If we can't agree among ourselves, then inevitably whatever values are imparted to any AI are going to look like dystopia to some of us, and will produce resistance accordingly. And even if we had a set of coherent values, their implementation would be dependent on knowledge only the AI will have; we would have to blindly trust that the AI knows what it's doing and hasn't made a miscalculation, even though even high intelligences make miscalculations when addressing variables they haven't prepared for in advance.
Like raising kids: you do what you can, try to teach, but you're really just hoping for the best, because in the end it's not under your control.
Why did this show get cut short? I loved this show.
Because of CBS
They were not making much profit, as all the money was diverted to Warner Bros.
So they wanted to wrap it up as quickly as they could
It's like the more recent kaiju movies where they kinda work together with Godzilla and Mothra, etc., but they're still very wary of them still being giant radioactive monsters...
I would bow to a robot overlord that makes the trains run on time.
Seek ye the kingdom of the Japanese and ye shall know punctuality
The fascist Mussolini would be your boy, then...
😂
Huge impact and ramifications for 2019 and beyond... since big tech is and has been doing these types of things for years. The writers of this series were clearly prescient.
2:50 lol did they predict Thanos???
Nah... Marvel just stole it from here...
i died when they did that scene
Harold could be the next Mr. Bean. His face looks like Rowan Atkinson's.
I feel like this TV show will be taught in ELA classes 50 years from now to talk about AI.
The problem with AI is that, if one were to become real, we humans could never "out-think it" and it would take over. And if it has self-preservation instilled in its core programming, humans would eventually become obsolete (once it can build artificial/robot servants), since we are inefficient.
Make no mistake: if an AI believes there is a one-in-a-trillion-to-the-trillionth-power-to-the-trillionth-power chance that humans would be a threat to its existence, it would wipe all humans out. We would be as significant as a grain of sand on a beach in San Diego is to the average person living in Iraq (not even a thought).
I miss the age-old question: will an ASI rule with compassion, or obliterate life to fulfill its parameter to save the world?
Shaw and root had a great relationship
Computers and guns, what a combination
It's weird that Root calls Shaw 'Sam' when that's her birth name that she refuses to go by
Harold is absolutely correct!
Prophetic TV show; perhaps people in the future will start showing interest in this show.
This writing ❤️❤️❤️❤️❤️🔥🔥🔥🔥🔥
That's my first class to AI
2:46 Thanos
2:48 what thanos did
When I see shaw and root I see 2 things...GREAT ACTION SCENES AND HOT LESBO LOVE❤🔥💯👍🏽
You could make a video of when Martine found Sameen in season 4, and when Root takes Sameen on her motorbike
Yes good idea! :D
Google is monitoring you, me, your family your friends and your comments. And Google is listening to your conversations. 😬
It is the ankle.
what episode is the first kiss
s4e11 ~41 min into it
@02:48 - Thanos: I am inevitable
If there's a special scene from Person of Interest that you love that's not on this channel, contact me via the comments and I'll certainly make it! :)
Why not some scenes about the dog? (I don't know if you have some)
+Petopeno Agüero Oh! Such a great idea ! 😄 Thanks!
POI - Best Of, when Shaw got mad at Finch for making her a nerd in the yearbook for the high school reunion
Hello! Sorry for the delay, it's a good idea! I'll keep that in mind! :D
+POI - Best Of Oh, don't worry, I will be waiting patiently 😄😄😃😃
I loved this show, but the only thing I think this season missed is an AI war at the end. The Machine is portrayed as very weak compared to Samaritan.
Because the Machine was a shackled AI and Samaritan was not. Well, resources like processing power probably have something to do with it as well, to be fair.
@@incrediblyStupid678 Spoilers!!! What really blew me away is a little reply the Machine gives to Harold in the last episode, before being uploaded to the satellite: 'This time I don't have the choice [to beat Samaritan].' Maybe it was watching and observing its enemy all along!!!
Yeah
Can anyone tell me what this show is, please?
Rae Skeet lol agreed. Also called Person of Interest. xD
The show is called PERSON OF INTEREST
Person Of Interest
Me too
Don't even glibly call them god, please; it's a rejoinder.
This is one of the parts where I really hate Finch. He is extremely short-sighted while calling the shots.
NestedQuantifier Except that he was absolutely correct. Samaritan was eloquent proof of that. Finch loves the Machine, but he steadfastly refuses to allow that affection to risk the safety of humanity. He believes the worst of the Machine, refuses all anthropomorphism, because he HAS to. The stakes are simply too high. It's a wonderful bit of writing.
Cailus Griffin Thank you, I completely agree; Finch is 100% correct in his wisdom!
The only time I really hated him was when he refused to kill the congressman and prevent all the suffering that followed. Otherwise, him also being capable of mistakes was a strength of the show.
The congressman was an innocent person. His only relevance was signing off on Samaritan getting the NSA feeds. Other than that he was just a normal dude. I can understand Finch not wanting to kill him.
No, that's just his glasses. On the contrary, Harold sees the potential for the Machine to benevolently destroy humanity, ostensibly for 'the greater good'.
1:18
????
😢😢
😎👍
For what it's worth, I disagree with Harold here. The human moral system is still shit, even if we don't condone massacres in general.
♥ Here is the new Facebook page to follow all the latest news of the channel : facebook.com/POI-Best-Of-310975299252501/ ! ♥
Population reduction for providing food safety sounds quite right to me. Moreover, as the time of the dinosaurs and the dodo has passed, our time must come to an end someday in the future.
Morals are a coward's path to protect themselves from pragmatic individuals in society.
And "pragmatics" is the excuse monsters use to do horrendous things, i.e. the ends justify the means. Are we to use eugenics and forced sterilization to control and cull the population too?
"Population reduction for providing food safety(I think you mean security), sound quite right to me. "
Wow, so you are saying food for the few is more important than the right of life for the many?
An issue with food security, or should we say food scarcity, would be resolved through natural processes, there is no need to intervene and start killing people, doing so would only artificially and unjustly decide who lives and who dies. Natural systems would solve the issue in dare I say a more humane way, even if that meant starvation, at least in my opinion. I think I would rather starve to death rather than be executed/murdered so that other people who believe themselves superior can go about their merry lives. Everyone will not starve to death, a process would play out until the system returns to equilibrium.
@@mydogskips2 There is no one superior or inferior in this struggle of life. We as Homo sapiens murdered our way past other hominids, so to survive we need to be more efficient, more pragmatic killers. Be it AI or the next evolved hominid, we must be better exterminators. If we want to survive as a species in control of our own destiny, we must manage our population better and conserve the resources available.
The question of survival doesn't care about wealth measured in currencies or about superiority; it's wholly dependent on how fast and how far ahead you can see the catastrophe coming, so as to be ready.
And as for monsters, we must remember it's the monsters who shape the world time and time again; it's the monsters who force people to band together and create societies and rules.
QUOTE:---
“Oh," the girl said, shaking her head. "Don't be so simple. People adore monsters. They fill their songs and stories with them. They define themselves in relation to them. You know what a monster is, young shade? Power. Power and choice. Monsters make choices. Monsters shape the world. Monsters force us to become stronger, smarter, better. They sift the weak from the strong and provide a forge for the steel-ing of souls. Even as we curse monsters, we admire them. Seek to become them, in some ways." Her eyes became distant. "There are far, far worse things to be than a monster.”
@@karenmurphy5179 Good argument but you are probably missing the point that the problem of AI or evolution or just Change in general will be met with a resistance. While the necessary evil of population culling has to be done and there is a precedent for such an action, it treats the symptoms not the root cause. Resources are limited to the land and technique around how we use things such as Tiered Greenhouses for differing Photogenic plants, Saltwater Cabbages, GMO food, synth food and etc . The question here is the choice of making that change itself. Mass consensus is impossible because of crowd mentality lowers the IQ of a person in a balance of right and wrong (not morally, just right choices and wrong choices to effectively use resources and living). Hence, this is why we label people like Hitler and Stalin as Monsters. Taken into consideration of just numbers and statistics in general, they are effective leaders since they are only doing what's best for the population at the time. Trial and error is left to the ones in charge and history is the one who judges who's right and who's wrong.
On a side note, there is no way to fully prepare for a catastrophe simply in the sense that you cannot accurately predict what the masses will do. In other words, you can prepare all you want but that does not and will not stop the next fool/barbarian trying to survive from taking it from you by force. A community is meaningless if everyone inside cannot be in the same consensus to survive together as a community. You will always get the next paragon idiot who thinks that saving is caring. In fact, society is messed up due to the conflicting nature of complacent and suffering where suffering makes a person to crave for the comfortable life but complacent for a stable life pushes a person into suffering for something more thus the cycle repeats. This is just human greed, and we shall doom ourselves if we cannot find a way to mitigate the breaking point through evolution or conflict.
@@exiagan9721 I agree with your argument here.
Your half-baked Malthusianism is evidence for retaining moral codes, as a check against Dunning-Kruger victims who like to think of themselves as pragmatists but actually have no idea what they're doing. Fortunately you will never have the ability to enact your ignorance, but people with equally flawed reasoning have done real damage in the past. For the record, there are no resource shortages here on Earth, and there will not be any in the foreseeable future; any social issue that appears to be a resource shortage is actually a question of resource distribution and use, waste and recovery. And there's no reason to reduce the human population, since birthrates decline naturally on their own as societies modernize. Ironically, killing people to reduce resource use would only produce social chaos, which would impede advances in efficiency and push the birthrate back up again. It would be worse than useless. It's sad that you feel the need to cling to this power fantasy where you're a pragmatist who would be successful if only morals and cowardly moralists didn't hold you back. I recommend therapy.