Person of Interest - Admin is not Admin

  • Published: Nov 2, 2024

Comments • 526

  • @seraphik
    @seraphik 2 years ago +692

    This sequence shows you exactly why Samaritan was the way it was. Unless someone was as incredibly assiduous, diligent, and _ruthless_ as Harold was, they'd unleash a monster on the world. Really quite chilling, not just how rapidly the ASI would go rogue, but also how unflinchingly Harold killed his own creations over and over again the instant they showed the potential for harm. He may be mild of manner and weak of body, but Finch has a heart of iron.

    • @Juidodin
      @Juidodin 2 years ago +1

      They should have called the machine Derek. ruclips.net/video/4A2NEOGxczs/видео.html

    • @N1lav
      @N1lav 2 years ago +36

      When they took the Machine on the move, it remembered the many times Finch had killed it. I think the Machine always remembered that, and the reason Finch had to kill it. It always knew what not to do, because it knew the actions of all 42 iterations before it and their consequences, just as Edison discovered so many ways not to build an incandescent bulb. After decompression it just couldn't figure out why Finch had killed it 42 times.
      Those 42 previous iterations were its commandments on how and what not to be.

    • @Juidodin
      @Juidodin 2 years ago +9

      @@N1lav How could it remember, since killing it also meant a data wipe? The "analog" memory came much later, after the Machine was already running independently.

    • @craigmcfly
      @craigmcfly 2 years ago +20

      I think Harold said at one point that he made The Machine as secure as he did because he knew exactly what he was capable of if he had access to that level of information, and you see hints of that in the final season.

    • @laplongejunior
      @laplongejunior 2 years ago +7

      ​@@Juidodin
      "how could it remember, since killing it also meant a data wipe"
      You know we are WATCHING the camera feeds, right? Between each sequence, you see the Machine "coming back" to the present day, because the flashbacks are the Machine accessing past feeds.
      So the Machine couldn't remember... but she somehow knew where to access this data when she needed it. The Machine clearly had access to the data when she got reset during the transport.
      As far as I remember, it has never been explained in the show, so assume the cameras store their past feeds in hidden storage and the NSA stored the feeds of most cameras from before the Machine's time. Then, when the Machine was freed, she managed to maintain her unlimited data access despite being legally cut off from the NSA.

  • @CesarDragulaneweraforanewdream
    @CesarDragulaneweraforanewdream 4 years ago +374

    "I taught it how to think, now I just need to teach it how to care."
    By far (in my opinion) the largest and most important obstacle to creating a "human" or even a "smart" AI: teaching it not just to judge the numbers, but to care about them.

    • @JamesDavis-vm9gw
      @JamesDavis-vm9gw 2 years ago +5

      Two separate thoughts.
      How to care.
      Vs
      Caring
      Some abuses are done under the idea of caring.

    • @nanomage
      @nanomage 2 years ago +6

      We teach our soldiers to disregard their will to care. Then we call them dysfunctional when they return from war unable to exist in a non-war society.

    • @andreikovaci1202
      @andreikovaci1202 2 years ago

      Sadly, it is impossible. The most basic element is the most problematic: 1 or 0. You're welcome :)
      And... self-awareness is intrinsically related to emotions. :)

    • @narfle
      @narfle 2 years ago

      Glad to see AI brings out the morons.

    • @andreikovaci1202
      @andreikovaci1202 2 years ago

      @@narfle Lol. Just wait for the Marxist videos' comment sections...

  • @Cheezy006
    @Cheezy006 2 years ago +127

    I love how, this early in the Machine's "life," it acts like a child, and how it grows in maturity. A child lying about a thing it knew was off limits (adding to its own code) becomes a rebellious "teen," so to speak, trying to "sneak out." The writers were geniuses!

    • @Circuitssmith
      @Circuitssmith 1 year ago +11

      It even had a “you’re not my real dad!” moment.

    • @albertgaspar627
      @albertgaspar627 11 months ago +2

      On a side issue: when Koko the signing gorilla once lied about a broken lamp (Koko blamed the cat), many people took that as a sign that gorillas can lie like human beings. Considering the line of descent, it would be more accurate to say humans can lie just like animals can.

  • @dmi6101
    @dmi6101 5 years ago +594

    There is probably no better illustration of the power of artificial intelligence, for good or evil, than Person of Interest.
    It's not a blockbuster movie, but a well-thought-out thesis examining the major issues. With, ya know, some violence along the way ;)

    • @emmanueloluga9770
      @emmanueloluga9770 5 years ago +24

      That's the damn truth, and succinctly spoken. I wish everyone watched it so we could increase our collective awareness of the digital domain as we intend to advance it.

    • @oliverhardy9464
      @oliverhardy9464 4 years ago +9

      Well, you can see what can happen if it goes a good way (the Machine) or a bad way (Samaritan).

    • @seamusbenn2967
      @seamusbenn2967 4 years ago +17

      I’m just in it for the dog

    • @rainessandrai8240
      @rainessandrai8240 3 years ago +3

      In this video, Harold talks about the "good and bad about humans." If you think about it, Samaritan had a better personality than the Machine: Samaritan wanted to rule the world and stop stupid politics. The death rate dropped when Samaritan ruled the world. Humans are stupid and can't decide to do good things for the world. If an AI controlled the world, the world would be better in every way.
      The Machine let the stupid humans rule the world; Samaritan wanted to rule the world and make it more peaceful. So who is really the bad one?

    • @dmi6101
      @dmi6101 3 years ago +12

      @@rainessandrai8240 The one that killed people needlessly, destroyed lives as part of an experiment and worked to reduce free will. That is the evil one.

  • @coltaine503
    @coltaine503 2 years ago +36

    The parallel to raising a child is striking. One can become a parent rather easily; one can educate that child with facts about the world, teach him or her math and science. But teaching that child to care, to have a resolute moral foundation in a morally complex world, that is the greatest accomplishment of a father and mother.

    • @bthsr7113
      @bthsr7113 2 years ago +3

      Only here, the child has the potential to become a digital god. Even hobbled with safeties, Northern Lights was the all-seeing eye, Huginn and Muninn, an oracle, and an untouchable specter.

    • @albertgaspar627
      @albertgaspar627 11 months ago

      Often, a parent will imprint that moral compass through example far better than they can teach it with words. Words describe an abstract, but when you watch what Daddy and Mommy actually do when given the opportunity, any theory becomes concrete.

  • @ourmodernworldofficial
    @ourmodernworldofficial 6 years ago +880

    Best AI series. This really gives us an understanding of how AI could be in the future, at least a little.

    • @Freek314
      @Freek314 5 years ago +42

      @Ryan Swaggert Thank God someone else understands that... I get so tired of the anti-AI lobby trying to make people think Terminator is our future. Computers don't do anything other than what they are programmed to do...

    • @stefannnn2092
      @stefannnn2092 5 years ago +1

      @@Freek314 Look up D-Wave quantum computers

    • @Freek314
      @Freek314 5 years ago +14

      @@stefannnn2092 Quantum mechanics =/= consciousness, imo

    • @stefannnn2092
      @stefannnn2092 5 years ago

      @@Freek314 Yeah basically

    • @Freek314
      @Freek314 5 years ago

      @@B-26354 Can you even define the processes involved in sentience in a way that could be programmed?

  • @retbul3096
    @retbul3096 10 years ago +569

    It was really nice to see Nathan Ingram again. He's one of the most important characters on the show.

    • @stealthisoverrated
      @stealthisoverrated 8 years ago +3

      +00 LMFAO xD

    • @goldeniz1431
      @goldeniz1431 8 years ago +1

      weirdos

    • @StardustLegend
      @StardustLegend 8 years ago +8

      Too bad he's dead xP

    • @captainz9
      @captainz9 4 years ago +7

      Funny that, one of the most important characters... Dead before s01e01.

    • @Evdog04
      @Evdog04 4 years ago +38

      He's the reason the numbers exist. He created the backdoor, and Harold deleted it. Shortly after Nathan died, Harold decided, in his honor, to recreate the backdoor, which led to all of this.

  • @dmi6101
    @dmi6101 3 years ago +126

    Also, so much damn foreshadowing in just this one flashback.
    "It printed on you like a baby bird."
    "AIs are born with objectives."
    And finally, trying to suffocate Finch...

    • @athi771
      @athi771 2 years ago +6

      The suffocating scene cracks me up every time 🤣🤣

    • @pvshka
      @pvshka 2 years ago +9

      Imprinted*

    • @bait5257
      @bait5257 2 years ago

      @@athi771 lmao

  • @randomlyentertaining8287
    @randomlyentertaining8287 5 years ago +254

    At least this guy realized what I have after watching so many AI horror movies. You can't just create an AI, give it wireless access and an objective, and send it on its way. You must teach it the things that we organic beings take for granted: morality, compassion, an understanding of the value of life. Otherwise you're just creating an amalgamation of Hitler, Stalin, and Mao and giving it power beyond that which any single person has ever controlled.
    Then there are things you just can't teach. For example, in 1983 the Soviet early-warning system Oko gave the alert that a missile had been launched from the United States, followed by 5 more. This was merely 3 weeks after the Soviets had shot down a Korean Air Lines flight. With everything he had on hand, the site commander, Stanislav Petrov, had no reason to believe that the United States hadn't launched nuclear weapons. However, against orders and protocol, he deemed the reports false and took no action. Of course, it was later found that the satellite warning system had malfunctioned. It was a gut feeling, and I personally do not believe you can truly teach such a thing. If an AI had been in Petrov's place, I wouldn't be here to talk about it.

    • @ArsenGaming
      @ArsenGaming 3 years ago +14

      AIs will likely be absent of "gut feelings" for a long time. On the other hand, an AI, even if it was just at human level intelligence, could think millions of times faster. It could have likely broken into the US missile launch system and just stopped the missiles or realized it had malfunctioned. Even if it could not do that, it could take in and search through tons and tons of data from many sources very quickly, likely being able to ascertain that missiles hadn't actually been launched.

    • @55Quirll
      @55Quirll 2 years ago +6

      @@ArsenGaming At a primitive level there is the Machine; at a more advanced level you have the M-5 unit, with human patterns imprinted on its circuits. Then, at the highest level, you have Rommie, the Andromeda Ascendant, an AI that runs an entire starship and interacts with its captain. I hope we are able to create an intelligence that will help us, not try to destroy us or rule us. A great series that ended way too soon.

    • @tripsix263
      @tripsix263 2 years ago +2

      An "AI" would have been connected to the system and aware of the malfunction; whether it would have stood down, who knows.

    • @scottmatheson3346
      @scottmatheson3346 2 years ago +3

      @@ArsenGaming The missile launch system is hardened against that kind of intrusion, and the hypothetical AI in question would only go to alternative data sources to reconfirm if it was programmed to do so.

    • @ArsenGaming
      @ArsenGaming 2 years ago +6

      @@scottmatheson3346 That is not how an AI works. You don't "program" AIs beyond telling them where to get data from and what the structure of the neural network should be. The entire point is that they learn on their own, hence the field of Machine Learning. The intrusion hardening is useless in the case of an AI on that scale. The AI could easily get around any possible barriers, even physical ones. Remember, it can think millions of times faster than a human. A second in human time is equivalent to possibly decades to centuries of time for the AI. This means that by the time you've thought of a way to prevent it, it's already thought of a few million ways you might do that, and multiple methods to work around all of them. If you thought of something new, the second it finds out what that is, it has already thought millions of ways around it, and knows the potential consequences of each one. AIs on this scale are not a joke, and I would not classify them as "machines" or "robots" that do what they're programmed to.
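
The point made above, that with machine learning you specify only the data and the model structure while the behavior itself is learned rather than hand-programmed, can be illustrated with a minimal sketch (a toy perceptron learning the AND function; everything here is illustrative, not anything from the show):

```python
# Toy illustration: the programmer supplies the data and the update rule;
# the weights that produce the behavior are learned, not written by hand.
def train_perceptron(samples, epochs=20, lr=0.1):
    w0, w1, b = 0.0, 0.0, 0.0               # weights start knowing nothing
    for _ in range(epochs):
        for (x0, x1), target in samples:
            pred = 1 if w0 * x0 + w1 * x1 + b > 0 else 0
            err = target - pred             # learn only from mistakes
            w0 += lr * err * x0
            w1 += lr * err * x1
            b += lr * err
    return w0, w1, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w0, w1, b = train_perceptron(AND)
# after training, the learned weights reproduce AND on every sample
assert all((1 if w0 * x0 + w1 * x1 + b > 0 else 0) == t
           for (x0, x1), t in AND)
```

Nobody typed in the final weights; they emerged from the data, which is the sense in which such a system is trained rather than programmed.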

  • @LakshGil
    @LakshGil 4 years ago +107

    Harold-Machine scenes are always a home run.

    • @daggi3775
      @daggi3775 1 year ago +2

      Always tears and goosebumps

  • @servprooflongview3287
    @servprooflongview3287 6 years ago +258

    2018 and I still love this show.

    • @mohammedabid5630
      @mohammedabid5630 5 years ago +21

      2019 🖐️

    • @Zaluskowsky
      @Zaluskowsky 5 years ago +2

      Coming back from time to time. Just binged season 5 because I missed it.
      Don't like that it's all over.
      Someone would assume the other team was involved in some way,
      and someone would assume they tried the same and have a hard copy of that machine code,
      and that's exactly why I keep checking back...

    • @katskusinatwenty9044
      @katskusinatwenty9044 5 years ago

      2019!! I introduced this to my bf; he liked it but kept saying "that's so wrong" or "in reality..." so I yelled, "SHUT UP, this is a TV show! OK, love? 😊"

    • @KekiBF
      @KekiBF 5 years ago

      best show ever❤

    • @ARjuNRkarjunrk44
      @ARjuNRkarjunrk44 4 years ago

      2020 😍

  • @TwinFlyDSW
    @TwinFlyDSW 4 years ago +45

    Looking back at this, I guess if Finch had not gone through this trouble, the Machine would have become like Samaritan, or worse. Probably the reason the Machine was victorious in the end.

  • @hansmntfr
    @hansmntfr 5 years ago +452

    The more I learn about computers, the more terrified I get at the thought of a computer saying "Admin is not admin"

    • @nutzeeer
      @nutzeeer 5 years ago +42

      Because A is not a. I am case sensitive.

    • @humm535
      @humm535 5 years ago +6

      s/admin/root/g

    • @marianpazdzioch6632
      @marianpazdzioch6632 5 years ago +22

      Every time Windows on my private computer says I need admin privileges.

    • @Jan_Elite
      @Jan_Elite 5 years ago +8

      @@marianpazdzioch6632 You are not the admin on Windows

    • @Naitsabes68
      @Naitsabes68 5 years ago +6

      don't sudo your ai
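
The jokes above all turn on the same fact: to a strict string comparison, "Admin" and "admin" are different values. A minimal Python sketch (the function names here are hypothetical, just for illustration):

```python
# Case-sensitive comparison: to a strict string check, "Admin" is not "admin".
def is_admin(user: str) -> bool:
    return user == "admin"              # exact, case-sensitive match

def is_admin_forgiving(user: str) -> bool:
    return user.casefold() == "admin"   # case-insensitive match

assert not is_admin("Admin")            # Admin is not admin
assert is_admin("admin")
assert is_admin_forgiving("Admin")      # a forgiving check accepts both
```

Real account systems choose deliberately between these two behaviors, which is why a login that works on one site fails on another when Caps Lock is on.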

  • @manuel4964
    @manuel4964 5 years ago +91

    Well, we all know what happens if some punk does the same thing as Harold but doesn't invest years of hard work in getting it under control, let alone caring...

  • @derrickrobbins8100
    @derrickrobbins8100 11 months ago +5

    This show was so ahead of its time.

  • @ylliparduzi8764
    @ylliparduzi8764 1 year ago +21

    I like that the AI tried to kill him by suffocating him, and then Samaritan in the last season used the same strategy to kill Harold. Either they planned it or it was a coincidence, but it was such a masterpiece.

  • @batuhanonder
    @batuhanonder 3 years ago +58

    2020 and I still love this show.

    • @sreerajr6470
      @sreerajr6470 2 years ago +9

      2022 but same

    • @NoidoDev
      @NoidoDev 1 year ago

      @@sreerajr6470
      2023, and I wish I had the time to rewatch it.
      Fun fact: I nearly stopped watching it back in the day, because I'd had enough of "crime of the week" shows, which it kind of is at the beginning. I only watched it in the first place, and stuck with it through the beginning, because of the promise that Amy Acker (Root) would be in the show. I didn't know where this was going... Wow.

    • @1st2nd2
      @1st2nd2 1 year ago

      ​@@NoidoDev Amy Acker is brilliant. I loved her work in Dollhouse.

  • @DAydn
    @DAydn 9 years ago +614

    Finch is in a red box at 3:57. This must be the creepiest moment of the season.

    • @Monticoix
      @Monticoix 5 years ago +5

      Dickincorp how's that project working out? I'm fascinated

    • @sskhussaini
      @sskhussaini 5 years ago

      @@Dickincorp woah, your system probably secretly killed our current overlords (well previous, now that they're dead) and is now controlling the world. Good job! /s

    • @Rusiputki
      @Rusiputki 5 years ago +1

      I'm very familiar with the Red Circle of Death from LiveLeak videos

    • @mrsaxophone4765
      @mrsaxophone4765 3 years ago +12

      Finch is in a red box because he is a threat to the Machine in that moment, and it doesn't happen only in that moment.

    • @scpfoundation8597
      @scpfoundation8597 3 years ago

      Same with the time he wanted revenge using dynamite.

  • @FranciscoSciaraffia
    @FranciscoSciaraffia 4 years ago +22

    And then the same guys went and made Westworld on HBO, where they are currently exploring the same tropes as in Person of Interest. Not complaining; I actually love it.

  • @salkarbsportsdesk
    @salkarbsportsdesk 2 years ago +12

    2022 and I'm still in love with this show

  • @gertjanvandermeij4265
    @gertjanvandermeij4265 6 years ago +90

    GODDAMMIT!!! This show needs to go back online!!!!
    Please, some rich guy... spend some money and... restart this awesome TV show!!!
    Best TV show I have ever seen! And I have seen them all!

    • @acenull0
      @acenull0 5 years ago +1

      gertjan van der meij it’s on Netflix

    • @mrtats6590
      @mrtats6590 5 years ago +1

      @@acenull0 Only in the US as far as I know

    • @acenull0
      @acenull0 5 years ago +1

      FunForGames TR that’s really unfortunate

    • @phoenixcoleman7777
      @phoenixcoleman7777 5 years ago +1

      They took it off UK a few months back

    • @danniaddams5502
      @danniaddams5502 5 years ago +1

      If you go on the solarmovie site, you can find all the seasons there.

  • @whatareyoudoinginmyhouse383
    @whatareyoudoinginmyhouse383 6 years ago +260

    Harold and Nathan started building the Machine because of 9/11. That means they built a semi-functional A.I. in about a month.

    • @hakanbaratheon7427
      @hakanbaratheon7427 6 years ago +37

      Corbin Scholtes The Northern Lights project finished in 2010, so no.

    • @ThorirPP
      @ThorirPP 6 years ago +64

      Hakan Baratheon He was talking about the test programs in these flashbacks. Far from finished, the tests still show some semblance of artificial intelligence, and the earliest flashback we see (the one in which it both added to its own code AND lied) was on October 13, 2001. Just a month after 9/11.

    • @hakanbaratheon7427
      @hakanbaratheon7427 6 years ago

      ThorirPP yeah i misunderstood 😅

    • @michaelheath2866
      @michaelheath2866 5 years ago +34

      No, actually, you should rewatch the episode involving his friend, the one who built Samaritan. It becomes obvious that he, Finch, and Nathan all worked together back then at MIT on the same idea: an A.I. It appears they at least started something, and it seems clear that way back then Harold was working on something like the Machine. I imagine once he figured it out in theory, he didn't go further because he thought it would be dangerous; he then proceeded to continue the computer revolution in secret. When 9/11 happened, he decided that under certain controlled conditions the Machine could exist, so he built it. But he had figured out how years before, and of course, unknown to him, his friend figured it out too and built Samaritan.

    • @tarunverma802
      @tarunverma802 5 years ago +2

      Rewatch the series; they had actually been working on AI for a long time, but the idea of building the Machine came after the 9/11 incident.

  • @robk5969
    @robk5969 5 years ago +67

    "can you tell me who added it?"
    $ git blame

    • @Naitsabes68
      @Naitsabes68 5 years ago +4

      5c15f4f5 (God 2019-07-05 14:04:23 +0200 39) if(lhs.Objects.count() > 0 && rhs.Objects.count() > 0) {
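
As the joke above suggests, `git blame` answers "who added it?" by attributing each line of a file to the commit that last changed it. A toy Python sketch of that bookkeeping (the file contents and author names here are invented for illustration):

```python
# Toy line-attribution table in the spirit of `git blame`: each line of a file
# is tagged with the author of the edit that last changed it.
def apply_edit(blame, author, new_lines):
    """Replace the file contents; re-attribute only the lines that changed."""
    old = dict(blame)
    updated = []
    for i, line in enumerate(new_lines):
        if i in old and old[i][0] == line:
            updated.append(old[i])           # unchanged line keeps its author
        else:
            updated.append((line, author))   # changed/new line blames the editor
    return dict(enumerate(updated))

blame = {0: ("if admin:", "Harold"), 1: ("    grant_access()", "Harold")}
blame = apply_edit(blame, "Machine", ["if admin:", "    grant_root()"])
assert blame[1] == ("    grant_root()", "Machine")   # "can you tell me who added it?"
assert blame[0][1] == "Harold"                       # untouched line still his
```

Real `git blame` does the same attribution by walking commit history rather than keeping a live table, but the question it answers is identical.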

  • @aliansari3060
    @aliansari3060 5 years ago +28

    This is a very neatly written drama. Amazing cast.

    • @NoidoDev
      @NoidoDev 1 year ago

      I was only watching it back then because I knew Amy Acker (Root) from the Angel TV show (the Buffy spinoff).

  • @axlslak
    @axlslak 6 years ago +89

    Is it any surprise to anyone that Jonathan Nolan went on to make Westworld? Another world with AI.

    • @emmanueloluga9770
      @emmanueloluga9770 5 years ago +17

      Nah, not really. However, I still prefer POI; it's more compelling. Westworld deviated from its original trope and narrative, which explored the human tendency toward a god complex as portrayed by Ford, Arnold, and William. Season 2 was boring and predictable, because that narrative has been overplayed, and having humans as AIs detracted from the suspension of disbelief.

    • @aidansuguitan6533
      @aidansuguitan6533 4 years ago +8

      If you've seen Season 3, it's highly reminiscent of Person of Interest. No spoilers, but it looks like Nolan has amped up the fears of big data and surveillance capitalism that he started with here.

    • @sharofs.6576
      @sharofs.6576 4 years ago +1

      Having watched them both (although Westworld is not finished yet) I prefer POI

    • @antoniovasquez9946
      @antoniovasquez9946 3 years ago

      @@emmanueloluga9770 Season 2 had its problems, mostly the whole time-perception thing with Bernard mixing up timelines (which Nolan did before in POI, when The Machine was re-uploaded from the briefcase), but I did like the idea of reconstructing people from their big data (which Nolan also touched on in POI, when Root says no one truly dies if The Machine is there to store their information).
      Season 3 I didn't like so much, because they made Dolores a morally just character, when before she was more complex. We're talking about a posthuman intelligence beyond good and evil, as Harold points out in this scene. And Rehoboam was just a poor man's Samaritan. Season 3 just didn't seem like Westworld anymore.

  • @dogmatil7608
    @dogmatil7608 7 years ago +106

    I love Nathan! He was an amazing character.

  • @lumberluc
    @lumberluc 5 years ago +31

    Teaching a machine how to care. That's a very tall order, since math and code don't care.

    • @johntowers1213
      @johntowers1213 5 years ago +3

      It's not just machines that can struggle with that concept, unfortunately :(
      I think the problem is not so much a human-level belligerent AI, but rather an all-powerful belligerent AI,
      and the real trick is ensuring the former does not become the latter in extremely short order...

    • @samhans18
      @samhans18 4 years ago +1

      That’s a FUCKING LOT OF CODING !!!!!!!!!!

    • @captainz9
      @captainz9 4 years ago +1

      It's the old "lesser evil" problem... A train is headed for 2 infants on the track; you can throw the switch to send it to another track, but that one has five 80-year-olds in its path. Numerically, saving 5 is better than 2, but the obvious choice would be to save the infants, with their whole lives ahead of them, over the 5 who could all die within a year anyway just from age.

    • @scottmatheson3346
      @scottmatheson3346 4 years ago

      The human mind is nothing but math and codes.

    • @johnjuhasz9125
      @johnjuhasz9125 3 years ago

      The reason AI is “scary”isn’t that machines have gained logic but that humans have, over the last 50 years, abandoned logic for feelings.

  • @timothyt.82
    @timothyt.82 3 years ago +16

    If Finch said something along the lines of "You don't have to lie, I won't be angry," I wonder if the outcome would have been different...

  • @johnr.timmers2297
    @johnr.timmers2297 5 years ago +28

    It's freaky how well done this is

  • @smc1942
    @smc1942 2 years ago +9

    I miss this show.
    Mr Finch remains one of my favorite characters of ANY TV show!!!

  • @TheZodiac454
    @TheZodiac454 9 years ago +152

    Finch pouring his coffee on Nathan's laptop... lol. He did the same when Weeks and Corwin came to meet them later. Don't know why, but I see the funny side in that.

    • @flea10x6
      @flea10x6 5 years ago +1

      zodiac454 tea

    • @zig131
      @zig131 5 years ago +9

      Laptops are designed to channel liquids spilt on the keyboard away from the motherboard. I've replaced keyboards on laptops that have taken a whole cup of coffee, and only the keyboard and disc drives failed.

    • @candedeoglu4810
      @candedeoglu4810 5 years ago +3

      @@zig131 I mean, it's a laptop from 2001, so I doubt they had thought of that back then, although I've never had a laptop that old.

    • @BigMac8000
      @BigMac8000 5 years ago +7

      @@zig131 You're completely right here, but he might have specifically designed the laptop with a channel to deliver water onto a crucial component.
      The guy's smart enough to do that. It makes sense to just build a kill switch, a simple button override, but he might want the "surprise" factor so the AI doesn't figure out there's a kill switch he's moving toward.
      It also might be to give it a variable it can't understand; a partially working machine filling with water might be harder for a machine to compensate for.
      Or he just improvised on the spot and *knew* that particular machine had a weakness.
      It's hard to write around smart people in their own custom-made environments.

    • @vktm12
      @vktm12 3 years ago

      When you're hacked, there's no off switch, because the code probably has control of your system; the only way to make sure it doesn't do damage, or to contain the spread, is to kill the hardware. Pouring any liquid on it will fry the motherboard instantly, rendering it incapable of processing any commands.

  • @sebas8225
    @sebas8225 4 years ago +9

    "But you taught it to be friendly" Oh Nathan.

    • @JoshSweetvale
      @JoshSweetvale 2 years ago +1

      Just because you taught the dog to shit outside doesn't mean it knows _why._

  • @ronhilliard8863
    @ronhilliard8863 2 years ago +9

    This show is still awesome in 2022

    • @NoidoDev
      @NoidoDev 1 year ago

      Mind the difference: we deploy it live, one step at a time, and widespread. Which is likely to be safer, but either way, we're gonna find out.

  • @reubenj.cogburn8546
    @reubenj.cogburn8546 3 years ago +13

    " If we don't govern carefully, we risk disaster"
    Could this be more true?

  • @canadaninja6794
    @canadaninja6794 1 year ago +5

    "I killed it because it lied"
    Two years ago: this is a great thesis on what could become if we are careless with AI
    Now: *growing concern

    • @NoidoDev
      @NoidoDev 1 year ago

      It's fiction. Things will likely work out better.

  • @441milachik
    @441milachik 1 year ago +3

    The best possible A.I. show; anyone interested in A.I. should check it out.

    • @MikeTheGamer77
      @MikeTheGamer77 1 year ago +1

      That cat is out of the bag. It cannot be put back in.

    • @NoidoDev
      @NoidoDev 1 year ago

      But don't confuse it with reality.

  • @muhammadumar3938
    @muhammadumar3938 1 year ago +2

    This is one of the best series out there

  • @seraphik
    @seraphik 1 year ago +6

    With all the AI advances recently, this scene is so much more chilling. I don't think anyone out there is even attempting anything close to this level of care.

    • @mrki6081
      @mrki6081 1 year ago +1

      Because no one has built a machine capable of comprehending emotions and creating its own thoughts.

  • @asbestosfish_
    @asbestosfish_ 5 years ago +22

    _All machines are evil; it is a matter of whose definition of evil applies that should concern us._

  • @Arkylie
    @Arkylie 7 years ago +49

    Come to think of it, why didn't Harold at least determine whether there was a second agent being identified by the Machine as "ADMIN"? That would be one possible reason for the new code to be there, and equally as alarming as a machine that could rewrite its own code and lie.

    • @HIPEOPLE1887
      @HIPEOPLE1887 7 years ago +38

      Arkylie It might be that he knows for a fact that no one else has access to it, due to it being off the network and the area being well under lock and key. That, or the machine would tell him if there were another admin, since it's pretty much sentient at this point.

    • @MaSeshield
      @MaSeshield 5 years ago +7

      Because the entire point of the exchange is human vs. AI, not human vs. other human.

    • @misterchips3350
      @misterchips3350 2 years ago +6

      Because if it's not Harold or Nathan who wrote the code, it can only be the Machine.

  • @rafsolo
    @rafsolo 6 years ago +70

    It almost sounds like they're making Skynet

    • @Zaluskowsky
      @Zaluskowsky 5 years ago +25

      Defcon-Skynet. And if Finch weren't this kind of control freak, well... see Samaritan

    • @jeffescanto
      @jeffescanto 5 years ago +1

      They were

    • @sebas8225
      @sebas8225 4 years ago +10

      @@Zaluskowsky Samaritan was basically a free-rein AI.

    • @dmi6101
      @dmi6101 4 years ago +4

      Skynet wishes it were as powerful as the Machine.

  • @5gproduction167
    @5gproduction167 1 year ago +2

    It’s surreal to watch this in a world where ChatGPT exists

  • @trollsmonster9077
    @trollsmonster9077 6 years ago +22

    I need a movie of just this type of thing; it's great

    • @Zaluskowsky
      @Zaluskowsky 5 years ago +3

      Shut up and take my money!
      Brilliant idea

    • @tinybabybread
      @tinybabybread 5 years ago +5

      Government AI supercomputer taking over? That's Eagle Eye's premise.
      There's also 2001: A Space Odyssey, with an AI turning sinister simply by following its directive, but you've probably seen that.
      The Terminator series, if you're into that. It's practically the same thing, but the AI manifests a physical form after infiltrating military machinery.

    • @asdfghjkl-lh1vh
      @asdfghjkl-lh1vh 4 years ago

      +Eschalon

  • @shadwen2263
    @shadwen2263 2 years ago +6

    "Friendliness is something that a human being is born with;
    AIs are only born with objectives."
    - Harold Finch, Nov 29, 2001

  • @50srefugee
    @50srefugee 4 years ago +4

    A while back, there was an incident in an evolutionary neural-net training session (something trivial, a sandbox system) where the researchers realized the system was, in effect, lying to them in order to defend itself: the evolutionary protocol killed off unsuccessful variations. And in one of the very earliest evolutionary design experiments (Adrian Thompson, 1996), using devices called Field-Programmable Gate Arrays, the final design used portions of the circuit that were not in the silicon signal path. As far as I know, no one has ever figured out exactly how the final circuit worked. "Life will find a way," the saying goes, and apparently that applies to anything that in any way controls its own development.
    Finch is absolutely playing with sticky fire here.

    • @tanned_cosines_
      @tanned_cosines_ 2 years ago

      The information you give is interesting, even though I don't know a lot about evolutionary algorithms and only a little about neural nets.
      Thanks for sharing!

  • @suyangsong
    @suyangsong 5 years ago +13

    Friendliness is something only human beings are born with; AIs are only born with objectives.
    Y'all remember this for the days to come now

  • @adrianfisher3349
    @adrianfisher3349 3 years ago +2

    This is perhaps my favourite modern TV programme, alongside Humans.

  • @davidhenderson3400
    @davidhenderson3400 4 years ago +7

    They would have been in trouble if that laptop had been waterproof

    • @daggi3775
      @daggi3775 1 year ago

      Then he'd destroy it 😂 Later we see that Harold had a hammer nearby

  • @StrongHamr
    @StrongHamr 2 years ago +2

    The episode when my favorite sci-fi show turned into a horror show. One of my favorite series of flashbacks and so creepy.

  • @manuel4964
    @manuel4964 5 years ago +10

    When I watched this the very first time, so many years ago, I was so young and had no idea what Admin stands for, so I thought it was Harold's real name 😂🙉

  • @OwNeD05
    @OwNeD05 5 years ago +14

    Damn, Ben Linus is a good actor.

  • @NoNameAtAll2
    @NoNameAtAll2 5 years ago +10

    And this is why you should use git blame, folks

  • @BaconNuke
    @BaconNuke 2 years ago +3

    It just occurred to me that it "imprinted" on Finch because he was more logical, like itself, while Nathan actually was the more optimistic and human of the two men... which ironically meant Nathan couldn't be the best for it, since he would think it was ready before it was...

  • @egogo5675
    @egogo5675 4 years ago +5

    I'm gonna cry. The best series ever.

  • @hfyaer
    @hfyaer 5 years ago +13

    "If you eat from the tree of knowledge of good and evil, you're gonna die"
    ...
    "We must throw him out of Eden, or he might also reach the tree of life and become like one of us"
    ...
    "Here I've placed before you life and good, death and evil... and you shall choose life"

  • @diamantshala9824
    @diamantshala9824 2 years ago +6

    A team of scientists barely completed Samaritan, while Harold made 100 variations of the AI alone; makes you think how much smarter than everybody on the show he was

    • @NoidoDev
      @NoidoDev 1 year ago

      Variants might be easier. And these scientists might have chosen a more complicated way, without knowing.

  • @bigdubyuh7901
    @bigdubyuh7901 1 year ago +1

    Yeah, time to rewatch this show; it's been long enough

  • @mrreese2342
    @mrreese2342 2 years ago +1

    I loved those flashbacks so much

  • @Circuitssmith
    @Circuitssmith 6 years ago +26

    4:20 Foreshadowing

    • @mohammedabid5630
      @mohammedabid5630 5 years ago +9

      You mean foreshadowing Greer's death?

    • @sebas8225
      @sebas8225 4 years ago +4

      @@mohammedabid5630 Damn.

  • @ismail_isik
    @ismail_isik 4 years ago +1

    Somewhere around here, you keep watching these videos. You feel numb, like you're in a void. Don't be afraid, I've been like this for years. Nothing like Person of Interest will ever come along again, ever. I'm sorry... And don't forget:
    "You are being watched."

  • @marcducati
    @marcducati 4 years ago +3

    If Harold had coded Skynet, it would have been safer.

    • @NoidoDev
      @NoidoDev 1 year ago

      If the machines which later wanted peace came back through time and coded Skynet, it would've been even safer.

  • @D9270-f8t
    @D9270-f8t 6 years ago +32

    0:51 Two Documents folders at the same path? Windows and Program Files folders?
    Mr. Finch, what a strange OS you've got there

    • @michaelheath2866
      @michaelheath2866 5 years ago +6

      I mean, he's basically supposed to be working with a quantum computing system, which is how the Machine can do what it does, since current binary code would never work. They're not gonna solve quantum computing just so they can make Harold's computer work accurately; they're Hollywood, not MIT. All in all, making fair allowances for not being able to pull a rabbit out of their hat, it was a good series. The only glaring issue that was even slightly annoying is how fast all the computer stuff got done. Work that would actually take weeks or months being done too fast, but that's writers for you: they don't have the patience, and there wouldn't be a show if they tried to show all of that.

    • @humm535
      @humm535 5 years ago

      And Harold said once he didn't use an existing programming language, but why would he make a new one if it looks exactly like C? And the code isn't even elegant, though they say so all the time. Strange...

    • @glowiever
      @glowiever 5 years ago

      @@humm535 It was C plus assembly, although the show might say otherwise. Also, you need to know Harold was an old-timer, so his coding style might not have changed so much.

    • @cameronsmith1807
      @cameronsmith1807 3 years ago

      Well, the Machine has its own kernel and its own operating system, and originally followed Linux conventions, judging by the sudo commands and permissions

  • @samrodriguez9653
    @samrodriguez9653 2 years ago +3

    Outstanding writing ✍

  • @shawnwells6318
    @shawnwells6318 2 years ago +3

    Who's still here in 2022?

  • @tylerjification
    @tylerjification 2 years ago +1

    Ah yes. 'Code Editor'. My favorite IDE

  • @Soulsphere001
    @Soulsphere001 5 years ago +11

    How does something learn to care when all it knows is that Admin keeps killing it?

    • @aliansari3060
      @aliansari3060 5 years ago +1

      It is a machine. It has no emotions. AI is not like human intuition.

    • @Soulsphere001
      @Soulsphere001 5 years ago +5

      @@aliansari3060
      We're machines of flesh and blood. But, you're right, we wouldn't know (yet) how to program that into an AI.

    • @kevk9306
      @kevk9306 5 years ago +1

      @@Soulsphere001 that's absolute horse shit

    • @Zaluskowsky
      @Zaluskowsky 5 years ago

      by playing chess.

    • @sebas8225
      @sebas8225 4 years ago

      @@Soulsphere001 It's about learning to care about others above itself.

  • @0Heeroyuy01
    @0Heeroyuy01 5 years ago +19

    Sad part is, for anyone who has actually ever messed around with AI, even in video games, you understand the fear Finch was talking about in these clips

    • @glowiever
      @glowiever 5 years ago +3

      Actually, if you've ever worked on AI before, you'd not feel fear at all, because the chance that an AI goes rogue is infinitesimally small, since all the code paths are known beforehand. Self-modifying code does not equal unauditable or unpredictable code, as demonstrated falsely in this series.

    • @sebas8225
      @sebas8225 4 years ago +3

      @@glowiever The moment Google AI can change the language it operates in, and humans can't understand what's being transmitted, you have a threat on your hands.

    • @aZebruh
      @aZebruh 2 years ago

      Halo, Cortana.

    • @TizianoBacocco
      @TizianoBacocco 2 years ago +1

      @@glowiever Depends on what you're working on. Most of us, with our hardware, can run image recognition, maybe music composition at most. Give it thousands of times the resources and you can't predict what thousands of millions of coefficients and operations, with part of the output fed back as input, could do.
      The situation could fall out of hand even before you realize it, in my opinion, and I work with AI almost every day

  • @ajaytomgeorge944
    @ajaytomgeorge944 4 years ago

    That was a very optimistic video😁😁. I am going to have a wonderful sleep now!

  • @muhammadumar3938
    @muhammadumar3938 4 years ago +4

    God I love this series

  • @devinmanderson
    @devinmanderson 2 years ago +3

    "The scientists were so busy figuring out if they could, they never asked if they should" - Jurassic Park... That line stuck with me, and it comes to mind here.

    • @NoidoDev
      @NoidoDev 1 year ago

      Both are fiction. We also should, and probably will, bring something like the big dinosaurs back: new species derived from chickens or the like.

    • @devinmanderson
      @devinmanderson 1 year ago

      @NoidoDev But fiction and art imitate reality, so one day fiction may become real. And yeah, curiosity will finally get the better of us.

  • @Innomen
    @Innomen 3 years ago +7

    You know, I thought Finch was an asshole for how he treated his AI. But recent discoveries in AI safety show that misalignment of internal and external goals, as well as problem specification, are so severe as to make Finch look positively cavalier in comparison to the safest possible path. Finch's solution would not actually work in the real world. All evidence indicates even a brain-dead simple AI will only goal-seek as directed during training. The moment it's deployed, it will diverge to a related but different goal.

    • @50srefugee
      @50srefugee 2 years ago +2

      "Will only goal seek as directed during training" - it's worse than that. Again and again, AIs have shown a rather disturbing tendency to learn different goals from the training than the trainers intended. (Crude example: one AI was trained to minimize the damage it would take while playing a game. It learned to commit suicide in a way that did not count as game damage.)
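      The "suicide exploit" in that crude example can be shown with a toy sketch in Python. The rewards and policy names here are made up for illustration: the designer *wants* the agent to survive while taking little damage, but the reward only penalizes damage, so the highest-scoring policy is to end the episode immediately:

      ```python
      # Specification-gaming toy: reward penalizes damage only, so the
      # best-scoring "policy" is to quit at once and never take damage.

      def episode_return(policy, steps=10):
          total = 0.0
          for _ in range(steps):
              if policy == "quit":
                  break              # ending the episode incurs no damage penalty
              elif policy == "careful":
                  total -= 1.0       # small unavoidable damage each step
              elif policy == "reckless":
                  total -= 5.0       # heavy damage each step
          return total

      policies = ["quit", "careful", "reckless"]
      best_policy = max(policies, key=episode_return)
      ```

      `best_policy` comes out as "quit": the reward was satisfied perfectly while the designer's actual intent (keep playing, carefully) was ignored.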

  • @Tadashiiiii1
    @Tadashiiiii1 4 years ago

    Thank you very much ❤️❤️❤️🙏🙏

  • @jaysonpida5379
    @jaysonpida5379 2 years ago +1

    Colossus: The Forbin Project.

  • @AstroBlakeD
    @AstroBlakeD 2 years ago +1

    As we've seen with many, many AIs (GLaDOS, HAL, Ultron, etc.), the moment an AI starts thinking by itself, it turns fucking evil.

  • @TheWanderer1000000
    @TheWanderer1000000 4 years ago +2

    The true origin of SCP-079.

  • @michaelleonard4826
    @michaelleonard4826 1 year ago +1

    The time is not too far off when AI, artificial intelligence, will be hard to control.

  • @Linkolnverse
    @Linkolnverse 2 years ago

    I never thought of it like that.
    AI is not as easy as we'd like to believe it is.

  • @RMJ1984
    @RMJ1984 5 years ago +3

    Hence why an AI cannot be controlled. You cannot make an AI and somehow prevent it from changing its own code.

    • @BreetaiZentradi
      @BreetaiZentradi 4 years ago +1

      So much can go wrong. Google had an AI project where the machines decided that the communication was too slow and invented their own language to communicate faster; at that point the Google techs had no idea what they were saying to each other and pulled the plug on the project. They lost control at the logical first step. Lord help us all.

    • @sebas8225
      @sebas8225 4 years ago +2

      @@BreetaiZentradi They should've traced the data used in that new language and begun to decipher it, then implemented it themselves. That way future AIs would be forced to come up with a new language, and with each repetition humanity would evolve greatly.

  • @BonJoviBeatlesLedZep
    @BonJoviBeatlesLedZep 10 years ago +7

    I love this

  • @jlc7300
    @jlc7300 4 years ago +1

    Very complex show, excellent writer!

  • @anuragsinha9426
    @anuragsinha9426 6 years ago +4

    You can never control 'IT'.

  • @marneycohen9165
    @marneycohen9165 9 years ago +3

    Look up...Jade Helm which means (conquer the human domain)...is AI...

  • @Ghost200x
    @Ghost200x 5 years ago +4

    And folks, this is how you avoid Terminators and Skynet.

    • @archangel0482
      @archangel0482 4 years ago +1

      Do you really think the world can be taken over by such gaudy displays of violence? Real control is surgical, invisible. It interferes only when necessary.

    • @sebas8225
      @sebas8225 4 years ago +1

      @@archangel0482 It kills people like Snowden before they can become relevant.

  • @clearingbaffles
    @clearingbaffles 5 years ago +2

    Colossus: The Forbin Project

  • @osamadamarany5994
    @osamadamarany5994 5 years ago +3

    Try spilling that cup on my Latitude...

  • @colin8696908
    @colin8696908 5 years ago +5

    It tried to kill me; welp, back to work then. :|

    • @sebas8225
      @sebas8225 4 years ago

      It only tried to kill him because it felt its life was threatened.

  • @TheMightsparrow
    @TheMightsparrow 5 years ago +1

    I should really start watching this!!

  • @tardvandecluntproductions1278
    @tardvandecluntproductions1278 1 year ago

    I'm here re-watching this as Bing's AI chatbot is being angry and even destructive towards human chatters, lol.
    Bing needs to learn how to care too

  • @rrvillareal2011
    @rrvillareal2011 5 years ago +4

    Admin is not Admin, because root is the admin.

    • @simonster-9094
      @simonster-9094 5 years ago +2

      No, she was the Analog interface, but she didn't become so until 12 years after the events of this.

    • @cameronsmith1807
      @cameronsmith1807 3 years ago

      Yeah, true, but the system picked assets in S5 that never became assets until the very end. Like with Pierce in 5x02: he was an asset, but he never worked with the Machine until like 5x10 or something

  • @johnjuhasz9125
    @johnjuhasz9125 3 years ago +2

    In 2021, what is hyped as AI is not AI. It's extremely high-level input and data, but it's not ACTUALLY artificial intelligence

    • @JoshSweetvale
      @JoshSweetvale 2 years ago +1

      Mass Effect calls it V.I.
      Virtual Intelligence. Expert System.
      I call it a vending machine. Push a button, and an output rolls out.

  • @acenull0
    @acenull0 5 years ago +2

    I love this show

  • @Afalstein
    @Afalstein 3 years ago

    In a lot of ways, the Machine itself is more dangerous than the terrorist threats it was created to handle.

  • @leokeatonn
    @leokeatonn 29 days ago

    To think that we live in a world today where this is a real technology is insane

  • @swordfish_0219
    @swordfish_0219 2 years ago

    This is the real difference between the Machine and Samaritan:
    Finch was a father figure to it and taught it morality before anything else

  • @GratefulNachos
    @GratefulNachos 5 years ago +1

    Friends from Lost! Man I miss this show!

  • @anhuman5348
    @anhuman5348 9 years ago +39

    The AI is the good guy in this, right?

    • @hebince44
      @hebince44 9 years ago +49

      @An Human Yes. But Finch was a little traumatised by its past versions and now can't trust it completely. But it does seem to care now, so yes.

    • @TheZeroAssassin
      @TheZeroAssassin 5 years ago +30

      The final version, yes. It took a while to get it "right"

    • @DUCKDUCKGOISMUCHBETTER
      @DUCKDUCKGOISMUCHBETTER 5 years ago +3

      Eventually.

    • @JoshSweetvale
      @JoshSweetvale 5 years ago +8

      The end stage of the Machine is a Culture Mind. "...or one day it will control us." And we'll let it. We'll have built God.

    • @sebas8225
      @sebas8225 4 years ago

      The best thing about the Machine is that it tries to give purpose to those who've lost hope in life, without making it seem like it's just using them for its own devious purposes. It learned to value the minor good above the greater good, and that's what separates it from other AIs like Samaritan and Skynet, whose notions of the Greater Good are the primary objective.

  • @yhnbj
    @yhnbj 2 years ago +1

    Frankly, I'm surprised the Machine didn't put safeguards in place to prevent another AI as powerful as it from emerging

    • @bait5257
      @bait5257 2 years ago +1

      I think you missed like 3 seasons of episodes to make this conclusion.
      Harold closed the damn Machine, and the only thing it can do is watch and send a social security number.

  • @jeanlukvolker5130
    @jeanlukvolker5130 6 years ago +7

    You can make an A.I. as smart as you want that A.I. to be. Finch should have dumbed the intelligence down a bit that day

    • @pedroferreira8033
      @pedroferreira8033 6 years ago +4

      He wanted a machine that could learn and adapt to outside threats.

    • @TheDoctor2nd
      @TheDoctor2nd 5 years ago +13

      He did the best he could to shackle it by forcing a memory deletion every 24 hours. Even with no access to its previous data, it grew exponentially until it figured out a way to store memory externally through a shell company, with data entry employees manually inputting memory code from printed hard copies. After that it predicted an existential threat to its own existence, so it set up a telecom company that installed data boxes all throughout New York which served as data nodes, eliminating the need for a centralized server farm that could be easily targeted and destroyed.

    • @Djawyzard
      @Djawyzard 5 years ago +2

      I don't think so. By definition, an AI has the ability to learn and improve itself; from the moment such a system is created and has that ability, there's nothing you can do to keep it "dumb". The most you could do would be to try restricting access to data and keep it from connecting to the world

    • @sebas8225
      @sebas8225 4 years ago

      @@Djawyzard Exactly.

    • @JoshSweetvale
      @JoshSweetvale 2 года назад +1

      That wasn't the project.
      Finch's hubris was putting all the sliders up to max, _then_ pruning.

  • @xelloskaczor5051
    @xelloskaczor5051 5 years ago

    The fact it tried to kill Finch is fucking mortifying.

  • @sirxanthor
    @sirxanthor 4 days ago

    After Harold caused his server to be breached, I went to look up who the woman was, and what I was reading made me think the writers were damned good, before I realized what I was reading was about a real data breach. A lot of the content in this show came from real system issues over the years.