Comments •

  • @johnnydrydenjr
    @johnnydrydenjr 3 years ago +169

    A cat can be a mousetrap.

    • @elisha2358
      @elisha2358 2 years ago +20

      And this proves the point that a mouse trap is multiply realized. However, it does not prove that a cat can be multiply realized.

    • @elisha2358
      @elisha2358 2 years ago +2

      @UCCvINqJHA036L_BpiSQAjIQ I'm not sure I agree that a lion can be a cat.

    • @Simulera
      @Simulera 1 year ago +4

      @@elisha2358 the use of the word "cat" means it can be multiply realized, but there are type distinctions to be careful about here.

    • @GermanZorba
      @GermanZorba 1 year ago +3

      Most cats are mousetraps😂

    • @gorgzilla1712
      @gorgzilla1712 1 year ago +4

      Well yeah actually

  • @JAYDUBYAH29
    @JAYDUBYAH29 2 years ago +32

    You are a fantastic teacher. This lucid common sense conversational style is so often missing from how many teach philosophy.

  • @hoagie911
    @hoagie911 1 year ago +17

    One of the issues with the attack on the brain-mind identity theory is that, while perhaps octopuses and other animals feel "pain", they may not do so in exactly the same way as we do. That is to say, while the mental states may be similar, they are not the same, and thus there is no issue with the brain states being different. This gives rise to questions of how we can know if mental states are similar or the same, and how rather different brain biologies can lead to (what we assume to be) similar mental states. But, importantly, these seem to be open questions, while the functionalist needs them to be closed in order for their attack on identity theory to hold.

    • @L4wr3nc3810
      @L4wr3nc3810 1 year ago +4

      Good point, though I think you can still argue that no two human brains are really the same, yet we feel the same stuff

    • @VladimirGluten47
      @VladimirGluten47 9 months ago +3

      This was exactly my thinking. Rather than 'pain' being both brain state B and brain state X for humans and octopuses respectively, isn't it more likely that human pain is caused by brain state B, but not X, and octopus pain is caused by brain state X but not B?
      And I feel like we can get as specific and atomic as we would like. My pain is brain state 1, yours is brain state 2, and my pain tomorrow, when I am hurt differently than today, is brain state 3.

    • @hoagie911
      @hoagie911 9 months ago

      @@L4wr3nc3810 But do we feel exactly the same stuff? We certainly feel very similar things, but then our brains are all very similar.

    • @nikhilweerakoon1793
      @nikhilweerakoon1793 5 months ago

      That's a good point. It seems to me that, in that case, it depends on how you define similarity and on your metaphysics of properties. Even if mental states aren't one and the same, perhaps a relevant sense of overlapping similarity between them constitutes realizing the same function, and hence being a mind. It also seems to depend on how you define a "brain": if a brain is strictly biological, a functionalist would better explain the possibility of artificial intelligence coming to have consciousness in some sense. In my view, at least. At the end of the day, though, the debate seems entirely irresolvable to me, due to the epistemic inaccessibility of other people's mental states, let alone those of animals.

    • @bonafide_ok
      @bonafide_ok 4 months ago

      “Pain” doesn’t even have a clear definition to begin with. The identity theory is trash

  • @jamesboswell9324
    @jamesboswell9324 1 year ago +20

    "The function of a cat is to be grumpy and scratch your furniture and stuff like that..." LOL Said like a true dog-lover!

  • @DarisDamaris
    @DarisDamaris 3 years ago +60

    I'm so impressed. Mr. Kaplan, you did such a great job explaining something complex and made it so easy to follow. You are a perfect educational role model! Cheers

    • @bthomson
      @bthomson 1 year ago +2

      I did have some teachers like this but you can NEVER have too many! 😍🙏👍❤💎🎖🏆🏅🎯

  • @a.a.h.s.2345
    @a.a.h.s.2345 3 years ago +41

    I can't thank you enough for these videos. I've been really struggling with an essay regarding the Identity Theory and Functionalism, but these videos have practically saved my life (and grades)! Keep up the great work!

  • @mattmerc8513
    @mattmerc8513 2 years ago +9

    Thank you for making this complex theory so easy to understand in one go. The quality of these vids is top notch.

  • @magnesiumbutincigarette2271
    @magnesiumbutincigarette2271 2 years ago +3

    It is a great achievement that you explain these arguments in an easy way even though texts in the philosophy of mind are so hard to grasp. Thank you so much 🙂✋

  • @captainscarlett1
    @captainscarlett1 1 year ago

    I'm reminded of my training to be an NCO in the army...functional leadership...a leader is someone who carries out the functions of a leader by addressing needs...group needs, individual needs and task needs. This is teachable and doesn't require individual qualities or situational expertise.

  • @mariuszpopieluch7373
    @mariuszpopieluch7373 2 months ago

    This channel is excellent. I teach philosophy of mind and it's my go-to intro for each topic. I really like the energy and enthusiasm; it's a pleasure to watch and listen.

  • @novas.6814
    @novas.6814 3 years ago +12

    This was great. Thank you for taking your time to help us out.

  • @gusbozeman8337
    @gusbozeman8337 2 years ago +6

    this man just saved me with this video 2 hours before I have to turn in my paper. Mr. Kaplan is a godsend

  • @gm2407
    @gm2407 1 year ago

    I am so glad that, before I came across this video, I had already been thinking about the mind this way: I independently used functionalism to describe cognition as requiring something that experiences and an apparatus of any form to abstract the interactive experience, where cognition is the mind's processing of that experience. I feel like I am starting off on the right path for philosophical thinking.

  • @GeneralPublic
    @GeneralPublic 1 year ago +10

    An octopus has what is called a distributed nervous system instead of a central nervous system. All 8 tentacles contain stuff similar to brain stuff, a network of neurons throughout the body. If you chop off a tentacle, the tentacle can survive for a little while on its own, and it shows signs of being sentient, and can react to things just like an octopus, even without a brain. While an octopus does have a central brain, their 8 tentacles act like 8 additional brains, sort of, so they are sort of like 9-brained creatures, except the network of neurons is all one big network, and as long as it’s all connected, the entire octopus functions like one giant brain.
    Since an octopus can be big and its entire body is part of this distributed nervous system that is sort of like a brain, that makes octopi pretty smart compared to other invertebrates. Imagine if your arms and legs had brain matter in them and could think and act immediately without having to send nerve signals all the way to your head and wait for a nerve signal from the head to tell the muscles how to move, if there were some brain-like structures in your limbs that could respond to sensory stimuli and tell your muscles how to react really fast. Technically humans have a little bit of this ability but not very much, in the spine, like your spine can react to certain nerve signals from your legs and tell your legs how to react, but this is much more limited than in an octopus, this is part of how reflexes work, like when a doctor hits a certain location on your knee and your leg goes up without you telling it to do that consciously. But most of what humans do is controlled by the brain, whereas in an octopus, all 8 tentacles can pretty much operate on their own. They are connected the same way the right brain and left brain in a human are connected. Most people just have one consciousness instead of 2. But in some people, the right brain and left brain aren’t really connected well enough, and end up operating separately, with 2 separate consciousnesses.
    There are theories about this stuff, how if you cut a consciousness into parts it splits into multiple consciousnesses, and if you connect multiple consciousnesses up together well enough, they merge into a single consciousness. So if I hooked my brain up to yours with electrodes in the same part of the brain in millions of wire connections, theoretically, according to neuroscience, our consciousnesses would merge into one single shared consciousness for both of us that knows everything either of us knows, senses what either of us can sense, controls all our muscles in both bodies, but operates as a single mind, a single consciousness. This is from science, not philosophy. And if you completely disconnect the 2 hemispheres of the brain in a person or another mammal with a similar type of brain, you end up with two minds, two consciousnesses. So with an octopus, technically it has one mind, one consciousness, but it is not centralized, but distributed throughout its body, and can be split apart.
    And if you cut off a tentacle, well, the octopus just lost not just part of its body but part of its mind too, so it is very upset. And the tentacle’s mind is now separate from the rest of the octopus’s mind so it is upset too, and it is still sentient and can still think and feel pain and have emotions, unlike for instance if you chop off a human arm or leg, the arm or leg has no consciousness or mind, because in humans, almost all of that happens in the brain, with a tiny bit happening in the spine instead. Although maybe the spine has a little mind of its own that is separate from the larger mind in the brain. We don’t really know for sure. I just felt like talking about how octopus tentacles are sentient and can still think even if you chop them off, and scientists have proved this in experiments. Please don’t hurt any octopi in experiments. It might not be very ethical, now that we know even their tentacles are sentient.
    Personally I have witnessed something similar in lizards. There are lizards where if you catch them and are holding them by the tail, their tail detaches from the rest of the body and the lizard escapes, bleeding a bit from where its tail came off, and then it can grow a new replacement tail. And the tail that came off wriggles around for awhile on its own, not even connected to the main body. The tail seems to have a bit of its own nervous system that can, at least, tell it to wriggle around a lot if it is disconnected from the main body of the lizard. Is the tail sentient or conscious at all? Who knows? It might be. It can certainly move on its own after being detached.
    Also if you chop the head off a chicken, the chicken can run around for awhile without a head or a brain, still knowing how to run on its legs, which it typically does until it bleeds to death. Is a headless chicken sentient? Maybe. Some people think so. But a chicken or lizard doesn’t have quite as distributed a nervous system as an octopus, nor is it as centralized as a human or other mammal like a cat or dog. This thing where the brain does all the thinking is mostly just a mammal trait, not as true for other animals. Birds have tiny brains but that doesn’t make them dumb, crows are fairly smart, some of the thinking isn’t done in the brain.
    A distributed nervous system is sort of like massively parallel cloud computing where multiple regular computers networked together act like one huge supercomputer. It’s a similar phenomenon but not exactly the same because computers don’t seem to be sentient or conscious... not yet. Maybe in a few years they might be. If a simulation of a neural network is just as smart as a real brain, then according to functionalism, it has a mind too. AIs seem like they might get to that point in the next few years, at least a little bit. Like if you think a fruit fly is conscious and sentient, AIs will almost certainly reach that level, at least if you consider an emulated or simulated consciousness to be just as real as the real thing. Which you should.
    In software, you can play the same game directly on hardware, or in an emulator emulating that hardware, for instance playing Super Mario Brothers either on an actual original NES game console, or on a program that emulates an NES. In either case, whether on original Nintendo hardware or a Nintendo emulator, you are still playing the same game, Super Mario Brothers, and functionally it is exactly the same. So if you could emulate or simulate a brain, this would also be just as much a mind as the real thing, according to functionalism, because a mind, like Super Mario Brothers, isn’t defined by what it’s physically made of but instead by its function. Super Mario Brothers is a functional kind, just like a brain, you could play it on the latest smartphone and it would still be Super Mario Brothers. Maybe you could even train an octopus to play it, with the right interface and control system designed for an octopus to use. Different tentacles could control different buttons, up down left right A B select and start, 8 buttons for 8 tentacles, a perfect match. Each tentacle would either be pushing a button or not pushing it depending on the position the tentacle points in. Then we would just need an interface to communicate the state of the game to the octopus because maybe vision might not be its main sense. With enough funding, we could create a Nintendo for octopi, and use the Nintendo-playing octopus to solve all of philosophy. Or at least the octopus would have that experience, like in Robert Nozick’s experience machine. Or we could put it in a Chinese room. I guess the Nintendo would be like a Chinese room because it would raise the question, does this octopus actually know how to play Super Mario Brothers, just like asking, does the person in the Chinese room actually know Chinese? The answer is yes, obviously, because this octopus is really smart. All of its tentacles can think, after all... its entire body is like one giant brain.
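    To make the emulator point concrete, here is a minimal Python sketch of my own (the jump_height functions and their numbers are made up, not anything from a real NES): the same behavior is realized in two physically different ways, and from the outside the two realizations are indistinguishable, which is all a functional kind cares about.

# Toy illustration of a functional kind: one behavior, two realizations.

def jump_height_formula(frames_held: int) -> int:
    """One 'hardware': compute the jump height with arithmetic."""
    return min(frames_held, 8) * 3  # capped at 8 frames, 3 units per frame


# A different 'hardware': the same function realized as a lookup table.
_JUMP_TABLE = {frames: min(frames, 8) * 3 for frames in range(64)}

def jump_height_table(frames_held: int) -> int:
    """The 'emulator': same inputs, same outputs, different innards."""
    return _JUMP_TABLE[frames_held]


if __name__ == "__main__":
    # A player pressing the button cannot tell the two apart:
    # identical inputs always produce identical outputs.
    for frames in range(16):
        assert jump_height_formula(frames) == jump_height_table(frames)
    print("Both realizations behave identically on all tested inputs.")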

    • @severussin
      @severussin 1 year ago +3

      I went on the journey with you and enjoyed it. Much appreciated!

    • @L4wr3nc3810
      @L4wr3nc3810 1 year ago +2

      This was real fun to read, thanks!

    • @eliasrose3842
      @eliasrose3842 9 months ago

      great comment, thanks.

  • @FR-kb1fc
    @FR-kb1fc 10 months ago +1

    Another great lecture. A couple comments. First, gold is the most electrically conductive element that does not easily oxidize, so it functions well as a conductor in oxygen-containing environments. Second, there is a different sense of functionality that I've been trying to understand. Prof. Kaplan says that gold has a certain number of protons; but really, we can't know that, what we can know about gold is that it functions in a certain way. By this I mean, if we put gold ions in a machine called a mass spectrometer, the gold will behave in a certain way, and from this behavior, we infer that it has a certain number of protons. And it isn't just gold, it's everything in the material universe. We can only know how a thing functions. We can never know what a thing "is". That is the kind of functionalism that I'd like to understand better; however, based on this lecture, that seems quite different than the meaning of "functionalism" in philosophy.

    • @bonafide_ok
      @bonafide_ok 4 months ago

      In the extreme, all concepts are a functional kind

  • @robertochacon5338
    @robertochacon5338 1 year ago

    your channel is simply great! :) thanks for your hard work!

  • @markelmobuenaobra7047
    @markelmobuenaobra7047 2 years ago +1

    Thanks for this, Mr. Kaplan!

  • @williamgarner6779
    @williamgarner6779 1 year ago +6

    Only in modern times could someone discuss mousetraps and cats in this way and not, at least in passing, ask whether a cat is a mousetrap. That was their main purpose (as far as humans were concerned) for a thousand years.

  • @atmansoni6406
    @atmansoni6406 3 years ago +10

    Very well explained, thank you for the animated examples :)

  • @anuragc1565
    @anuragc1565 3 years ago +3

    Wow ... thanks for such a clear and succinct explanation.

  • @universe36
    @universe36 3 years ago +5

    Awesome! You are a great teacher!

  • @MrGking1303
    @MrGking1303 2 years ago +4

    Absolute legend, just saved my exam

  • @ad4id
    @ad4id 1 year ago

    Great video. Well done

  • @ignaciocorralesbriceno8783
    @ignaciocorralesbriceno8783 3 years ago +1

    Much appreciated, my king, keep it up, my shaggy dude

  • @LoveAndLustInc
    @LoveAndLustInc 1 year ago +4

    I dig it. The AI community would have a field day with this theory!

  • @calorion
    @calorion 1 year ago +3

    Somebody is really misunderstanding the identity theory. It's either Putnam or me. My understanding of Place's argument (in the previous lecture) is that *a given instance of pain* in a human mind is identical to brain process B. This is a way to explain how physicalism works, not intended to be some sort of overarching neuroscience claim about brain processes. Place isn't claiming that every instance of pain in every person (or being!) looks identical on a neuroscience level! He's saying that in a particular brain, brain process B gives rise to, and is identical to, the experience of pain.
    So…I don't see how there's anything for Putnam to object to here. Have I just completely misunderstood Place?

    • @BlazeOrangeDeer
      @BlazeOrangeDeer 7 months ago

      Yes, the possibility of another brain state that also contains pain seems inherent in Place's argument. The simplified version of Place's argument he presented naturally extends to this case, because if two people each experience pain they might be in brain states B1 and B2 (presumably different people are not ever in exactly the same brain state). So pain could not have been a single brain state but was always a collection of brain states that are alike in some way. Functionalism is then a clarification of what it is about this collection that justifies labeling them as painful.

  • @mateoromo5587
    @mateoromo5587 1 year ago

    Thank you so much for this video!!!

  • @udyret28
    @udyret28 2 years ago +2

    I can’t find this primer by Bradley that is in the readings. Can anyone help?

  • @CaedmonOS
    @CaedmonOS 1 year ago

    I love the way you start your videos so much. Also, apparently I'm a table.

  • @pepedestroyer5974
    @pepedestroyer5974 1 year ago +1

    You are a great teacher. Your videos are gold. I hope you get more subscribers.

  • @AlexCebu
    @AlexCebu 3 months ago

    Looks like Putnam didn't realise that pain itself is a multiple state, a group of states. We have toothache, we have stomach pain, etc. Pain itself, regardless of specifics, is an abstraction and a general idea.

  • @bthomson
    @bthomson 1 year ago

    Your expression when you are trying to think up what makes a cat is priceless! A cross between disgust and long forgotten memories! I do not think that cats are your thing!

  • @kristianvarga9845
    @kristianvarga9845 1 year ago

    really good explanation

  • @perplexedon9834
    @perplexedon9834 1 year ago +1

    Functionalism, or at least the octopus argument, seems kind of circular to me. We have no way of knowing whether there are different kinds of subjective ways it can be to "be like" a creature, because any organism necessarily only has access to one kind. It could be that the form of consciousness in Brain State O is fundamentally different from Brain State B, that both are ways in which a being can "be like" something, but that they are qualitatively, meaningfully different in some way. The idea that they are qualitatively the same IS the conclusion of functionalism. One of the premises alone entails the conclusion, therefore the argument is circular.
    This applies especially well when considering AI, because we may well have to conclude that there is no test by which we could identify whether an AI is having an experience comparable to Brain State B. We as individuals can only conclude that the human mind is capable of producing consciousness, and really only our own.
    Interestingly, if an AI were to be similarly conscious, it would have to conclude the same about us. I would actually like to believe that a bunch of gears and string, or some rocks rolling down a hill, or the superstructure of the universe, could have a conscious experience, but no physicalist theory bridges that gap. Functionalism best explains the easy problem of consciousness for sure, as we can call conscious any set of information-processing matter that functionally meets certain criteria, but you need something unfalsifiable like panpsychism to bridge the hard problem.

  • @thomassoliton1482
    @thomassoliton1482 1 year ago +1

    Functionalism - "If you can do the job, then you are the mental state". Does this make any sense if there is no "you" in that context? Could a machine do this? Would it be "conscious"? Which is the whole point of this proposition? Yes, but that would not necessarily guarantee that the "entity" with the "mental state" would be aware of that state, which is required for consciousness. An ant recognizes scents and executes specific behaviors as a result - but is it aware of what it is doing? Certainly not in the manner we consider to be awareness. The ant has little choice in terms of its response to the scent, while humans have many choices - what we call "freedom of choice". In fact, it cannot be proven that "we" really have a choice. What seem like multiple choices are actually many outputs (options) which we can envision and evaluate. Our brains are designed to weigh those options rapidly and make a choice. There is no "me" making a choice - just me watching the choice being made. Perhaps I choose door "B" but then I smell something bad near the door that "persuades" me to choose another door instead. Free will is just an illusion hiding the fact that most of our choices are subconscious. All we can really do is observe our choices and learn from them. That is what consciousness fundamentally is - a constant process of observation and evaluation underlying adaptation. Awareness is the process of mental evolution.

  • @os2171
    @os2171 2 years ago +1

    Great! Thanks!

  • @nathanialblower9216
    @nathanialblower9216 3 years ago +6

    If Super Spartans are a problem for behaviorism, aren’t they a problem for functionalism?

  • @checosa777
    @checosa777 1 year ago +3

    can we have the pdf readings? i wanna read more about it

  • @jamesforgason5341
    @jamesforgason5341 1 year ago

    you are saving my life for my test tomorrow

  • @ivanilyic6492
    @ivanilyic6492 1 year ago

    Thank you!

  • @alancosgrove4728
    @alancosgrove4728 18 days ago

    I enjoy your lectures very much for getting to the salient issues and recapping prior to making the main points. You mention various pages in the book or notes but on your web site I can only find a few of them in the course materials and no mention of a reading list or a course book. Can you tell me if you prepared extensive notes for all of the course or is there a specific book(s) to accompany this excellent philosophy of mind series please? Thank you.

  • @FortYeah
    @FortYeah 1 year ago

    THANK YOU SIR!!

  • @4_P3R50N
    @4_P3R50N 9 months ago

    Are causes and effects defined as physical, or would a change purely in one's conscious experience count as such? (If something like that even exists according to functionalism)

  • @Menschenthier
    @Menschenthier 7 months ago +1

    Since I wasn't in the course, which text by Bradley is it?

  • @johnnygate3399
    @johnnygate3399 1 year ago +2

    Looks like a form of behaviourism to me. Jealousy needs behaviours such as whimpering. What about emotional spartans who feel jealous but do not whimper or betray their feelings?

  • @tAntrik18
    @tAntrik18 3 years ago +3

    One can say that the mental aspects of my life, consciousness, pain, etc, are some function of neurons in my brain. And this is similar to the position of identity theory.
    And if that is true, then the argument for multiple realizability of mental states is not clear. My conscious experiences at any point are just an outcome of the particular way my brain fires neurons. And I happen to call some of these firings pain.
    In the same way, I am labelling some of the brain states of an octopus as pain. But the mental aspect of the octopus's life is still given by the particular way its brain works.
    I am not sure though if identity theory can be cast this way.
    Edit: I have already watched the video on identity theory. Still cannot solve the problem. Great videos though.

    • @SitcomedyCD
      @SitcomedyCD 3 years ago

      I think that the motivating objection that brought up functionalism against identity theory isn't to deny that consciousness and sense experiences are a function of neurons and your brain, but that identity theory does *not* allow for an understanding of mental states as multiply realizable. If the identity theorist says that pain is identical to C-fiber firing, then beings without C-fibers that are nonetheless in pain aren't really being recognized as in pain under the theory. This is regardless of whether the thing you're talking about is actually having the experience of pain, and surely the important part of pain is the sense experience and not how it came about! The functionalist theory doesn't have this problem because it allows that consciousness and sensations can be multiply realizable. It doesn't have to be C-fibers; it can be alien gelatin heating up, or light hitting a receptacle, or whatever. It doesn't matter what type of being you have: as long as it has something "like" a brain that fulfills the role of being the thing that the experience arises from, then that being has a mind with sensations and experiences that correspond to its own unique mechanisms.
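      A rough way to picture "multiply realizable" in code, as a sketch of my own (the PainRole protocol and both classes are hypothetical, not from the video or the readings): the role is specified by what it does, and completely different "substrates" can fill it.

from typing import Protocol


class PainRole(Protocol):
    """The functional role: registers damage, produces avoidance."""
    def register_damage(self, amount: int) -> None: ...
    def wants_to_withdraw(self) -> bool: ...


class CFiberBrain:
    """One realizer: something like a human brain with C-fibers."""
    def __init__(self) -> None:
        self.c_fiber_activity = 0

    def register_damage(self, amount: int) -> None:
        self.c_fiber_activity += amount

    def wants_to_withdraw(self) -> bool:
        return self.c_fiber_activity > 0


class AlienGelatin:
    """A very different realizer: gelatin that heats up when damaged."""
    def __init__(self) -> None:
        self.temperature = 20.0

    def register_damage(self, amount: int) -> None:
        self.temperature += 0.5 * amount

    def wants_to_withdraw(self) -> bool:
        return self.temperature > 20.0


def is_in_pain(creature: PainRole) -> bool:
    # On this toy picture, being in pain just is filling the role,
    # regardless of what the creature is made of.
    return creature.wants_to_withdraw()


for being in (CFiberBrain(), AlienGelatin()):
    being.register_damage(3)
    print(type(being).__name__, "in pain:", is_in_pain(being))  # both True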

    • @Brian.001
      @Brian.001 2 years ago +3

      @@SitcomedyCD Yes, but in the middle of that you assumed that 'pain' is multiply realizable, which has not been established. All we know is that an octopus and a person can get into a similar functional state - not that the experience each of them has is the same. They could be two unpleasant but quite distinct experience types. Saying it is 'just obvious' that an octopus can experience pain, judging by the functional states it exhibits, is just begging the question. I have never seen a demonstration that pain can be experienced identically across species.
      If we were to accept functionalism, we would be holding that token brain states of distinct types can each realise a single type of experience. Token-type identity is not even comprehensible, given that pain is an experience, not a function. If an experience is identical with a brain state, they must share their type.

    • @SitcomedyCD
      @SitcomedyCD 2 years ago

      @@Brian.001 nice, thanks

  • @theedj007
    @theedj007 1 year ago +1

    I cannot for the life of me locate the primer/reading by Adam Bradley; my internet kung fu has utterly failed at locating this resource. Is it a chapter in a book? A journal article? Any help in pointing me in the correct direction would be appreciated!

    • @lizazhbankova4070
      @lizazhbankova4070 4 months ago

      Any luck finding the text? My google kung fu doesn't work with it either.

  • @msmd3295
    @msmd3295 8 months ago +1

    Even if mental states are multiply realizable that’s not really the core of brain-mind identity. The central element of importance is the fact that without the brain, mind cannot be realized. One would have to demonstrate that mind can exist without brain for there to be any duality. It is well known scientifically that mind ceases when the brain ceases. Thus they’d have to be elements of the same thing… a physical brain.

  • @jonstewart464
    @jonstewart464 2 years ago +2

    Jealousy is a great example to expose why functionalism is wrong. We all know that jealousy has *qualia* that we experience while our brains carry out the function of processing the inputs and churning out the outputs. When we use the word "jealousy" we're talking about the qualia - what it *feels like* to be jealous - and not the function, nor (as in identity theory) the physical goings-on in the substrate, i.e. certain neurons firing in a certain pattern.
    Let's say I have a dream, where there's some character that I don't know in my normal life, and they're kissing a man on a beach. I don't see who they're kissing, but at that moment I wake up, and I'm filled with this overwhelming feeling of jealousy. There's no input, just images in my dream that don't make much sense. There's no output, I don't exhibit any whimpering, I don't tell anyone "I'm fine". I just wake up with this overwhelming feeling of jealousy conjured by my dreaming mind. There's no inputs, no function, no outputs - but there's definitely jealousy.
    So functionalism is false, and so is any other theory that does not account for qualia.

    • @jonstewart464
      @jonstewart464 2 years ago +1

      @@arletottens6349 I don't think it's enough for waking with jealousy to *sometimes* have a function. If sometimes there's no function at all, then it cannot be true that (waking with) jealousy *is* a functional state, it must be that jealousy is something else (a feeling) *usually/sometimes associated with* some function or other. If we take away the function are we left with nothing? No, we still wake up with all the qualia of jealousy.

    • @jonstewart464
      @jonstewart464 2 years ago

      @@arletottens6349 That's exactly what I would be saying if I believed in a theory of cars that defined them purely by their ability to move and had no knowledge or experience of engines. But I don't; I believe in engines, and I know what they do.

    • @jonstewart464
      @jonstewart464 2 years ago

      My reading is that if no inputs and no outputs can be identified, then no function can be identified. I don't see how an internal state of the mind/brain can be regarded as an input (e.g. a dream). And I don't think it makes sense to say that an experience *might sometimes* have an output, each experience is unique. The language of inputs and outputs isn't sufficient to capture what we care about.

    • @jonstewart464
      @jonstewart464 2 years ago

      @@arletottens6349 That's really interesting. But doesn't functionalism say that it's only when you do probe the internal state that it is conjured into existence? If functionalism allows for all kinds of internal states to exist when not being probed by inputs, then how is it a theory of mind rather than an experimental paradigm for finding out about minds?

    • @okamisensei7270
      @okamisensei7270 2 years ago

      I'm not a functionalist either, but just to probe your argument: I think you're assuming an atomic entity in the mind. The agent experiencing the images might be different from what is creating and presenting them.
      Dreams tend to blur the line between the one experiencing and the experience, but you could still say that the image of the people kissing is an input that is separate from the agent that receives this input and then experiences jealousy.
      Functionalism could still be a valid theory without an atomic agent

  • @chrisw4562
    @chrisw4562 9 months ago

    Thanks for another great lecture. I don't buy the multiple realizability argument. In the example, that argument would require human pain to be identical to octopus pain. Really? It feels to me like a lot of the philosophers are making things up to prove their point, describing it in a language that nobody can understand, and then it takes generations to debunk them. Brilliant.

  • @ioannaantonaki4883
    @ioannaantonaki4883 1 year ago +1

    can you provide the readings in pdf?

  • @4_P3R50N
    @4_P3R50N 9 months ago

    How does conscious experience fit in with this? If pain is solely defined by input & output, then the actual experience doesn’t matter if we don’t define the experience as a required output. Does Putnam express an opinion on this?

  • @HeDeRust
    @HeDeRust 3 years ago

    What about eliminative materialism?

  • @destrakkejakke
    @destrakkejakke 2 years ago +1

    Can you tell me how you made the video? Is it like a glass screen and then you flip it so the writing isn't mirrored?

    • @profjeffreykaplan
      @profjeffreykaplan 2 years ago +1

      I get asked this a lot. Here is a video I made explaining it: ruclips.net/video/6_d44bla_GA/видео.html

  • @matthewgingell3792
    @matthewgingell3792 3 years ago +3

    Under the functionalist view, if a super stoic receives jealousy-inducing inputs but those inputs have no impact on their behavior/outputs, would we still say they experience a mental state of jealousy?

    • @mithrae4525
      @mithrae4525 1 year ago +1

      Putnam: "Behaviourism is wrong because Super Spartans don't exhibit behaviours associated with pain."
      Also Putnam: "The outputs/functions associated with pain are where the real deal is at. Can't think of a counterexample to that, no siree!"
      I suppose one could argue that the function of pain, jealousy etc. are informative rather than behavioural; a super spartan who feels pain can then decide how to respond or modify their actions, such as removing their hand from the fire that's causing tissue damage (without showing any typical pain responses like crying out etc.).

    • @christiangreff5764
      @christiangreff5764 1 year ago

      @@mithrae4525 I have to agree with you that functionalism does, at least on a basic level, seem pretty much like behaviorism with extra steps.
      The Super-Spartan counterexample to behaviorism is flawed in the first place. The Super Spartans, as described in the text itself, have the disposition to act in the typical 'pain-behavior-like manner'; they are just suppressing it. That is, to convince us readers that the Super Spartans are indeed feeling pain, they had to be described in such a way as to have the very disposition they are supposed not to have, in order to prove that mental states do not equal dispositions. We can, of course, imagine an overhauled example where mad neuroscience has led to totally changed-up pain responses in the subject (victim) of Dr. Evil. But then again, if in this new scenario there are truly none of the typical pain behaviors shown (like, at the very least, avoidance of future repetition of pain-causing stimuli), how would we ever identify what we observe as pain behavior? Fundamentally, is there a point at which the underlying perception, and therefore the mental state itself, has to be changed to cause a certain behavior? I mean, if the unfortunate subject, after Dr. Evil's operations, willingly seeks out what are to us causes of pain and carefully avoids what we would expect to cause fun or pleasure, then does it still make sense to classify what they are feeling from the former as pain and what they are feeling from the latter as fun?

    • @mithrae4525
      @mithrae4525 1 year ago

      @@christiangreff5764 It seems you're saying that the 'disposition to exhibit pain-related behaviours' is simply a subjective mental state that recipients of pain have, including Super Spartans. But then how is that behaviourism any more? How is it even a physicalist theory any more? Even Cartesian dualists would agree that pain is a subjective mental state which disposes people to act in certain ways. Surely the whole point of physicalism is to associate mental phenomena with something objective; in this case with their consequent behaviour, not merely with a mental disposition.

    • @christiangreff5764
      @christiangreff5764 1 year ago

      @@mithrae4525 Ah, sorry, it seems I did not manage to clearly communicate the point I was trying to make. Indeed, as you put it, mental phenomena are inferred from their consequent behavior. So much so that, unless an entity were at least consciously suppressing the associated reactions, we would probably see any claim that it is experiencing a specific mental phenomenon (such as pain) as highly dubious.

  • @johnokazaki7967
    @johnokazaki7967 1 year ago

    9:48 If anything that "fulfills the function" becomes that thing, then we can say that a cow that gives a driving course is an instructor, since it plays that role. However, for an instructor to "instruct" you first need a set of characteristics, for example being able to speak the target's language. In this scenario, even though the cow is fulfilling the role and purpose of teaching, it can't truly teach, since it lacks a vital characteristic, which is talking like humans do. I think that something fulfilling a function doesn't really mean it is the thing described; instead, it needs to have characteristics that *allow* it to fulfill the function, thereby making it the thing we say it is. Thus, what makes something what it is is not its ability to fulfill a purpose but its individual characteristics, which create the emergent property that allows it to fulfill a function.

  • @jonthomasspears2255
    @jonthomasspears2255 2 years ago

    if our table is broken (oh no!) is it still a table?

  • @genec9560
    @genec9560 2 months ago

    I am enjoying your channel immensely, Jeffrey. This video got me thinking... What if a cat had a bad accident and needed 51% modern robotic mechanical parts in surgery to regain full function? Is it still a cat? What about 75%, or 99%?

  • @farhadmodaresi4182
    @farhadmodaresi4182 1 year ago +1

    20:20 had me cracking lol

  • @GregoryWonderwheel
    @GregoryWonderwheel 1 year ago

    Descartes got his dualism from Aristotle who divided reality into the Physical and the Metaphysical. Science then said the Metaphysical was merely superstition. Descartes was attempting to rehabilitate Metaphysics by talking about mind, but he didn't have a psychological perspective to make sense of metaphysics so he used his physical perspective to describe mind. The idea that material things can't be moved by immaterial mind is a category error based on false premises for "material". It's a mistake to identify mind with brain matter, and it's equally mistaken to identify mind with mental states; though each mistake has a relatively good reason for making the mistake based on the assumed categories that result in the category errors.

  • @charlesb2222
    @charlesb2222 10 months ago

    "If you ain't made of the right stuff, you ain't gold" -MC Kaplan

  • @popcarvalho
    @popcarvalho 2 years ago

    If I can do the job, I am the mental state. But how could this statement solve the mind-body problem? I'm not sure I understood properly...

  • @jessewilley531
    @jessewilley531 9 months ago

    The sad thing is... I put a cardboard box in my college apartment between my writing desk and living room. I always intended to get a real table to go there but it wound up staying there all four years I was there.

  • @adeelashraf7366
    @adeelashraf7366 2 years ago +1

    If mental states are multiply realizable, then how is it that a cat or a man is not?

  • @terekrutherford8879
    @terekrutherford8879 10 months ago

    Great content and series. But I'm really having trouble understanding how mental states are multiply realizable. If human pain is brain state B and octopus can only experience octopus pain in brain state O, that seems reasonable and still explains octopus behavior. I don't understand the assertion that different beings experience pain in the same way. Each brain should experience things differently due to its different biology and the sensory organs of the body providing information to the brain, and each mental state should be unique due to these variations.

    • @REDPUMPERNICKEL
      @REDPUMPERNICKEL 10 months ago

      Seems to me your understanding is just fine.
      The only entity which you can know with absolute certainty to be conscious
      is the entity that is your self and even then,
      only while your self is conscious.
      I am certain that
      you are conscious while you are reading this sentence but
      I cannot be *absolutely certain* like I am about my self as I type it.
      That I may be typing in a dream does not change the fact that
      I am conscious of the doing.
      The differences between awake conscious and dream conscious
      seem to be,
      one's understanding of the possible while dreaming
      may not be fully participating in the process
      hence my dream belief that I have levitated...
      The other night, very oddly,
      I dreamed that I was completely another person.
      Subsequently,
      my appreciation for the power of imagination is much greater.
      Cheers!

  • @northernbrother1258
    @northernbrother1258 1 year ago +1

    Another problem with the dualism theory of mind is that if the brain is damaged the "mind" also suffers.

    • @sirreginaldfishingtonxvii6149
      @sirreginaldfishingtonxvii6149 1 year ago

      The most common counterargument I have heard about this is likening the brain to a computer setup. If the display is damaged what is shown on screen can be abnormal, yet the graphics card, processor, and motherboard are all intact and sending the same signals they always were. The point being that there can be a sort of mental "chain of command" where some things can't go through if one part is damaged.
      This does of course bring up further questions, like: if this is the case, then do our brains also limit our "minds" purely by existing, binding the mind to a very limited meat computer? Are brains a limiting interface?
      There's also the argument that the mind, as the "software" or OS the brain is running, can still be messed up if you damage the physical parts of the computer it is stored within.

  • @AlexCebu
    @AlexCebu 3 months ago

    Why do people call jealousy A MENTAL STATE? When someone is jealous (26:22) they are tense and aggressive, etc., and all of those are BODY STATES.

  • @MugenTJ
    @MugenTJ 1 year ago

    Well, if anything, this functional theory clarifies identity theory. It's not a refutation. As far as the human mind goes, it is still just the brain. A computer can exhibit a mind made of different materials, just like you can have a biological cat or a robotic cat. These two theories are complementary, not opposing. In fact, the material generating the function matters a whole lot. I got an itch every time he said it doesn't matter what the thing is made out of! 😅😅

  • @NotRelatable2u
    @NotRelatable2u 1 year ago

    @Jeffrey Kaplin Is Garfield not a cat?

  • @robbiekatanga
    @robbiekatanga 2 years ago

    But how does functionalism address the mind-body problem?

  • @dwinsemius
    @dwinsemius 1 year ago

    Great stuff, but I find myself trying to figure out how Kaplan learned to write backwards in thin air. Does that mean I'm not really a philosopher or scientist, but really just an engineer? How can I put trust in the ideas of a guy who doesn't know how a mousetrap works?

  • @none8680
    @none8680 1 year ago

    How are we concluding that pain in humans is the same as pain in octopuses? Or jealousy in one person is the same as jealousy in another?

  • @robertbeniston
    @robertbeniston 1 year ago

    Why would there be a problem with the mental or spirit connecting to the physical? There would be interaction between them. Just because the mechanics are not known would be beside the point.
    The function of Gold is to be valuable.

    • @rizdekd3912
      @rizdekd3912 1 year ago

      " Just because the mechanics are not known would be beside the point." Then is there a point to dualism? We could just say that the physical somehow, in ways not yet understood, produces the mind just like we say matter/energy 'produces' gravity by somehow bending time/space.

  • @chrisw4562
    @chrisw4562 9 months ago

    Thanks for the lecture. Functionalism seems easy to debunk. A computer program can perform many of the same functions as the mind. Does that mean the computer has a mind? I don't think many would agree. However, will a computer eventually be able to have a mind? I think the answer is yes.

    • @farafonoff
      @farafonoff 9 months ago +1

      'Debunking' functionalism involves unmeasurable stuff like 'mind', 'understanding', 'consciousness'. If we try to define and measure them, then we can build a computer to have these things. Different words, same issues as with 'god', 'angels', and 'soul'.

    • @BlazeOrangeDeer
      @BlazeOrangeDeer 7 months ago +1

      Obviously if a machine can do some but not all of the things a mind can, a functionalist would not call it a mind. It's missing the complete functionality. You might still be able to consider it a partially functional or defective mind

  • @glenrotchin5523
    @glenrotchin5523 1 year ago

    How do you write backwards so well?

  • @landongonzales9076
    @landongonzales9076 1 year ago

    11:04 this aged like fine wine

  • @MrDanDant
    @MrDanDant 4 months ago

    It occurs to me, if "brain" is a functional kind, functionalism and identity theory are identical. Am I wrong?

  • @originalandfunnyname8076
    @originalandfunnyname8076 1 year ago

    I feel like functionalism is just an extension of identity theory - instead of identifying mental states with concrete brain activity and neurons in humans, we can say that there are multiple possible ways to construct a mental state, still using only a physical brain, but not necessarily a human brain. Just like a table can be made of wood, metal or any solid matter, mental states can also be made of brain process B OR brain process O. But they can only be made of brain processes, like a table couldn't be made of water, for example. So I'm not fully convinced by this counterexample to identity theory.

  • @Menschenthier
    @Menschenthier 7 months ago

    The subtitles mean: "it might be hard to make a vending machine out of Plato" 😄 Maybe you could make Plato out of play dough?

  • @pierrelabrecque8979
    @pierrelabrecque8979 11 months ago

    I read enough comments to understand that any further venerating will be redundant. Perhaps because your content is so captivating, it took a couple of videos to realize you write like Leonardo da Vinci. You likely talk like him too.

  • @user-bf2sf7xq2t
    @user-bf2sf7xq2t 5 months ago

    With this explanation I'm passing exams tomorrow, lol😁

  • @xenoblad
    @xenoblad 1 year ago

    20:21
    This may seem childish, but I want someone to make a gif of Prof. Kaplan just making an octopus sound.

  • @realbland
    @realbland 1 year ago +3

    But obviously an octopus' brain experiences pain differently from a human's, so it's still "pain" but it's a different version of pain than the pain we experience, because they have brains that work differently. It's the same problem as the argument from what it's like to be a bat; we just happen to both have a sense for when the body is being harmed in some way, which we call pain.

    • @xin9458
      @xin9458 1 year ago +1

      This is a good point! Extending this, it can be argued that no two humans experience pain the exact same way either - we might all exhibit neuronal patterns that roughly resemble a textbook "Brain State B," but it's never going to be completely identical. Every person's pain is different because every mind is unique (just as every table is unique? His might be an old packing case while mine is made of wood), but the fact that pain is a mental process associated with a specific physical state is conserved...

    • @realbland
      @realbland 1 year ago

      @@xin9458 Exactly! Consciousness is no more a physical kind than a cat is, but that's not because consciousness manifests identically across all life forms with the capacity for it; rather, in the same way that there can be many different kinds of cat, and many forms that a cat can take, there are many "ways" that consciousness appears.

    • @xin9458
      @xin9458 1 year ago

      @@realbland That makes sense! Also very in line with the philosophy of biology in general. Within the frameworks presented in this video series though, would this position be something in between the Mind-Brain Identity Paradigm and Functionalism? As in, mental states definitely correspond to brain states, but there are multiple brain states that functionally create consciousness...?

    • @realbland
      @realbland 1 year ago +2

      @@xin9458 I would say it's closer to the identity theory, just while recognizing the fact that biology and chemistry in reality are somewhat complicated, and generalizations necessarily can't reflect that

  • @bigol7169
    @bigol7169 1 year ago

    17:48

  • @aidenheffernan7556
    @aidenheffernan7556 1 year ago

    The differences between the kinds seem arbitrary... like, gold has multiple isotopes, so are there multiple ways to be gold?

  • @CMVMic
    @CMVMic 11 months ago

    Isn't this making a category error? Functionalism has to do with the mind of a thing. If we define humans and cats as substances, then functionalism doesn't apply to these definitions. Functions apply to a substance's identity, what it does that grants it subjectivity. These are just semantic distinctions, but a cat can be defined according to its functions and so can a human being. In this sense, a philosophical zombie and a human can be identical. It really just depends on how we define these things. A robot that behaves and looks exactly like a cat can be defined as a cat. I think the bigger issue is with how one defines a thing as identical. To be identical, a thing must not have any numerical or spatial distinctions, i.e. A=A, but then to claim something is identical in degrees or has identical parts is to imply that the two things are not identical in every aspect. Also, a table, whether it is made out of wood or steel, is still the same fundamental substance, arranged in specific ways to fulfil certain functions. I think we can claim mental states can be physical events.

  • @tozfttoz
    @tozfttoz 1 year ago

    20:20 best part of the video🤣

  • @DevonBagley
    @DevonBagley 1 year ago +1

    The functionalism argument doesn't really address the claim of identity theory. There are many types of computers that all operate on completely different physical principles but still produce the same output. It may be wrong to say that a computer specifically produces result A by doing specific process B, but it is not wrong to say that all computers produce result A by doing physical processing. The fact that the process is different for different computers doesn't negate the fact that result A is always given for input B in spite of the physical differences in the process that takes place.

  • @Fran-ik6ob
    @Fran-ik6ob 11 months ago

    A cat is a mouse trap😀, great video

  • @christopher7539
    @christopher7539 1 year ago

    An octopus cannot feel pain -- it can only feel its analogue 'ulugurugluruluru'

  • @xyzoopsie7804
    @xyzoopsie7804 3 years ago +2

    I heard "vending machine out of Plato"😂

    • @denizsarkaya5410
      @denizsarkaya5410 3 years ago +2

      I have the opposite problem: whenever someone says "Plato" I hear "Playdough" ahaha

  • @LISA.WANG.
    @LISA.WANG. 1 year ago

    5:10
    12:17

  • @waynr
    @waynr 10 months ago

    I'm pretty sure I can do the job of a table! The question is, will anyone hire me? 🤔

  • @coffeeisgood102
    @coffeeisgood102 1 year ago +1

    My cat is not grouchy, but its job is to annoy.

  • @yungzed
    @yungzed 4 months ago

    how is a robot cat not a cat by definition of functional kind??

  • @rutikajadhav9642
    @rutikajadhav9642 2 years ago

    I was 6 minutes into the vid thinking this was a sociological theory :')

  • @MrGeometres
    @MrGeometres 1 year ago

    The brain is the hardware, the mind is the software.

  • @parheliaa
    @parheliaa 1 year ago

    A good old "duck principle" (duck typing) from the software development world.
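    For anyone who hasn't met the term: "duck typing" ("if it walks like a duck and quacks like a duck, it's a duck") is the programming version of the same idea. A toy Python sketch of my own, echoing the mousetrap example from the video (the class and function names are made up):

class SpringTrap:
    def catch_mouse(self) -> str:
        return "snap!"


class Cat:
    def catch_mouse(self) -> str:
        return "pounce!"


def use_as_mousetrap(thing) -> str:
    # Duck typing: no shared base class is required; anything that
    # can do the job counts as a mousetrap here.
    return thing.catch_mouse()


print(use_as_mousetrap(SpringTrap()))  # snap!
print(use_as_mousetrap(Cat()))         # pounce!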

  • @rickwyant
    @rickwyant 1 year ago

    Mind is a process? Generated by the brain?

  • @DatDack
    @DatDack 3 years ago

    I don't quite understand how multiple realizability disproves the identity theory.
    Identity needn't be necessarily exclusive. Many things can all have property A, we'll call it catness. For example a calico and a black cat are both cats. Saying that a calico is a cat isnt disproven by a black cat also being a cat.
    So if Place asserts the mind-brain identity theory and says pain is the result of human brains in humans, that can be true while also being the result of other things in different cases (the octopus brain, or an AI, etc). The octopus brain causing pain for octopuses doesn't make the human brain not cause pain in humans!
    Place is making an empirical claim, and the fact that mental states are created by other things in other organisms doesn't make them NOT caused by the brain in humans.
    Can someone explain where I'm misunderstanding? Thank you!

    • @DatDack
      @DatDack 2 years ago

      HELP PLS

    • @filipkaraivanov8158
      @filipkaraivanov8158 2 years ago

      In the identity theory mental states just are brain states. So, the mental state of pain just is nociceptive neurons firing. The claim is not just that they correlate but that they are identical. Hence, pain is necessarily nociceptive neurons firing. Here the multiple realizability objection applies because a thing can have the mental state of pain without nociceptive neurons firing and hence the two cannot be identical.

    • @Jensen8918
      @Jensen8918 2 years ago

      @@filipkaraivanov8158 Sounds like both the octopus and the human brain have analogous ways of experiencing pain, even if their architecture is different. Their architect remains the same: biology through natural selection. Seems so simple to me, but I may be missing something. I think people are taking Place too literally, or if he is being that literal, he is stupid.
      I call this theory Strong Illusionist Identity Functionalism.

    • @MsJavaWolf
      @MsJavaWolf 1 year ago

      There are always several interpretations of those theories, even by professional, academic philosophers.
      I don't think it disproves the type of identity theory that I'm familiar with. I just see the concept of a brain in a broader sense, it's just a physical information processing object. Octopi still have a sort of nervous system that might be able to produce mental states, and since their nervous system differs from ours, they might also be in slightly different mental states, I don't see any contradiction.
      I mean, if you had such a strict definition of what a brain is, then why even assume that cats have mental states? Their brains are already significantly different from ours, it seems arbitrary to include all sorts of animals with different brain structures but exclude octopi.