Yudkowsky vs. Le Guin (Or: Why Total Utilitarianism Doesn't Work) | Chaianalysis

  • Published: 15 Sep 2024

Comments • 28

  • @ChaiaEran  10 months ago

    I've seen a couple of comments now that bring up "scope insensitivity" as a reason that Yudkowsky might be correct, so I want to address that real quick. As I said in the video, Claim 3 of Temkin's argument is that a mild discomfort for a whole lot of people would be preferable to excruciating torture for one person, _no matter how many people are affected._ Falling back to scope insensitivity makes the assumption, as Yudkowsky does in his response, that Dust Speck advocates are appealing to non-linear aggregation, such that the total pain of a massive number of people experiencing Dust Specks is less than the pain of one person experiencing Torture. However, that's _not_ what I'm saying.
    Put simply, I reject the idea that the sum total of pleasures and pains aggregated across the entire testing population is all that matters when evaluating scenarios. You also have to take into account the density, distribution, and modal average of pleasures and pains. The Dust Specks have a very low modal pain to begin with, and are distributed across a massively large population such that the density of pain is very low, and this holds true _no matter how many people are affected._ Adding more people only expands the distribution, and it doesn't impact the density or mode at all, whereas in Torture, the _total_ pain may be less, but the distribution is far lower, and the density is much, _much_ higher, concentrated on a single person - and the modal pain is also much higher, as a single Dust Speck is obviously far, far less painful than Torture.
    In this way, we can still say that horrific events that impact a lot of people are worse than ones that impact just a few people, as the modal pain is still very high in, say, an earthquake, while also having a wide distribution and low density, but that one person being tortured is worse than any number of people getting dust specks in their eye.
(The density of pain also helps explain why getting a million dust specks in your eye all at once is worse than getting one dust speck in your eye, even though both have low modal pain - the distribution is low, and the density is high.)
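One way to make the density/distribution/mode view above concrete is a small Python sketch. The 0-100 pain scale, the population size, and the per-speck pain value are made-up illustrative numbers, not anything from the video or the comment:

```python
from statistics import mode

def summarize(pains):
    """Summarize a scenario as (total, distribution, density, modal pain)."""
    total = sum(pains)
    distribution = len(pains)       # how many people are affected
    density = total / distribution  # pain per affected person
    return total, distribution, density, mode(pains)

# Torture: all the pain concentrated on one person (pain scale 0-100).
torture = [100]
# Dust Specks: negligible pain spread over a huge population.
specks = [0.001] * 10**6

t_total, _, t_density, t_mode = summarize(torture)
s_total, _, s_density, s_mode = summarize(specks)

# A totalist compares only the sums, so the Specks can come out "worse"...
assert s_total > t_total
# ...but density and modal pain stay tiny for the Specks no matter how many
# people are added, while both are maximal for Torture.
assert t_density > s_density and t_mode > s_mode
```

Adding more people to `specks` grows only `distribution`; `density` and the mode are unchanged, which is the comment's point.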

  • @ScbSnck  10 months ago  +6

    Bruh

  • @zigotina  10 months ago  +5

    what the fuck is this 😂

  • @pissditching  10 months ago

    found you from tumblr and this video is so good. i need to hear you talk about philosophy forever.

  • @Enis000  10 months ago

    To simplify for myself: total utilitarianism is a form of reductionism and is therefore wrong, because it doesn't take the phenomenon of emergence into account.
    Suffering is emergent: 1+1=3
    Thought-provoking video, thanks.🥰

  • @JohnQuatro  10 months ago  +2

    Chaia, you have this skill of taking things I’ve never thought about, and making me hyper focus on them for a half hour while you give your essay. Absolutely loved the content, and learned something(s). Great video!❤🎉

  • @genevievebeaupre4029  10 months ago

    You: feeling a significant discomfort from holding a balance for a huge amount of time.
    Us: 2^10 people feeling a small pleasure from seeing you holding it.
    Excellent moral behavior, I shall subscribe.

  • @Pallerim  10 months ago  +4

    Le Guin, philosophy and fanfic nerdery in the same video? I think the youtube algorithm finally did something good. This was thoroughly entertaining, keep it up!

  • @domenicrosati  10 months ago

    If you haven't read it, check out _Beggars in Spain_ - it explores Le Guin-like versus Yudkowsky-ish attempts at making the future.

  • @thecactus7950  10 months ago

    I like the video. The Ones Who Walk Away from Omelas is probably my favorite short story of all time. I remember crying when I first read it. I was very young when I read it, but I still sometimes get teary when thinking about it. There were things I did not like about the video though. I hope I don't come off as antagonistic, because I overall liked the video, and think we need more like it.
    Firstly, I think you presented The Ones Who Walk Away from Omelas as more univocal than it is. Ultimately I draw the same conclusion from the story you do, but I think the story as written sets up a dilemma (among other things; that story has a lot of depth, I think). For me the tension is between Omelas feeling like a utopian city that truly is amazing, so amazing we can't even really comprehend it, and the thing about the child, which is so, so terrible, and gives a feeling that something is deeply, deeply wrong. I think the ending is ambiguous also. Those who walk away are not treated as heroes. They're written as ordinary people who have some deep conviction and a certain sense of nobility, but who also might be confused. I certainly don't think it's supposed to be a one-sided knockdown argument against totalist utilitarianism, which is how I feel you presented it.
    Secondly, I think you're poisoning the well a little bit too much, and appear too biased and ideologically motivated - putting air quotes around every Eliezer/rationalism/effective altruism related thing, assuming that these people are just obviously wrong about everything they say. I tend to agree with MIRI-type people about AI, and now I feel this video is very hostile towards me, even though I agree with a lot of the points about totalist utilitarianism.
    Thirdly, I feel like you don't give a fair representation of the arguments Eliezer would've given in defence of his argument about dust specks. I think his most honest response to people believing torture to be obviously the worse option would be that people are incapable of reasoning about big numbers. I think that is what he refers to as "scope insensitivity". Basically: everyone would agree that 100 people being murdered is worse than 1 person being murdered. But if you make the numbers big, people stop being able to reason in this way. 10^12 people being tortured will, to most people, seem roughly as bad as 10^14 people being tortured. I don't feel like you addressed this in the video, and that you just assumed the conclusion Eliezer drew from the dust speck thought experiment to be so blatantly wrong as to not be worth engaging with. The "Eliezer's Capitalist" example you gave is basically the utility monster thought experiment, and it is contentious. Some people will say it's obviously awful; others will say it is good, largely for reasons like scope insensitivity and transitivity. So I don't think it's good to bring it up and just conclude that one side is obviously wrong.
    I liked the style of the video, and the readouts of quotes. Also, I appreciate you holding up the weight for so long - I imagine that would be tiring on the shoulder. Sorry for the compliment sandwich I realize I just made; I typed this out in one go.

  • @Norfma  10 months ago

    No idea why the YouTube algorithm decided to push your video to me, but it was enjoyable. If you'd like some feedback: I think you should tone down the disdain and avoid so clearly taking a side if you're going to hold a balance for the entire video - or maybe you should just grab a gavel and act as a judge.
    For the content of what has been said itself:
    -I personally don't see the value of the cutoff point (16:10) and comparing pain levels to torture or hindrance... in a fine spectrum, it doesn't matter what word you choose to represent the level of pain as long as they are standing side by side on a large scale...
    -Also, your comparison to covid is pretty disingenuous, since the scale of the problem is entirely different: a million people dying against 5 billion having a small hindrance is hardly on the same scale as 1 person suffering versus 2^69 small hindrances...

  • @user-kk2ng4he5j  5 months ago

    How did you find the single thing Yudkowsky is correct about to disagree with? (Not saying his rationalization is correct - I only watched up to 17:45 - but his conclusion is obviously correct.)

  • @tootnewt  10 months ago

    thanks for the great vid! I would watch a MIRI/Yudkowsky deep dive

  • @JBAIMARK3  10 months ago

    Very interesting.
    In my head, the question this dude needs to be asked is: "in what context would you find something to be worth suffering and dying for?" rather than "what are you willing to inflict on the alien?"
    I'm sure if you wanted to get into the weeds you could make many real-life comparisons to profit and cheapness: how many people have no problem underpaying people to improve their own existence (from the executive level to the consumer level), or the needless consumption of meat for pleasure in advanced economies, which aligns very well with the analogy of a child-like entity suffering to enhance the lifestyles of those around them.

  • @hotelmario510  10 months ago

    _The Dispossessed_ is one of my favourite novels of all time. If you have trouble reading it, I recommend the audiobook, read by Tim Treloar. I don't know if that's available for sale where you are, but it really made the book come alive for me.

  • @_sarpa  10 months ago  +1

    so you say that moral intuition should be all that is needed to reject a normative claim? does it apply to, say, homosexuality? aversion has no value in any circumstances when determining the value of anything. yes, here I said it - moral intuition is utterly worthless. it's based on cognitive biases, often magnified by thought experiments; in the case of the dust speck argument, it's the inability to imagine very large amounts or intensity of something.
    so how can you support utilitarianism if you reject moral intuition, then? the answer is: by relying on phenomenological knowledge. my hedonism does not come from moral intuition, as I radically reject it (and as it is rational to do), but rather directly from intimate knowledge of the very nature of mental phenomena. also, a close analysis of moral claims reveals that when a non-hedonist makes a moral claim, they merely express their emotional attitude and turn it into an imaginary, nonexistent property to project it onto some concept. so when you say, for example, that murder is wrong, you project your dissatisfaction with the idea of murder onto murder, and you pretend your dissatisfaction is somehow a property of the event of murder, which is obviously an incoherent stance, since emotions are not properties of external events.
    what this means is that the sole objects of true ethics are pleasure and suffering, and moral disagreement concerning metaethics arises when people commit a cognitive error when using these two objects of ethics.
    the implication here is that entities, including trees, rats, pigs, people, planets don't have any intrinsic value. only events, in the form of mental phenomena, can be intrinsically valuable.
    even if we assume you are correct when you disagree with the claim above, you've got an additional problem: if you rationally analyze your phenomena, you will see they are all that is needed to explain the mind, and that the self doesn't exist. how can you say the self has any value if it doesn't exist?
    edit: Temkin seems to have invented terms that sound smart in order to make cognitive biases sound not so much like cognitive biases. 'essentially comparative view'? you've just made up a term, what now? what does it change? so you're taking the 'context' into consideration, and what do you consider in that context specifically? what is its object? what is there beyond the sum of pleasure and suffering? well, the emotional reaction to the thought experiment, in other words, moral intuition. that's it.

  • @ConvincingPeople  10 months ago  +2

    That was an excellent demonstration of one of the many ways in which Yudkowskian total utilitarian rationalism is fundamentally a nonsensical mess. I also appreciated that, unlike a lot of people who talk about "The Ones Who Walk Away from Omelas", you emphasised what the actual point is - a critique of the notion that a flourishing society must be founded upon injustice, which tacitly underpins a whole lot of how modern industrial society is organised - while using that as a way to tackle what is truly wrong with Yudkowsky's worldview. I'd love to see you do more on the rationalists, as you seem to have their number in a way few outside of those who cover them semi-professionally do.

    • @_sarpa  10 months ago

      if you believe Omelas is a bad place, then how bad would that make our world, or any actually possible world?

  • @anneaunyme  10 months ago

    Although Yud is undoubtedly a weird fellow (and "wrong" on many things), I don't think you are really on point with your criticism.
    First, the guy is clearly neurodivergent. As such, you can't assume he means something when that's not explicitly what he says. When he writes 3^^^^^3, that is not the same as 2^69: 2^69 is big indeed, but it's only about 32 times the number of possible values of eight bytes (2^64), for example. Eliezer's stupid number, on the other hand, simply doesn't make sense to the human mind. It is simply too big to fathom.
    And this is the point where Mr Yudkowsky's reasoning "fails": as this number has no practical meaning, this argument has no practical application. There aren't 3^^^^^3 humans to experience dust specks in their eyes, and if there were, it would completely change things. The reason our morals tell us to choose the dust specks is that we can't truly believe it will affect that many people (and for any practical situation, we are right about that).
    Your reverse thought experiment (Eliezer's Capitalist) has the same flaw: it can only work if you manage to find a way to reach that stupid number of happiness points for a single individual, but the possibility of such a thing is doubtful at best. And that's not a contradiction with total utilitarianism: you could perfectly well have a utility value for each and every thing, but for some reason a physical limit that makes it harder and harder to gain HP when you already have a lot of it. Actually, this is one of the most agreed-upon facts in economics: people who are already happy are harder to make happier than those who are desperate.
    To make your experiment more relevant, you'd need to provide some justification for how you proceed to make this one agent that happy (it is actually possible to frame it in a way that can't be countered in Eliezer Yudkowsky's framework).
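For a concrete sense of the gap this comment is pointing at, here is a minimal sketch of Knuth's up-arrow notation in Python (the `up` function and the specific values are illustrative, not from the video):

```python
def up(a, n, b):
    """Knuth's up-arrow a ↑^n b: one arrow (n=1) is plain exponentiation;
    each additional arrow iterates the operation below it."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up(a, n - 1, up(a, n, b - 1))

print(2 ** 69)      # 590295810358705651712 - 21 digits, an ordinary large number
print(up(3, 2, 3))  # 3↑↑3 = 3^(3^3) = 7625597484987
# up(3, 3, 3), i.e. 3↑↑↑3, is already a power tower of 3s about 7.6 trillion
# levels high; it cannot be computed or even written out, which is the sense
# in which such numbers are "too big to fathom".
```

Each extra arrow feeds the previous result back in as the new height, so the growth outruns anything expressible with ordinary exponents almost immediately.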

  • @BenGroebe  10 months ago

    My God, people whose whole deal is "I'm actually really smart, you should listen to me because I'm really smart, don't think hard about what I'm saying because trust me, I'm really smart" are the most annoying people on the planet. Yudkowsky leaning into "I don't even have a high school degree" would have been more on-brand, because while that doesn't really say much about someone's capabilities, it certainly falls into the "I reject the system because I'm a special genius" narrative these guys love to hype. Somehow I knew about all the LessWrong and basilisk nonsense, but didn't know he wrote a lengthy, rationalist Harry Potter fanfic.
    Also, friend, you should avoid using so aggressive a noise gate in your audio production. Leaving hardware and recording environment the same, I'd recommend either letting a bit of the dry mix through even when the gate is active, slowing the attack and release times, or using an FIR filter to target the noise frequencies you're trying to suppress. None of these is a perfect solution, but they should give a more even sound with less noticeable humming abruptly popping in and out of silence.
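The first two suggestions (keeping a little dry signal when the gate is closed, and slowing the attack/release) can be sketched as a toy per-sample soft gate - the threshold, floor, and timing values below are placeholder assumptions, not recommendations:

```python
import math

def soft_gate(x, sr, threshold=0.02, floor=0.2, attack_ms=5.0, release_ms=80.0):
    """Gate samples `x` (floats in [-1, 1]) at sample rate `sr` with smoothed gain."""
    # One-pole smoothing coefficients derived from the attack/release times.
    att = math.exp(-1.0 / (sr * attack_ms / 1000.0))
    rel = math.exp(-1.0 / (sr * release_ms / 1000.0))
    env, g, out = 0.0, floor, []
    for s in x:
        env = max(abs(s), env * rel)                 # crude peak envelope follower
        target = 1.0 if env > threshold else floor   # floor > 0 keeps some dry signal
        coeff = att if target > g else rel           # open fast, close slowly
        g = coeff * g + (1.0 - coeff) * target       # glide toward the target gain
        out.append(s * g)
    return out
```

A real gate would typically run the detector on a sidechain and process blocks rather than single samples, but the gain-smoothing idea - so the gate never snaps abruptly between silence and full level - is the same.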

    • @BenGroebe  10 months ago

      How could I forget the most obvious solution? Playing some background music helps to hide the differences in levels induced by the noise gate, though I'd still recommend pairing it with one or more of the aforementioned methods.
      Keep up the good work, your analysis and writing are strong =)

    • @ChaiaEran  10 months ago

      Thanks! I was trying to cut down on some audible buzzing in my previous videos, guess I overcompensated.

    • @BenGroebe  10 months ago

      @@ChaiaEran Noise is a constant menace, especially if you can't afford a fancy studio space. I spent many hours in college trying to figure out how to hide it in my amateur song recordings. You're doing fine, using a noise gate is a good start, especially when you've got a whole script, blocking, videography, costuming, etc. to figure out. You're not overcompensating, you're taking steps to improve your work.

  • @jannickpetersen3038  10 months ago

    Not to be rude, but you look very goofy holding that tiny scale the entire video

  • @jackmak2980  10 months ago  +4

    Trans

    • @ChaiaEran  10 months ago  +7

      What gave it away, the pride flag on my jacket? ;)

    • @noarmtim  10 months ago  +3

      Give this one a cookie if for no other reason than to keep them quiet.

  • @DanielOlivierArgyle  10 months ago

    You totalitarianismified the shit out of me in 10 seconds and made me leave