• 3 Posts
  • 56 Comments
Joined 2 years ago
Cake day: August 13th, 2023

  • I usually say the following. I’m paraphrasing a spiel I have delivered in person several times and which seems to get things across.

‘there’s a kind of decentralized cult called rationalism. they worship rational thinking, have lots of little rituals that are supposed to invoke more rational thinking, and spend a lot of time discussing their versions of angels and demons, which they conceive of as all-powerful AI beings.

    rationalists aren’t really interested in experiments or evidence, because they want to figure everything out with pure reasoning. they consider themselves experts on anything they’ve thought really hard about. they come up with a lot of apocalypse predictions and theories about race mingling.

Silicon Valley is saturated with rationalists. most of the people with a lot of money are not rationalists, but VCs and such find rationalists very useful, because they’re malleable and will claim with sincerity to be experts on any topic. for example, when AI companies claim to be inventing really intelligent beings, the people they put forward to support those claims are rationalists.’

  • What if our simulated universe is actually way, way less terrible than the real world? What if the simulation was created specifically to have lower suffering/higher utils than in reality?

certain sentences you’ve extracted suggest the author considers severe suffering incomparably worse than any pleasure (for example, why would ‘removing suffering’ necessarily improve the universe? in a framework where suffering and pleasure are comparably significant, removing suffering could also remove enough pleasure to tip the balance negative).

that’s a point of view I’m very sympathetic to, but it implies external reality is almost certainly worse than the simulation: external reality contains the simulation, and therefore contains at least as much severe suffering as the simulation. put another way, is it worse for a torture chamber to exist, or for torture-chamber makers to exist?

I think what this highlights is a disconnect in their thinking between what I presume to be a greatest-good utilitarian framework and what they actually think is cool and are talking about, which is heroic individual action. they want to present as Hari Seldon, but they’re really wanking about pulp adventure shit.