• 5 Posts
  • 22 Comments
Joined 2 years ago
Cake day: July 6th, 2023

  • The story has now hopped to the orange site. I was expecting a shit-show, but there have been a few insightful comments from critics of the rationalists. This one from “rachofsunshine”, for instance:

    [Former member of that world, roommates with one of Ziz’s friends for a while, so I feel reasonably qualified to speak on this.]

    The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community.

    As relevant here:

    1. While following logical threads to their conclusions is a useful exercise, each logical step often involves some degree of rounding or unknown-unknowns. A -> B and B -> C means A -> C in a formal sense, but A -almostcertainly-> B and B -almostcertainly-> C does not mean A -almostcertainly-> C. Rationalists, by tending to overly formalist approaches, tend to lose the thread of the messiness of the real world and follow these lossy implications as though they are lossless. That leads to…

    2. Precision errors in utility calculations that are numerically-unstable. Any small chance of harm times infinity equals infinity. This framing shows up a lot in the context of AI risk, but it works in other settings too: infinity times a speck of dust in your eye >>> 1 times murder, so murder is “justified” to prevent a speck of dust in the eye of eternity. When the thing you’re trying to create is infinitely good or the thing you’re trying to prevent is infinitely bad, anything is justified to bring it about/prevent it respectively.

    3. Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with. I think even people who like e.g. Eliezer would agree that he is not a humble man by any stretch of the imagination (the guy makes Neil deGrasse Tyson look like a monk). They have, in the past, responded to criticism with statements to the effect of “anyone who would criticize us for any reason is a bad person who is lying to cause us harm”. That kind of framing can’t help but get culty.

    4. The nature of being a “freethinker” is that you’re at the mercy of your own neural circuitry. If there is a feedback loop in your brain, you’ll get stuck in it, because there’s no external “drag” or forcing functions to pull you back to reality. That can lead you to be a genius who sees what others cannot. It can also lead you into schizophrenia really easily. So you’ve got a culty environment that is particularly susceptible to internally-consistent madness, and finally:

    5. It’s a bunch of very weird people who have nowhere else they feel at home. I totally get this. I’d never felt like I was in a room with people so like me, and ripping myself away from that world was not easy. (There’s some folks down the thread wondering why trans people are overrepresented in this particular group: well, take your standard weird nerd, and then make two-thirds of the world hate your guts more than anything else, you might be pretty vulnerable to whoever will give you the time of day, too.)

    TLDR: isolation, very strong in-group defenses, logical “doctrine” that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult. Or multiple cults, really. Ziz’s group is only one of several.
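    To put rough numbers on point 1 (my own back-of-the-envelope, not the quoted commenter’s): if each “almost certainly” step holds with probability 0.9 and the steps fail independently, the confidence left in the final conclusion drops off fast.

    ```python
    # Sketch of how "almost certainly" leaks when implications are chained
    # as though they were lossless. Assumes each step holds with probability
    # 0.9 and failures are independent (an assumption for illustration only).
    step_confidence = 0.9
    for steps in (1, 2, 5, 10, 20):
        print(f"{steps:2d} steps: confidence {step_confidence ** steps:.2f}")
    # 1 step keeps 0.90; 5 steps keep ~0.59; 20 steps keep ~0.12
    ```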

  • She seems to do this kind of thing a lot.

    According to a comment, she apparently claimed on Facebook that, due to her post, “around 75% of people changed their minds based on the evidence!”

    After someone questioned how she knew it was 75%:

    Update: I changed the wording of the post to now state: 𝗔𝗿𝗼𝘂𝗻𝗱 𝟳𝟓% 𝗼𝗳 𝗽𝗲𝗼𝗽𝗹𝗲 𝘂𝗽𝘃𝗼𝘁𝗲𝗱 𝘁𝗵𝗲 𝗽𝗼𝘀𝘁, 𝘄𝗵𝗶𝗰𝗵 𝗶𝘀 𝗮 𝗿𝗲𝗮𝗹𝗹𝘆 𝗴𝗼𝗼𝗱 𝘀𝗶𝗴𝗻*

    And the * at the bottom says: Did some napkin math guesstimates based on the vote count and karma. Wide error bars on the actual ratio. And of course this is not proof that everybody changed their mind. There’s a lot of reasons to upvote the post or down vote it. However, I do think it’s a good indicator.
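    For what it’s worth, here’s a guess at the napkin math (mine, not hers, and it ignores whatever vote weighting the site actually applies); note that it measures upvotes, not changed minds.

    ```python
    # Hypothetical reconstruction of the guesstimate: assume
    # karma = upvotes - downvotes and vote count = upvotes + downvotes.
    def upvote_ratio(vote_count: int, karma: int) -> float:
        upvotes = (vote_count + karma) / 2
        return upvotes / vote_count

    # e.g. a made-up post with 60 total votes and karma 30:
    print(upvote_ratio(60, 30))  # 0.75
    ```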

    She then goes on to talk about how she made the Facebook post private because she didn’t think it should be reposted in places where it’s not appropriate to lie and make things up.

    Clown. Car.


  • What a bunch of monochromatic, hyper-privileged, rich-kid grifters. It’s like a nonstop frat party for rich nerds. The photographs and captions make it obvious:

    The gang going for a hiking adventure with AI safety leaders. Alice/Chloe were surrounded by a mix of uplifting, ambitious entrepreneurs and a steady influx of top people in the AI safety space.

    The gang doing pool yoga. Later, we did pool karaoke. Iguanas everywhere.

    Alice and Kat meeting in “The Nest” in our jungle Airbnb.

    Alice using her surfboard as a desk, co-working with Chloe’s boyfriend.

    The gang celebrating… something. I don’t know what. We celebrated everything.

    Alice and Chloe working in a hot tub. Hot tub meetings are a thing at Nonlinear. We try to have meetings in the most exciting places. Kat’s favorite: a cave waterfall.

    Alice’s “desk” even comes with a beach doggo friend!

    Working by the villa pool. Watch for monkeys!

    Sunset dinner with friends… every day!

    These are not serious people. Effective altruism in a nutshell.

  • Random blue check spouts disinformation about “seed oils” on the internet. Same random blue check runs a company selling “safe” alternatives to seed oils. Yud spreads this huckster’s disinformation further. In the process he reveals his autodidactically obtained expertise in biology:

    Are you eating animals, especially non-cows? Pigs and chickens inherit linoleic acid from their feed. (Cows reprocess it more.)

    Yes, Yud, because that’s how it works. People directly “inherit” organic molecules totally unmetabolized from the animals they eat.

    I don’t know why Yud is fat, but armchair sciencing probably isn’t going to fix it.


  • This is good:

    Take the sequence {1,2,3,4,x}. What should x be? Only someone who is clueless about induction would answer 5 as if it were the only answer (see Goodman’s problem in a philosophy textbook or ask your closest Fat Tony) [Note: We can also apply here Wittgenstein’s rule-following problem, which states that any of an infinite number of functions is compatible with any finite sequence. Source: Paul Bogossian]. Not only clueless, but obedient enough to want to think in a certain way.
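    A concrete way to see that (my illustration, not Taleb’s): for any target value c, there is a rule that agrees with 1, 2, 3, 4 on the first four terms and then gives c as the fifth.

    ```python
    # One of infinitely many functions compatible with the finite sequence
    # 1, 2, 3, 4: the quartic correction term vanishes on n = 1..4 and is
    # scaled so the fifth term comes out to any chosen value c.
    def rule(n: int, c: float) -> float:
        return n + (c - 5) * (n - 1) * (n - 2) * (n - 3) * (n - 4) / 24

    for c in (5, 17, -100):
        print([rule(n, c) for n in range(1, 6)])
    # [1.0, 2.0, 3.0, 4.0, 5.0]
    # [1.0, 2.0, 3.0, 4.0, 17.0]
    # [1.0, 2.0, 3.0, 4.0, -100.0]
    ```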

    Also this:

    If, as psychologists show, MDs and academics tend to have a higher “IQ” that is slightly informative (higher, but on a noisy average), it is largely because to get into schools you need to score on a test similar to “IQ”. The mere presence of such a filter increases the visible mean and lowers the visible variance. Probability and statistics confuse fools.
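    The filter effect is easy to see in a quick simulation (mine, not Taleb’s):

    ```python
    # Selecting on a noisy score: the selected group shows a higher mean
    # and a much smaller spread than the population it was drawn from.
    import numpy as np

    rng = np.random.default_rng(0)
    population = rng.normal(100, 15, 1_000_000)   # noisy "IQ"-like scores
    admitted = population[population > 120]       # the admissions filter

    print(population.mean(), population.std())    # roughly 100 and 15
    print(admitted.mean(), admitted.std())        # higher mean, smaller spread
    ```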

    And:

    If someone came up w/a numerical “Well Being Quotient” WBQ or “Sleep Quotient”, SQ, trying to mimic temperature or a physical quantity, you’d find it absurd. But put enough academics w/physics envy and race hatred on it and it will become an official measure.