• 0 Posts
  • 114 Comments
Joined 1 year ago
Cake day: March 22nd, 2024


  • Below a minimum level of hingedness the actual mental ability of the cult leader in question is irrelevant. On one hand it speaks to an ability to invent and operate incredibly complex frameworks and models of the world. On the other hand whatever intelligence they have isn’t sufficient for them to realize (or be convincible) that they’re fucking nutters.

    This leads us into part 17 of my ongoing essay about how intelligence - as in "the raw mental resources supposedly measured by IQ or whatever other metrics" - is useless and probably incoherent.




  • You know, I was wondering where the name came from, and it’s sufficiently plausible that I believe it. Notably, in the story her threat - the reason just being around her is so dangerous - is that she has some kind of perfect predictive ability on top of all the giant psychic kaiju nonsense. So she attacks a city and finds the one woman who needs to die in order for her super-thinker husband to go mad and build an army of evil robots or whatever.

    It very much rhymes with the Rationalist image of a malevolent superintelligence and I can definitely understand it being popular in those circles, especially the “I’m too edgy to recognize that Taylor is wrong, actually” parts of the readership.







  • Read through the whole affair and just had to keep shaking my head the whole way through. Like, they’re at least capable of pretending to disavow the Nazis and fundies who follow the same pro-kid-having definitely-not-eugenics ideas. And they talk about some of the real obstacles to having kids; raising a child and setting them up for something resembling success is expensive and hard, and the world doesn’t exactly feel like it’s on an upwards trajectory.

    But rather than look at those problems in their own right and try to actually make living cheaper or easier for people in ways that would make having kids more viable and more rewarding (How long are you at the office? How much time and energy do you have when you get home?), these upper-class twits of their generation are trying to convince the few people for whom those aren’t as serious problems that they should have kids.

    Like, I’m all in favor of improving reproductive health and that kind of technology, but we are nowhere near the point where that’s going to make a population-level impact on demographics. They’re out here trying to figure out if they can make the run to third base when they’ve only just brained the mascot with a foul ball. Solve the actual problem in front of us first.


  • ‘It became clear to me that people wanted more children than they were having,’ Babu says.

    So clearly the best action is to constantly tell everyone how great kids are and that they should totally have them. Because that solves the problem of people wanting kids they don’t or can’t have. I try to read even our designated sneer fodder in good faith, but I can’t understand why anyone thinks these people are at all intelligent beyond the “only slightly less than average” level. I thought Good Will Hunting taught everyone the difference between smart and rich, but maybe that was just me.



  • I’ve watched a few of those “I taught an AI to play tag” videos from some time back, and while it’s interesting to see what kinds of degenerate strategies the computer finds (trying to find a way out of bounds being a consistent favorite after enough iterations), it’s always a case of “wow, I screwed up in designing the environment or rewards” and not “dang, look how smart the computer is!”

    As always with this nonsense, the problem is that the machine is too dumb to be trusted rather than too smart and powerful. Like, identifying patterns that people would miss is arguably the biggest strength of machine learning in general, but that’s not the same as those patterns being meaningful or useful.


  • You could argue that another moral of Parfit’s hitchhiker is that being a purely selfish agent is bad, and humans aren’t purely selfish so it’s not applicable to the real world anyway, but in Yudkowsky’s philosophy—and decision theory academia—you want a general solution to the problem of rational choice where you can take any utility function and win by its lights regardless of which convoluted setup philosophers drop you into.

    I’m impressed that someone writing on LW managed to encapsulate my biggest objection to their entire process this coherently. This is an entire model of thinking that tries to elevate decontextualization and debate-team nonsense into the peak of intellectual discourse. It’s a manner of thinking that couldn’t have been better designed to hide the assumptions underlying repugnant conclusions if it had been specifically designed for that purpose.


  • I mean, if you’re talking specifically in context about people with vaginas instead of women, then using the gendered term does exclude both women without vaginas and men with them, who are probably a relevant group in that context. But seriously, how often does that come up for you? How often is the most important part of the woman you’re referring to her anatomy?

    And while “females” is probably just as accurate in most contexts, it’s also been poisoned with incel vibes at this point, and it’s gonna be some time before it can be salvaged for general use outside of specific biological contexts without sounding like you’re about to unload a whole lot of baggage into the thread instead of getting therapy.