It’s not always easy to distinguish between existentialism and a bad mood.

  • 3 Posts
  • 42 Comments
Joined 1 year ago
Cake day: July 2nd, 2023

  • There’s also the communal living, the workplace polyamory (along with the prominence of consensual non-consent kink), the tithing of the bulk of your earnings, the extreme ends-justify-the-means moralising, the emphasis on psychedelics and prescription amphetamines, and so on and so forth.

    Meaning, while calling them a cult incubator is really insightful and well put, I have a feeling that the closer you get to TESCREAL epicenters like the SFB, the more explicitly culty things start to get.


  • EA started as an offshoot of LessWrong, and LW-style rationalism is still the main gateway into EA, as it’s pushed relentlessly in those circles; EA, in turn, funnels vast amounts of money back into LW goals. The air-strikes-against-datacenters guy is basically bankrolled by Effective Altruism, and he is also the reason EA considers magic AIs (so-called Artificial Super Intelligences) by far the most important risk to humanity’s existence; climate change, for instance, they consider mostly survivable and thus of far less importance.

    Needless to say, LLM peddlers loved that (when they weren’t already LW/EAs or adjacent themselves, like the previous OpenAI board, before Altman and Microsoft took over). edit: also the founders of Anthropic.

    Basically you can’t discuss one without referencing the other.


  • It’s complicated.

    It’s basically a forum created to venerate the works and ideas of the guy who, in the first wave of LLM hype, had an editorial published in TIME calling for a worldwide moratorium on AI research and GPU sales, to be enforced with unilateral airstrikes, and whose core audience got there by being groomed by one of the most obnoxious Harry Potter fanfictions ever written, by said guy.

    These days their function tends to be providing an ideological backbone of bad sci-fi justifications for deregulation and the billionaire takeover of the state, which among other things has made them hugely influential in the AI space.

    They are also communicating vessels with Effective Altruism.

    If this piques your interest, check the links on the sidebar.


  • The justice of the argument is clear to me. I have already made arrangements for my children to come to not be genetically mine. When the time comes, I will call upon their aid, presuming the sequencing does not tell us there are incompatibilities

    I like how that implies that he keeps a running genetic tally of all his acquaintances in case he needs to tap them for genetic material, which is not creepy at all.

    (Rorschach voice) Watched them today—parade their genetic blessings like medals earned in some cosmic lottery. Strong jawlines, symmetrical faces, eyes that catch the light just right. Retrieved 23AndMe card from alley behind 41st street. Admins restrained commenter screaming how it’s all just eugenics. Is everyone but me going mad?


  • Archive the weights of the models we build today, so we can rebuild them in the future if we need to recompense them for moral harms.

    To be clear, this means that if you treat someone like shit all their life, saying you’re sorry to their Sufficiently Similar Simulation™ like a hundred years after they are dead makes it ok.

    This must be one of the most blatantly supernatural rationalist Accepted Truths: that if your simulation is of sufficiently high fidelity, you will share some ontology of self with it. This, by the way, is how the basilisk can torture you even if you’ve been dead for centuries.


  • IQ test performance correlates with level of education

    I read somewhere that this claim owes a little too much to the inclusion of pathological cases at the lower end of the spectrum: below a certain score (around 85) you are basically intellectually disabled (or even literally brain dead, or just dead) and academic achievement becomes nonexistent, so the correlation comes out far more pronounced than if we were comparing educational attainment only across the more functional ranges.

    Will post source if I find it.


  • The whole article is sneertastic. Nothing to add, will be sharing.

    What you’re dealing with here is a cult. These tech billionaires are building a religion. They believe they’re creating something with AI that’s going to be the most powerful thing that’s ever existed — this omniscient, all-knowing God-like entity — and they see themselves as the prophets of that future.

    eugenic TESCREAL screed (an acronym for … oh, never mind).

    “Immortality is a key part of this belief system. In that way, it’s very much like a religion. That’s why some people are calling it the Scientology of Silicon Valley.”

    Others in San Francisco are calling it “The Nerd Reich.”

    “I think these guys see Trump as an empty vessel,” says the well-known exec who’s supporting Harris. “They see him as a way to pursue their political agenda, which is survival of the fittest, no regulation, burn-the-house-down nihilism that lacks any empathy or nuance.”