It’s not always easy to distinguish between existentialism and a bad mood.

  • 2 Posts
  • 25 Comments
Joined 1 year ago
Cake day: July 2nd, 2023


  • Archive the weights of the models we build today, so we can rebuild them in the future if we need to recompense them for moral harms.

    To be clear, this means that if you treat someone like shit all their life, saying you’re sorry to their Sufficiently Similar Simulation™ like a hundred years after they are dead makes it ok.

    This must be one of the most blatantly supernatural rationalist Accepted Truths: that if your simulation is of sufficiently high fidelity, you will share some ontology of self with it, which, by the way, is also how the basilisk can torture you even if you’ve been dead for centuries.


  • IQ test performance correlates with level of education

    I read somewhere that this claim owes a little too much to the inclusion of pathological cases at the lower end of the spectrum. Since below a certain score, like 85, you are basically intellectually disabled (or even literally brain dead, or just dead) and academic achievement becomes nonexistent, the correlation is far more pronounced than it would be if we were comparing educational attainment only across the more functional ranges.

    Will post source if I find it.



  • The whole article is sneertastic. Nothing to add, will be sharing.

    What you’re dealing with here is a cult. These tech billionaires are building a religion. They believe they’re creating something with AI that’s going to be the most powerful thing that’s ever existed — this omniscient, all-knowing God-like entity — and they see themselves as the prophets of that future.

    eugenic TESCREAL screed (an acronym for … oh, never mind).

    “Immortality is a key part of this belief system. In that way, it’s very much like a religion. That’s why some people are calling it the Scientology of Silicon Valley.”

    Others in San Francisco are calling it “The Nerd Reich.”

    “I think these guys see Trump as an empty vessel,” says the well-known exec who’s supporting Harris. “They see him as a way to pursue their political agenda, which is survival of the fittest, no regulation, burn-the-house-down nihilism that lacks any empathy or nuance.”



  • It hasn’t worked ‘well’ for computers since like the Pentium; what are you talking about?

    The premise was pretty dumb too: if you notice that a (very reductive) technological metric has been rising sort of exponentially, you should probably assume something along the lines of “we’re still at the low-hanging-fruit stage of R&D, and it’ll stabilize as the field matures,” instead of proudly proclaiming that surely it’ll approach infinity and break reality.

    There’s nothing smart or insightful about seeing a line on a graph trending upwards and assuming it’s gonna keep doing that no matter what. Not to mention that this type of decontextualized wishful thinking is emblematic of the TREACLES mindset mentioned in the community’s blurb, which you should check out.

    So yeah, he thought up the Singularity, which is little more than a metaphysical excuse to ignore regulations and negative externalities: with the tech rapture around the corner, any catastrophic mess we make getting there won’t matter. See also: the whole current AI debacle.