Getting Trump reelected should count as ‘billionaire philanthropy’.
It’s not always easy to distinguish between existentialism and a bad mood.
thinkers like computer scientist Eliezer Yudkowsky
That’s gotta sting a bit.
Nabokov’s Lolita really shouldn’t be pigeonholed as merely that, but I guess the movies are another story.
Dolores in Lolita was like twelve though, at least in the book.
In case anybody skips the article, it’s a six-year-old cybernetically force-grown into the body of a horny 13-to-14-year-old.
The rare sentence that makes me want to take a shower for having written it.
…with a huge chip on his shoulder about how the system caters primarily to normies instead of specifically to him, thinks he has fat-no-matter-what genes and is really into rape play.
There’s also the communal living, the workplace polyamory (along with the prominence of the consensual non-consent kink), the tithing of the bulk of your earnings, the extreme ends-justify-the-means moralising, the emphasis on psychedelics and prescription amphetamines, and so on and so forth.
Meaning, while calling them a cult incubator is actually really insightful and well put, I have a feeling that the closer you get to TESCREAL epicenters like the SFB the more explicitly culty things start to get.
EA started as an offshoot of LessWrong; LW-style rationalism is still the main gateway into EA, since it’s pushed relentlessly in those circles; and EA contributes vast amounts of money back into LW goals. The air-strikes-against-datacenters guy is basically bankrolled by Effective Altruism, and he’s also the reason EA considers magic AIs (so-called Artificial Super Intelligences) by far the most important risk to humanity’s existence; they consider climate change mostly survivable and thus of far less importance, for instance.
Needless to say, LLM peddlers loved that (when they aren’t already LW/EA or adjacent themselves, like the previous OpenAI board before Altman and Microsoft took over). edit: also the founders of Anthropic.
Basically you can’t discuss one without referencing the other.
It’s complicated.
It’s basically a forum created to venerate the works and ideas of that guy who, in the first wave of LLM hype, had an editorial published in TIME calling for a worldwide moratorium on AI research and GPU sales, to be enforced with unilateral airstrikes, and whose core audience got there by being groomed by one of the most obnoxious Harry Potter fanfictions ever written, by said guy.
Their function these days tends to be to provide an ideological backbone of bad sci-fi justifications for deregulation and the billionaire takeover of the state, which among other things has made them hugely influential in the AI space.
They are also communicating vessels with Effective Altruism.
If this piques your interest, check the links in the sidebar.
The justice of the argument is clear to me. I have already made arrangements for my children to come to not be genetically mine. When the time comes, I will call upon their aid, presuming the sequencing does not tell us there are incompatibilities.
I like how that implies that he keeps a running genetic tally of all his acquaintances in case he needs to tap them for genetic material, which is not creepy at all.
(Rorschach voice) Watched them today—parade their genetic blessings like medals earned in some cosmic lottery. Strong jawlines, symmetrical faces, eyes that catch the light just right. Retrieved 23AndMe card from alley behind 41st street. Admins restrained commenter screaming how it’s all just eugenics. Is everyone but me going mad?
I liked how Scalzi brushed it away: basically, your consciousness gets copied to a new body, which kills the old one, and an artifact of the transfer process is that for a few moments you experience yourself as one mind with two bodies. That means you have at least the impression of continuity of self, which is enough for most people to get on with living in a new body and let philosophers do the worrying.
I feel like a subset of sci-fi and philosophical meandering is really just increasingly convoluted ways of trying to avoid, or come to terms with, death as a possibly necessary component of life.
Given rationalism’s intellectual heritage, this is absolutely transhumanist cope for people who were counting on some sort of digital personhood upload as a last resort to immortality in their lifetimes.
You mean swapped out with something that has feelings that can be hurt by mean language? Wouldn’t that be something.
Are we putting endocrine systems in LLMs now?
Archive the weights of the models we build today, so we can rebuild them in the future if we need to recompense them for moral harms.
To be clear, this means that if you treat someone like shit all their life, saying you’re sorry to their Sufficiently Similar Simulation™ like a hundred years after they are dead makes it ok.
This must be one of the most blatantly supernatural rationalist Accepted Truths: that if your simulation is of sufficiently high fidelity you will share some ontology of self with it, which, by the way, is how the basilisk can torture you even if you’ve been dead for centuries.
IQ test performance correlates with level of education
I read somewhere that this claim owes a little too much to the inclusion of pathological cases at the lower end of the spectrum. Below a certain score (say 85) you are basically intellectually disabled (or even literally brain-dead, or just dead) and academic achievement becomes nonexistent, so the pooled correlation comes out far more pronounced than if we were comparing educational attainment within the more functional ranges only.
Will post source if I find it.
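In the meantime, a toy simulation makes the floor effect easy to see. Every number below (group sizes, means, the weak slope in the functional range) is made up purely for illustration, not drawn from any real IQ or attainment data:

```python
# Toy illustration of the floor effect described above: pooling a
# pathological tail (attainment pinned near zero) with the functional
# range inflates the overall IQ/education correlation.
import numpy as np

rng = np.random.default_rng(42)

# Functional range: attainment only weakly tracks IQ.
n = 10_000
iq_main = rng.normal(105, 12, n)
edu_main = 12 + 0.05 * (iq_main - 100) + rng.normal(0, 2, n)

# Pathological low end: very low scores, essentially no attainment.
m = 1_500
iq_tail = rng.normal(60, 8, m)
edu_tail = np.clip(rng.normal(1, 1, m), 0, None)

iq = np.concatenate([iq_main, iq_tail])
edu = np.concatenate([edu_main, edu_tail])

print(f"functional range only: r = {np.corrcoef(iq_main, edu_main)[0, 1]:.2f}")
print(f"pooled with the tail:  r = {np.corrcoef(iq, edu)[0, 1]:.2f}")
```

The pooled correlation comes out much higher than the within-functional-range one, just from gluing a pinned-at-the-floor cluster onto the scatter.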
Yeah but like national socialist power metal isn’t a thing in the way NSBM is.
I wonder if it’s primarily occultism’s Nazi problem metastasizing, foundational dorks like Vikernes notwithstanding.
The whole article is sneertastic. Nothing to add, will be sharing.
What you’re dealing with here is a cult. These tech billionaires are building a religion. They believe they’re creating something with AI that’s going to be the most powerful thing that’s ever existed — this omniscient, all-knowing God-like entity — and they see themselves as the prophets of that future.
eugenic TESCREAL screed (an acronym for … oh, never mind).
“Immortality is a key part of this belief system. In that way, it’s very much like a religion. That’s why some people are calling it the Scientology of Silicon Valley.”
Others in San Francisco are calling it “The Nerd Reich.”
“I think these guys see Trump as an empty vessel,” says the well-known exec who’s supporting Harris. “They see him as a way to pursue their political agenda, which is survival of the fittest, no regulation, burn-the-house-down nihilism that lacks any empathy or nuance.”
He wasn’t, usually. Another difference from Siskind was that with TLP you mostly knew where you stood; at least I don’t remember any near-end-of-text jumpscares where it’s revealed the whole thing was meant as really convoluted IQ apologetics, or some naive reframing of the latest EA embarrassment.
He seems very aware of how writing works at least, and some of his fiction, unlike EY’s, is serviceable.
Literally what stuff, that AI would get somewhat better as technology progresses?
I seem to remember Yud specifically wasn’t that impressed with machine learning and thought so-called AGI would come about through ELIZA-type AIs.