TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • Nate Soares: "Buy my book or everyone dies" • 13 days ago

That thread is wild. Nate proposes techniques to get his kooky beliefs taken more seriously. Others point out that those very same techniques counterproductively pushed people into the e/acc camp. Nate deletes those other people’s comments. How rationalist of him!
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • shocked that anyone could be so uncouth as to accuse our boy Cremieux/Lasker of *plagiarism* goodness me • 3 months ago

Not surprised to find Sabine in the comments. She’s been totally infected by the YouTube algorithm and captured by her new culture-war-mongering audience. Kinda sad, really.
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • The Endgame of Edgelord Eschatology - new Emile Torres • 3 months ago

> We should be trying to stop this from coming to pass with the urgency we would try to stop a killer asteroid from striking Earth. Why aren’t we?
Wait, what are we trying to stop from coming to pass? Superintelligent AIs? Either I’m missing his point, or he really agrees with the doomers that LLMs are on their way to becoming “superintelligent”.
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • Scoots hot new AGI goss just dropped, Trump loses 3rd election to Grok in stunning upset • 3 months ago

Scott talks a bit about it in the video, but Daniel Kokotajlo was recently in the news as the guy who refused to sign a non-disparagement agreement when he left OpenAI, which caused them to claw back his stock options.
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • Scoots hot new AGI goss just dropped, Trump loses 3rd election to Grok in stunning upset • 3 months ago

I’m fascinated by the way they’re hyping up Daniel Kokotajlo to be some sort of AI prophet. Scott does it here, but so does Caroline Jeanmaire in the OP’s twitter link. It’s like they all got the talking point (probably from Scott) that Daniel is the new guru. Perhaps they’re trying to anoint someone less off-putting and awkward than Yud. (This is also the first time I’ve ever seen Scott on video, and he definitely gives off a weird vibe.)
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • Scoots hot new AGI goss just dropped, Trump loses 3rd election to Grok in stunning upset • 3 months ago

After minutes of meticulous research and quantitative analysis, I’ve come up with my own predictions about the future of AI.
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • Scoots hot new AGI goss just dropped, Trump loses 3rd election to Grok in stunning upset • 3 months ago

> “USG gets captured by AGI”.
Promise?
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • LessOnline is a festival celebrating truthseeking and blogging, the totally not race science is just a bonus • 3 months ago

Of course they use shitty AI slop as the background for their web page.
Like, what the hell is it even supposed to be? A mustachioed man writing in a journal in what appears to be a French village town square? Shadowy individuals chatting around an oddly incongruous fire pit? Guitar dude and listener sitting on invisible benches? I get that AI produces this kind of garbage all the time, but did the lesswrongers even bother to evaluate it for appropriateness?
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • EAs sad that their previous rich grifters are trying to distance themselves from the movement • 3 months ago

This commenter may be saying something we already knew, but it’s nice to have the confirmation that Anthropic is chock full of EAs:

> (I work at Anthropic, though I don’t claim any particular insight into the views of the cofounders. For my part I’ll say that I identify as an EA, know many other employees who do, get enormous amounts of value from the EA community, and think Anthropic is vastly more EA-flavored than almost any other large company, though it is vastly less EA-flavored than, like, actual EA orgs. I think the quotes in the paragraph of the Wired article give a pretty misleading picture of Anthropic when taken in isolation and I wouldn’t personally have said them, but I think “a journalist goes through your public statements looking for the most damning or hypocritical things you’ve ever said out of context” is an incredibly tricky situation to come out of looking good and many of the comments here seem a bit uncharitable given that.)
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • Renowned Tumblr folklore expert Strange Æons covers Yud's Potter Fanfic • 4 months ago

Sorry, when she started taking Yud’s claims to be a “renowned AI researcher” at face value, I noped out.
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • Yudkowsky: eugenics is now "the third most important project in the world." After AI doom and anime, presumably. • 5 months ago

> One of the most important projects in the world. Somebody should fund it.
The Pioneer Fund (now the Human Diversity Foundation) has been funding this bullshit for years, Yud.
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • Post your current Ziz news and discussion here • 5 months ago

Lots of discussion on the orange site post about this today.
(I mentioned this in the other sneerclub thread on the topic but reposted it here since this seems to be the more active discussion zone for the topic.)
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • Aella's timeline on the current Zizian issue (warning: she sure does love her deadnames) • 5 months ago

I should probably mention that this person went on to write other comments in the same thread, revealing that they’re still heavily influenced by Bay Area rationalism (or what one other commenter brilliantly called “ritual multiplication”).
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • Aella's timeline on the current Zizian issue (warning: she sure does love her deadnames) • 5 months ago

The story has now hopped to the orange site. I was expecting a shit-show, but there have been a few insightful comments from critics of the rationalists. This one from “rachofsunshine” for instance:
> [Former member of that world, roommates with one of Ziz’s friends for a while, so I feel reasonably qualified to speak on this.]
>
> The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community.
>
> As relevant here:
>
> - While following logical threads to their conclusions is a useful exercise, each logical step often involves some degree of rounding or unknown-unknowns. A -> B and B -> C means A -> C in a formal sense, but A -almostcertainly-> B and B -almostcertainly-> C does not mean A -almostcertainly-> C. Rationalists, by tending to overly formalist approaches, tend to lose the thread of the messiness of the real world and follow these lossy implications as though they are lossless. That leads to…
>
> - Precision errors in utility calculations that are numerically-unstable. Any small chance of harm times infinity equals infinity. This framing shows up a lot in the context of AI risk, but it works in other settings too: infinity times a speck of dust in your eye >>> 1 times murder, so murder is “justified” to prevent a speck of dust in the eye of eternity. When the thing you’re trying to create is infinitely good or the thing you’re trying to prevent is infinitely bad, anything is justified to bring it about/prevent it respectively.
>
> - Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with. I think even people who like e.g. Eliezer would agree that he is not a humble man by any stretch of the imagination (the guy makes Neil deGrasse Tyson look like a monk). They have, in the past, responded to criticism with statements to the effect of “anyone who would criticize us for any reason is a bad person who is lying to cause us harm”. That kind of framing can’t help but get culty.
>
> - The nature of being a “freethinker” is that you’re at the mercy of your own neural circuitry. If there is a feedback loop in your brain, you’ll get stuck in it, because there’s no external “drag” or forcing functions to pull you back to reality. That can lead you to be a genius who sees what others cannot. It can also lead you into schizophrenia really easily. So you’ve got a culty environment that is particularly susceptible to internally-consistent madness, and finally:
>
> - It’s a bunch of very weird people who have nowhere else they feel at home. I totally get this. I’d never felt like I was in a room with people so like me, and ripping myself away from that world was not easy. (There’s some folks down the thread wondering why trans people are overrepresented in this particular group: well, take your standard weird nerd, and then make two-thirds of the world hate your guts more than anything else, you might be pretty vulnerable to whoever will give you the time of day, too.)
>
> TLDR: isolation, very strong in-group defenses, logical “doctrine” that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult. Or multiple cults, really. Ziz’s group is only one of several.
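An aside on the first bullet in that quote (the “lossy implication” point): the decay it describes is easy to make concrete. Here is a minimal sketch of my own, not the quoted commenter’s, assuming, generously, that each inferential step holds independently with some fixed probability:

```python
# Illustrative sketch (my assumption, not from the quoted comment): if every
# step of an argument holds independently with probability p, the chance the
# whole n-step chain holds is p ** n, which falls off fast as the chain grows.

def chained_confidence(p: float, steps: int) -> float:
    """Probability that all `steps` links hold, each independently with probability p."""
    return p ** steps

for p in (0.99, 0.95, 0.90):
    for n in (5, 10, 30):
        print(f"p = {p:.2f}, {n:2d} steps -> {chained_confidence(p, n):.3f}")
```

Even at 99% confidence per step, a thirty-step chain survives with only about 74% probability; at 90% per step it is down to roughly 4%. That is the sense in which a formally valid chain of “almost certainly”s leaks.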
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • That tracing woodgrains peice on David Gerard is out • 1 year ago

I’m noticing that people who criticize him on that subreddit are being downvoted, while he’s being upvoted.
I wouldn’t be surprised if, as part of his prodigious self-promotion of this overlong and tendentious screed, he’s steered some of his more sympathetic followers to some of these forums.
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • That tracing woodgrains peice on David Gerard is out • 1 year ago

Trace seems a bit… emotional. You ok, Trace?
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • MetaMed, slight return: BayesMed, featuring Jann Tallin. AI workflows for clinicians. No medical advice. No sign of Vassar. • 1 year ago

But will my insurance cover a visit to Dr. Spicy Autocomplete?
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • That tracing woodgrains peice on David Gerard is out • 1 year ago

So now Steve Sailer has shown up in this essay’s comments, complaining about how Wikipedia has been unfairly stifling scientific racism.
Birds of a feather and all that, I guess.
TinyTimmyTokyo@awful.systems to SneerClub@awful.systems • That tracing woodgrains peice on David Gerard is out • 1 year ago

> what is the entire point of singling out Gerard for this?
He’s playing to his audience, which includes a substantial number of people with lifetime subscriptions to the Unz Review, Taki’s crapazine and Mankind Quarterly.
Apparently the NYT hit-piece’s author, Benjamin Ryan, is a subscriber to Jordan Lasker’s (Cremieux’s) substack.