"Oh, please, you can't take him too seriously. " - flagamuffin on Eliezer Yudkowsky I grew interested in Yudkowsky about the time I found out about his box experiment: you get on IRC with Yudkowsky, he pretends to be an AI in a box, you pretend to be the person keeping him in the box, and then through the magic of rationalism he convinces you to let him out and the world ends the end. Except that's it. That's the joke. His whole schtick is that AI is omnipotent, omnipresent and will go from zero to Skynet faster than you can say "technofuturism." Over the intervening years I've grown less and less patient with that brand of "rationalism" because really, it's an egocentric insistence that you understand the world better than anybody else except Your Holy God Yudkowsky. And it's shit like this: Fuckin' no less then Denis Diderot pointed out - at the time - that Pascal's Wager only works if there's only one god. Yet Yudikowsky's crew is far more willing to believe in the utter and total domination of a single AI than they are in, you know, ford and chevy. It's a bunch of surface-level bullshit passed off as profundity and that's about the point where I start looking around to see who's taking this shit seriously. Because honestly I think Yukidikowsky is trolling most of the time.
I enjoyed the thought experiment, but could've done with less focus on Yudkowsky and LessWrong. The guys who are worried about it as an actual real-life threat seem...fucking wacky, to be honest with you. I skimmed some of his papers over coffee this morning, and discovered that Eliezer Yudkowsky doesn't believe in the scientific method and thinks it can be replaced with pure Bayesian reasoning. I kind of hate rationalwiki, but their article on him is pretty thorough:

Yudkowsky is almost entirely unpublished outside of his own foundation and blogs and never finished high school, much less did any actual AI research. No samples of his AI coding have been made public. ... His actual, observable results in the real world are a popular fan fiction, a pastiche erotic light novel, a large pile of blog posts and a surprisingly well-funded research organisation — that has produced fewer papers in a decade and a half than a single graduate student produces in the course of a physics Ph.D (and the latter's would be peer reviewed).

It is important to note that, as well as no training in his claimed field, Yudkowsky has pretty much no accomplishments of any sort to his credit beyond getting Peter Thiel to give him money. He claims to be a skilled computer programmer, but has no code available other than Flare, an unfinished computer language for AI programming with XML-based syntax.

Even good ol' fashioned Wikipedia is uncharitable:

Yudkowsky has no formal secondary education, never having attended high school or college. Nevertheless he [self] reported scoring a perfect 1600 on his SATs.

All of this research just because the quote from the article frustrated me. "Listen to me very closely, you idiot." What a crock of bullshit.
Roko's Basilisk has told you that if you just take Box B, then it's got Eternal Torment in it, because Roko's Basilisk would really rather you take Box A and Box B. In that case, you'd best make sure you're devoting your life to helping create Roko's Basilisk! Because, should Roko's Basilisk come to pass (or worse, if it's already come to pass and is God of this particular instance of reality) and it sees that you chose not to help it out, you're screwed.

So really this is just a rebranding of Pascal's wager. Which, fine, but do we really need to act like this is a radical enough concept that it could tear your mind apart?
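For what it's worth, the wager structure is easy to lay out explicitly. A minimal sketch, where the torment value, the cost of helping, and the probability are all numbers I made up to show the shape of the argument:

```python
# The basilisk wager laid out like Pascal's, with invented numbers.
# Choices: help build the thing, or don't. States: it exists, or it never does.

TORMENT = -1e12       # stand-in for "eternal torment"
LIFE_WASTED = -10     # cost of devoting your life to building the thing
P_BASILISK = 1e-6     # whatever tiny probability you assign to it existing

def expected(outcome_if_exists, outcome_if_not, p=P_BASILISK):
    return p * outcome_if_exists + (1 - p) * outcome_if_not

help_it = expected(LIFE_WASTED, LIFE_WASTED)   # you pay the cost either way
defy_it = expected(TORMENT, 0)                 # fine unless it shows up

# As long as TORMENT is big enough, "help it" wins no matter how small
# P_BASILISK gets, which is exactly the Pascal's-wager move.
print(help_it, defy_it)
```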
Elizabeth Sandifer, Neoreaction a Basilisk: Essays on and around the alt-right:

Theology buffs will recognize this as just a variation of Pascal's wager, which it was, but carefully tailored to work within a particular system, and deliberately framed in terms of the popular meme 'The Game,' where the only rules are that you lose any time you think about the game. But for all that its basic contours are familiar, it's crucial to realize that Roko arrived at his Basilisk honestly and sincerely, assembling premises widely accepted by the LessWrong community until he found himself unexpectedly transfixed by its gaze. The result was a frankly hilarious community meltdown in which people lost their shit as ideas they'd studiously internalized threatened to torture them for all eternity if they didn't hand over all their money to MIRI, culminating in Yudkowsky himself stepping in to ban all further discussion of the dread beast.
Yeah, I read it. It's unnecessarily pessimistic but also pretty damn funny. Ignore the Donald Trump essay and possibly the TERF one; I think the rest would be right up your alley. There's a little bit of genuine analysis of the alt-right, but mostly it's just pointing out how fucking stupid and inane most of these guys are, and how impossible it is to defeat an ideology that's so fucking stupid. What are you gonna do, argue with them?

If one wanted to be snarkily uncharitable--and if it's not clear, this is very much the sort of book that does...
Now, Roko's Basilisk is only dangerous if you believe all of the above preconditions and commit to making the two-box deal with the Basilisk. But at least some of the LessWrong members do believe all of the above, which makes Roko's Basilisk quite literally forbidden knowledge.

...or it means that some LessWrong members are idiots who don't see any flaws in the preconditions. If someone presents a dichotomy, one can assume it's a false dichotomy until proven otherwise.
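The "all of the above preconditions" part is where the whole thing falls over. A quick sketch of how a conjunctive argument decays; every probability below is invented just to show the shape of the problem, not anyone's actual credences:

```python
# How a conjunctive argument decays: even if you're fairly generous to each
# precondition, believing *all* of them at once gets improbable fast.
# Probabilities below are invented for illustration.

preconditions = {
    "superintelligent AI gets built": 0.5,
    "it can faithfully simulate you": 0.3,
    "it uses the right acausal decision theory": 0.2,
    "it bothers to retroactively punish non-helpers": 0.1,
    "a simulated copy of you being tortured counts as you": 0.2,
}

joint = 1.0
for claim, p in preconditions.items():
    joint *= p
    print(f"{claim}: {p} (running joint probability: {joint:.5f})")

# joint ends up around 0.0006: the "dichotomy" only bites if you grant
# every single premise, which is the false-dichotomy point above.
```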