I'm *finally* reading Neoreaction a Basilisk.
I'm 100 pages in, but it suddenly occurred to me that a liveblog would be a good idea: I've been having thoughts at such a rate that I can't remember them all without writing them down.
One observation about EY that I'd made prior to NAB, but that I keep getting reminded of: he shows some pretty clear signs of unresolved trauma or even PTSD, and HPMOR's EY!Harry really rubbed that in.

I mean this with seriousness and compassion: he should see a therapist.
I confess to feeling like I need a secret decoder ring to read NAB, just because of the philosophical and literary references being casually pulled out and used as load-bearing analogies. (1/3)
To some extent this is a "me" problem, as I don't have much exposure to English-language literature outside of what gets taught in an American high school, and I have approximately zero acquaintance with the philosophical tradition except via frantic Wikipedia-ing. (2/3)
Still, I feel like NAB indulges in a bit of self-sabotage here by putting so much weight on Paradise Lost analogies that a "normie" trying to learn about neoreaction won't be able to follow them. (3/3)
“This forces us to consider white culture as a set of perpetual ruins—as something that has always been lost, and that can only be apprehended as a tenuous and incomplete reconstruction.”

This passage makes me think of Tolkien and his Gondor.
We're talking Turing & Gödel again, and I'm reminded of just how ridiculous EY's position on superintelligent AIs is. Not just "as smart as a human genius", not just "so far beyond a human that it can hack your brain by typing at you"... (1/6)
... but "is capable of answering questions that are clearly computationally intractable and/or Halting-undecidable". (2/6)
I mean... the future AI will be able to retroactively predict *every* action you take, even the ones where you were just a quantum nudge of a neurotransmitter away from choosing B over A? (3/6)
Centuries into the future, Omega will be able to decompile my quantum "dust" (waste heat) and accurately predict my behavior down to the positions of my neurotransmitter molecules? And therefore I owe it money? (There's a toy chaos demo of why this is hopeless after this thread.) (4/6)
Hey, at least when Vernor Vinge created the Transcend, he made it clear that he was violating everything we know about how computational complexity interacts with physics. (5/6)
Like a futurist who can't quite convince himself that we'll never travel faster than light, EY doesn't know enough to realize that his God-Emperor AI contradicts the physical laws of computation. (6/6)
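
Since tweets 1/6 and 2/6 lean on the halting problem, here is a minimal sketch of Turing's standard diagonalization argument. It's my illustration, not something from NAB, and `halts` and `diagonal` are names I made up for the demonstration:

```python
def halts(program, arg) -> bool:
    """Hypothetical perfect oracle: True iff program(arg) eventually halts.
    Assume it exists, for the sake of contradiction."""
    raise NotImplementedError  # no real body can ever go here

def diagonal(program):
    """Do the opposite of whatever the oracle predicts we'll do
    when handed our own source."""
    if halts(program, program):
        while True:   # oracle said "halts" -> loop forever
            pass
    return            # oracle said "loops" -> halt immediately

# Now ask the oracle about diagonal(diagonal):
#   halts(diagonal, diagonal) == True  -> diagonal(diagonal) loops forever. Wrong.
#   halts(diagonal, diagonal) == False -> diagonal(diagonal) halts at once. Wrong.
# Either answer refutes the oracle, so no such oracle can exist,
# not for us and not for an arbitrarily clever AI running on physics.
```

The point isn't that a superintelligence couldn't be very good at *guessing*; it's that "answers halting-type questions exactly" isn't a capability any amount of intelligence can buy.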
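
And for tweets 3/6 and 4/6: you don't even need quantum mechanics to sink the retrodiction fantasy, because classical chaos already amplifies a nudge at the 15th decimal place into a completely different history within a few dozen steps. A toy demo, again mine and not NAB's, using the logistic map as a crude stand-in for any chaotic dynamics:

```python
def logistic(x: float, r: float = 4.0) -> float:
    """One step of the logistic map, fully chaotic at r = 4."""
    return r * x * (1.0 - x)

a, b = 0.4, 0.4 + 1e-15   # two histories, one "quantum nudge" apart
for step in range(1, 61):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step:2d}: |difference| = {abs(a - b):.3e}")

# The gap roughly doubles every step, so by step ~50 the two
# trajectories are completely uncorrelated; inverting the map to
# recover which nudge happened is just as hopeless in reverse.
```

If you can't run a one-line classical map backwards from finite-precision data, "decompiling waste heat" back into neurotransmitter positions isn't a hard engineering problem; it's a category error.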
This is the most I've ever been exposed to Blake, and the first time I've heard his mythology clearly explained. Nice.