I'm *finally* reading Neoreaction a Basilisk.
I'm 100 pages in, but it suddenly occurred to me that a liveblog would be a good idea. I've been having thoughts at such a rate that I can't remember them without writing them down.
One observation about EY that I'd made prior to NAB, but that I keep getting reminded of: EY shows some pretty clear signs of unresolved trauma or even PTSD, and HPMOR's EY!Harry really, really rubbed that in.
I mean this with seriousness and compassion: he should see a therapist.
I confess to feeling like I need a secret decoder ring to read NAB, just because of the philosophical and literary references being casually pulled out and used as load-bearing analogies. (1/3)
To some extent this is a "me" problem, as I don't have much exposure to English-language literature outside of what gets taught in an American high school and I have approximately zero interaction with the traditions of philosophy except via frantic Wikipedia-ing. (2/3)
Still, I feel like NAB contains a little bit of self-sabotage here, by putting so much weight on analogies with Paradise Lost that a "normie" trying to learn about neoreaction wouldn't be able to follow. (3/3)
“This forces us to consider white culture as a set of perpetual ruins—as something that has always been lost, and that can only be apprehended as a tenuous and incomplete reconstruction.”
This passage makes me think of Tolkien and his Gondor.
We're talking Turing & Gödel again, and I'm reminded of just how ridiculous EY's position on superintelligent AIs is. Not just "as smart as a human genius", not just "so far beyond a human that it can hack your brain by typing at you"... (1/6)
... but "is capable of answering questions that are clearly computationally intractable and/or Halting-undecidable". (2/6)
I mean... the future AI will be able to retroactively predict *every* action you take, even the ones where you were just a quantum nudge of a neurotransmitter away from choosing B over A? (3/6)
Centuries into the future, Omega will be able to decompile my quantum "dust" (waste heat) and accurately predict my behavior down to the positions of my neurotransmitter molecules? And therefore I owe it money? (4/6)
Hey, at least when Vernor Vinge created the Transcend, he made it clear that he was violating everything we know about how computational complexity interacts with physics. (5/6)