On Attaining Enlightenment

One of my favorite moralizing essays of all time is The Inner Ring, by C. S. Lewis. In it, he explores the common failure mode of being drawn into an ever-narrowing circle of the “elite”, until you have lost sight of everything you valued or wanted before, other than grubbing for more and more status. It’s a great essay and I’m not doing it justice; you should go read it.

The idea of an “inner ring” is a very useful concept to have crystallized, and one of the ways it’s come in handy lately has been keeping me out of a particular failure mode that I see a lot of in the Lesswrongosphere (yes, this is going to be a post about politics at the meta level, not an object-level political post. Get out while you still can). I’ll approach this first by analogy.

At some point, Eliezer Yudkowsky realized that regular rationality was insufficient. He realized that people’s cognitive biases and lack of proper epistemological grounding were inhibiting their reasoning, causing them to fail at basic rational behavior and belief in all sorts of ways. Wow! What a discovery! He was one of the Elect! If he wanted to, Eliezer could have spent the rest of his days hanging out with Robin Hanson and a couple others on Overcoming Bias (in private threads, of course), developing impenetrable codes and occasionally dropping dark hints in the wider skeptic community that there was something Truly Special going on over on OB… if, of course, you could prove yourself worthy of participation. Then he could congratulate himself on repelling the threat of Entryism and continue embroidering his dark robe for the next Bayesian Conspiracy meetup. This would probably have been a very satisfying way to use his newfound skills, and eventually he would be in the enviable position of having total contempt for every part of society other than himself and his friends. It does, indeed, feel good to be the best.

Luckily for all of us, Eliezer had something to protect. Whatever else you may say about Eliezer, it wouldn’t have been in his character to so flagrantly ignore the dictum that has become a watchword among his followers:

“The primary thing when you take a sword in your hands is your intention to cut the enemy, whatever the means. Whenever you parry, hit, spring, strike or touch the enemy’s cutting sword, you must cut the enemy in the same movement. It is essential to attain this. If you think only of hitting, springing, striking or touching the enemy, you will not be able actually to cut him. More than anything, you must be thinking of carrying your movement through to cutting him.”
— Miyamoto Musashi, The Book of Five Rings

In this case, Eliezer’s intention was not to slice up a samurai dude, but to avert the risk of a catastrophic intelligence explosion. Again, whatever you think of this goal, it is good for us that he pursued it: fuelled by his desire to recruit new Artificial Intelligence researchers, he wrote the Sequences, carefully explaining all the bright new (and not-so-new) ideas he had about rationality and decision-making and having good arguments and… all of it. He clearly explained his ideas, throwing open his palace of thought to the barbarian hordes: other people who weren’t his cool friends. This was a very good thing: it allowed his ideas to be examined, poked, prodded at, dissected and expanded upon, growing an even more correct memeplex and enabling a lot of people to think more clearly about a lot of things. On balance, in my opinion, he could hardly have asked for a better outcome.

The point is: Eliezer’s behavior in writing the Sequences is what it really looks like when someone actually is a lot wiser than you are, and is handling that wisdom in a responsible way. As beautiful and romantic as the idea of a Great Zen Teacher who only accepts the Best Students to learn his Mysterious Oriental Secrets is, that is simply not how someone with a real insight should behave in a community of smart people. Note that in modern academia, which for all its flaws does produce a lot of useful truth, Pythagorean Brotherhoods have become rather passé. The prevailing attitude there is the same one I think the aspiring rationalist community should take: if you figured something out, tell us about it! There is no need to construct a maze of tunnels barring us from entry. Not only is it antisocial to disengage from the exchange of ideas in this way, but there is a greater danger: not everything that seems like a fantastic insight is really worth building a temple to. In fact, almost all insights are really sucky when viewed from outside one’s own head, and those that aren’t are probably still deeply flawed. To recognize these flaws, it is important to engage with third parties who have no particular investment in your ideas, and who can judge them fairly on their merits. Needless to say, the eager acolyte who has just been inducted into your Secret Discussion Room will not point out that your entire worldview is mistaken. And even if they could realize this… would you really listen?

It is now time for the game to be revealed. I am thinking specifically in this instance of the Neoreactionaries, who have all kinds of interesting ideas which are, as far as I can tell, nearly impossible to decipher from their sprawling complex of Continental-style writings and bizarre obfuscatory memes. The neoreactionary worldview supposedly offers important insights, but its proponents are very reticent about what those insights are or how one might glean them. Nearly every description of outsider engagement with that community that I’ve read involves such hilarious escapades as being instructed to read what even those sympathetic to them will agree are spittle-flying hate-rag blogs, but in such a way as to pierce past all the many levels of irony and misdirection therein. To be frank, and this is no reflection on the actual ideas disguised thereby (of which I know little), I doubt that what lies beneath would really be worth it. If your ideas are so good, neoreactionaries, then why hide them in a black temple in a nuclear bunker on the moon? Why not just lay out your ideas and let us evaluate them directly? It can be fun to be the only one who’s right, but no one is the only one who’s right, because nobody is completely right. Finding out what the problems are with your ideas by discussing them with outsiders is honestly a pretty great strategy. That’s what the neoreactionaries did for me, when I was still in the grip of a rather noxious strain of Social Justice. It’s only fair that they should have access to the same service.

Again, this is not an indictment of neoreactionary ideas; it cannot be, because nobody but the Elder Council themselves seems to have the foggiest notion what the “actual” neoreactionary beliefs even are. Scott did an admirable job of trying to fix this with his Nutshell post, by the grace of Mike Anissimov, who actually explained some things. I will not even dignify with a rebuttal the idea that Moldbug is a remedy for a lack of clarity and high barriers to entry. Ahem.

Really, honestly, I would love to have a straightforward primer and some honest engagement from the neoreactionary sphere, free of bizarre status games in which a postrationalist deigns to explain their Ancient and Mystic ways to a mere idiot Yudkowsky-bot. We are all aspiring rationalists here, and I should hope we would be able to have an honest and friendly mutual search for truth, even if the rest of us have not joined your blood brotherhood sworn to specific political beliefs.

For now, this is my advice to anyone who is being beckoned into a Dark Enlightenment treehouse:

If in your working hours you make the work your end, you will presently find yourself all unawares inside the only circle in your profession that really matters. You will be one of the sound craftsmen, and other sound craftsmen will know it. This group of craftsmen will by no means coincide with the Inner Ring or the Important People or the People in the Know. It will not shape that professional policy or work up that professional influence which fights for the profession as a whole against the public: nor will it lead to those periodic scandals and crises which the Inner Ring produces. But it will do those things which that profession exists to do and will in the long run be responsible for all the respect which that profession in fact enjoys and which the speeches and advertisements cannot maintain.
— C.S. Lewis, The Inner Ring


5 thoughts on “On Attaining Enlightenment”

  1. I’m no expert on NRx but I think a lot of them subscribe in some form to Straussian ideas about esoteric and exoteric teaching.

    Also the movement is explicitly anti-egalitarian so it shouldn’t really be a surprise that there’s very little attempt to make it easily accessible.


  2. I like your writing style. I probably don’t have too much to add, as I don’t know anything more than the basics about Neoreactionaries, but your central argument seems fairly sound. I hope you continue to write articles.


  3. I think the other thing that bugs me about NRx smugness is that, even conditioning on NRx being 100% correct as a philosophy, I *still* wouldn’t expect many people to arrive at that truth. It’s like: okay, fine, you worked out this incredibly counterintuitive set of views after countless hours of poring over arcane blogs. Great. Good for you. But don’t act all *surprised* when you present people with your absurd and repulsive-sounding worldview and they dismiss it as absurd and repulsive. If I were to discover that, contrary to all common sense and reason, the best way to grow crops was to paint them blue and read them Garfield comics, I would *not* be laughing scornfully at the sheep who hadn’t figured this fact out. I would be starting every sentence with “Okay, I know this sounds crazy, but…”

    Anyway, I agree with your point completely: total anti-evangelism is usually not a good sign in a group.


  4. There are a few things going on here.

    1) Neoreactionaries, all the way back to Moldbug, have known that half of their task is dealing with their readers’ lizard-brains. To most of their potential audience, their beliefs (or the beliefs that will be wrongly read into them) are signals of hopelessly low intelligence, sexual dysfunction, or orclike evil. Moldbug consistently employs the rhetorical strategy of implying (or outright stating) that he is not elthedish to the people who will be likely to read him: he’s a Jewish Brahmin atheist who went to a good college and can signal high intelligence and so on, not some uneducated, Limbaugh-listening piece of white trash who can be safely dismissed out of hand. Postrationality has the same problem: it has to avoid getting pattern-matched to New Age babble.

    2) Moldbug is trying to optimize for readability, unlike, say, Auster or Bonald. Who reads Auster or Bonald? Who reads Moldbug? Presumably Moldbug’s readers can notice this and figure out that they have to keep their audience interested — which often means adding more words, rather than editing them away.

    3) Most new ideas aren’t written about clearly. We think Hume is easy to read because he’s been around for centuries, many of his insights have filtered down into the broader culture, and people have excerpted the sentences where he makes his point and quoted them in textbooks and on Tumblr and so on. But if you actually go back and read Hume, you’ll find that he makes Moldbug look like Scott Alexander. And Kant is far worse than Hume. The same thing applies to Foucault, Nietzsche (who I find fairly clear, but apparently this opinion isn’t shared), Deleuze, and so on. But people are more likely to read clear primary sources than opaque ones, and more likely to read well-known primary sources whose ideas they already have some grasp of than obscure ones whose ideas they know nothing about, so they end up miscalibrated about how clear things tend to be in general.

    4) How would you know whether Pythagorean Brotherhoods are passé or not? (The secret society problem: how can you tell whether or not secret societies are common? If they’re not common, you won’t see them; if they’re common, you still won’t see them unless you’re in them, and even then, other people could be in far more of them than you. Perhaps there is some way of dealing with this in statistics, but I don’t know of it — and anyway, unless you’re an academic IRL, you won’t see them whether or not they’re there.)

    5) Yudkowsky is trying to do one specific thing: become a primary source for a set of arguments. There are other things that can be done. (I just provided an example here. And another example in the previous sentence.) Some points lend themselves better to demonstration than to argument; some things are intended to knock down something that already exists, rather than build something up; some things are intended to provoke thought, to ask a question, rather than to provide an answer; and so on. Consider Yudkowsky’s explanation of the benefit of setting HPMOR as fanfiction, rather than in an original world: if he’d invented his own Azkaban, it would have come off differently. (I can’t find where he said it, but maybe someone else can.) If he had made an argument about Azkaban instead of writing it into HPMOR, it would’ve been even weaker than if he’d put it into an original world.

    6) Even if it’s possible to make a Yudkowsky-style argument for something, it’s often not the option with the best effort/effectiveness tradeoff for people who have full-time jobs.


    • I guess the lizard-brains argument makes sense for mass-market propaganda, but pulling that kind of thing when you’re talking to people within the rationalist community comes across (to me, at least) as rather irritating and possibly insulting. I am quite sure (~90%, although I’m not very well-calibrated) that Less Wrong would not chase you out with pitchforks if you tried to start a thread about Wicca or whatever. The Solstice is already more than tolerated, and there’s certainly been serious discussion of deliberate compartmentalization before. Same goes for whatever other weird stuff postrationalists are into that is even marginally compatible with LW beliefs. I still agree with Scott in being skeptical that it’s useful to have a differently-named splinter movement for such stuff.

      My argument is really about the rationalist community, and within that community I think that deliberate obscurantism and braggy secret-keeping about empirical beliefs are neither necessary nor welcome, and in fact signal that you are not willing to actually put your beliefs to the rather basic test of arguing them with a bunch of smart people who understand them. That this test would be possible is evidenced by people (including Scott) attempting it all the time anyway, even though it isn’t initiated or even encouraged (that I’ve ever seen) by NRx at all.

      You are totally right about Pythagorean brotherhoods. I haven’t heard anything about any estranged members or whatever spilling the beans on such a thing, but I suppose that a sufficiently well-run Brotherhood wouldn’t have that problem. It’s the perfect-crimes problem all over again!

