On Attaining Enlightenment

One of my favorite moralizing essays of all time is The Inner Ring, by C. S. Lewis. In it, he explores the common failure mode of being drawn into an ever-narrowing circle of the “elite”, until you have lost sight of everything you once valued or wanted, other than grubbing for more and more status. It’s a great essay and I’m not doing it justice; you should go read it.

The idea of an “inner ring” is a very useful concept to have crystallized, and one of the ways it’s come in handy lately has been keeping me out of a particular failure mode that I see a lot of in the Lesswrongosphere (yes, this is going to be a meta, political post, not a meta-political one. Get out while you still can). I’ll approach this first by analogy.

At some point, Eliezer Yudkowsky realized that regular rationality was insufficient. He realized that people’s cognitive biases and lack of proper epistemological grounding were inhibiting their reasoning, causing them to fail at basic rational behavior and belief in all sorts of ways. Wow! What a discovery! He was one of the Elect! If he wanted to, Eliezer could have spent the rest of his days hanging out with Robin Hanson and a couple others on Overcoming Bias (in private threads, of course), developing impenetrable codes and occasionally dropping dark hints in the wider skeptic community that there was something Truly Special going on over on OB… if, of course, you could prove yourself worthy of participation. Then he could congratulate himself on repelling the threat of Entryism and continue embroidering his dark robe for the next Bayesian Conspiracy meetup. This would probably have been a very satisfying way to use his newfound skills, and eventually he would be in the enviable position of having total contempt for every part of society other than himself and his friends. It does, indeed, feel good to be the best.

Luckily for all of us, Eliezer had something to protect. Whatever else you may say about Eliezer, it wouldn’t have been in his character to so flagrantly ignore the dictum that has become a watchword among his followers:

“The primary thing when you take a sword in your hands is your intention to cut the enemy, whatever the means. Whenever you parry, hit, spring, strike or touch the enemy’s cutting sword, you must cut the enemy in the same movement. It is essential to attain this. If you think only of hitting, springing, striking or touching the enemy, you will not be able actually to cut him. More than anything, you must be thinking of carrying your movement through to cutting him.”
— Miyamoto Musashi, The Book of Five Rings

In this case, Eliezer’s intention was not to slice up a samurai dude, but to avert the risk of a catastrophic intelligence explosion. Again, whatever you think of this goal, it is good for us that he pursued it: fuelled by his desire to recruit new Artificial Intelligence researchers, he wrote the Sequences, carefully explaining all the bright new (and not so new) ideas he had about rationality and decision-making and having good arguments and… all of it. He clearly explained his ideas, throwing open his palace of thought to the barbarian hordes: other people who weren’t his cool friends. This was a very good thing, and it allowed his ideas to be examined, poked, prodded at, dissected and expanded upon, growing an even more correct memeplex and enabling a lot of people to think more clearly about a lot of things. On balance, this was probably a good outcome; in my opinion, he could hardly have asked for better.

The point is: Eliezer’s behavior in writing the Sequences is what it actually looks like when someone really is a lot wiser than you are, and is handling that wisdom in a responsible way. As beautiful and romantic as the idea of a Great Zen Teacher who only accepts the Best Students to learn his Mysterious Oriental Secrets may be, that is simply not how someone with a real insight should behave in a community of smart people. Note that in modern academia, which for all its flaws does produce a lot of useful truth, Pythagorean Brotherhoods have become rather passé. The prevailing attitude there is the same one I think the aspiring rationalist community should take: If you figured something out, tell us about it! There is no need to construct a maze of tunnels barring us from entry. Not only is it antisocial to disengage from the exchange of ideas in this way, but there is a greater danger: Not everything that seems like a fantastic insight is really worth building a temple to. In fact, almost all insights turn out to be really sucky when viewed outside one’s own head, and those that aren’t are probably still deeply flawed. To recognize these deficits, it is important to engage with third parties who have no particular investment in your ideas, and who can judge them fairly on their merits. Needless to say, the eager acolyte who has just been inducted into your Secret Discussion Room will not point out that your entire worldview is flawed. And even if they could realize this… would you really listen?

It is now time for the game to be revealed. I am thinking specifically in this instance of the Neoreactionaries, who have all kinds of interesting ideas which are, as far as I can tell, nearly impossible to decipher from their sprawling complex of Continental-style writings and bizarre obfuscatory memes. The neoreactionary worldview, despite supposedly offering important insights, is very reticent about what those insights are or how one might glean them. Nearly every description of outsider engagement with that community that I’ve read involves such hilarious escapades as being instructed to read what even those sympathetic will agree are spittle-flying hate-rag blogs, but in such a way as to pierce past all the many levels of irony and misdirection therein. To be frank, and this is no reflection on the actual ideas disguised thereby (of which I know little), I doubt that what lies beneath would really be worth it. If your ideas are so good, neoreactionaries, then why hide them in a black temple in a nuclear bunker on the moon? Why not just lay out your ideas and let us evaluate them directly? It can be fun to be the only one who’s right, but no one is the only one who’s right, because nobody is completely right. Finding out what the problems are with your ideas by discussing them with outsiders is honestly a pretty great strategy. That’s what the neoreactionaries did for me, when I was still in the grip of a rather noxious strain of Social Justice. It’s only fair that they should have access to the same service.

Again, this is not an indictment of neoreactionary ideas; it cannot be, because nobody but the Elder Council themselves seems to have the foggiest notion what the “actual” neoreactionary beliefs even are. Scott did an admirable job trying to fix this with his Nutshell post, thanks to Mike Anissimov graciously agreeing to actually explain some things. I will not even dignify with a rebuttal the idea that Moldbug is a remedy for a lack of clarity and high barriers to entry. Ahem.

Really, honestly, I would love to have a straightforward primer and some honest engagement from the neoreactionary sphere, free of bizarre status games in which a postrationalist deigns to explain their Ancient and Mystic ways to a mere idiot Yudkowsky-bot. We are all aspiring rationalists here, and I should hope we would be able to have an honest and friendly mutual search for truth, even if the rest of us have not joined your blood brotherhood sworn to specific political beliefs.

For now, this is my advice to anyone who is being beckoned into a Dark Enlightenment treehouse:

If in your working hours you make the work your end, you will presently find yourself all unawares inside the only circle in your profession that really matters. You will be one of the sound craftsmen, and other sound craftsmen will know it. This group of craftsmen will by no means coincide with the Inner Ring or the Important People or the People in the Know. It will not shape that professional policy or work up that professional influence which fights for the profession as a whole against the public: nor will it lead to those periodic scandals and crises which the Inner Ring produces. But it will do those things which that profession exists to do and will in the long run be responsible for all the respect which that profession in fact enjoys and which the speeches and advertisements cannot maintain.
— C.S. Lewis, The Inner Ring
