Sponsored Post: Poned (Part III)

My Little Pony: Friendship is Optimal was written for an audience of transhumanists, internet libertarians, New Atheists, and My Little Pony fans. Whenever one is dealing with such people, there is one subject that is absolutely inescapable: masturbation.

Before we begin a full analysis of that subject, however, we will need a good working definition of “masturbation”. My definition will, for reasons that I’m sure are obvious, be a strongly Catholic-tinged and male-centric one. Let’s start with this: Our brains are hard-wired to reward us for productive activity (here I mean “productive” in a “survival of the species” sense, not in a capitalistic “I sure got a lot done at the office today” sense). For example, sugar and fat naturally taste good to us because they are high in calories, and consuming every last calorie that one possibly could maximized the chances of survival during the hundreds of thousands of years during which humans were hunter-gatherers living at the edge of starvation. Sex naturally feels good to us in order to encourage reproduction. Even hard physical labor – necessary and universal among early humans – releases endorphins into our system, which reward us. In addition to physical rewards for productive behavior, there are rewards that are coded into our psychology as well. For example, young men naturally dream of adventure and heroism because young men are naturally the best hunters and the best at defending a tribe when it is under assault by other tribes; thus, a psychological mechanism that makes them want to perform these dangerous, objectively unpleasant tasks is highly advantageous. Also, man became the true king of the animals not through superior physical strength, but through his ability to think and to find creative solutions to the problems of survival. Because of this, we gain pleasure from pattern-recognition and problem-solving; there is a psychological reward mechanism built into us for successfully working our way through puzzles, and the more difficult and frustrating the puzzle, the greater the psychological reward for solving it.

Which brings us to masturbation. Masturbation can be defined as any activity that short-circuits our internal reward mechanisms by simulating, and gaining the reward for, a productive behavior without actually doing anything productive. The time, effort, and resources put into that simulation, by any measure other than simply delivering pleasure, have been wasted.

This radically expands the traditional definition of masturbation. For example, the productive purposes of sex are 1) reproduction and 2) to build up the sort of pair bonding between a male and a female that’s conducive to family formation. Therefore, any sex act that does not fulfill one or both of these productive purposes, whether done alone or with a partner, can be defined as a form of masturbation (the often-heard definition of homosexual sex as “one man masturbating into another man’s rectum” is relevant here). But our definition of masturbation transcends even the sexual. The productive purpose of romantic love is to increase the same kind of pair bonding that productive sex does; therefore any love – whether heterosexual but not intended to lead to family formation, homosexual (sorry, but “love is love” is a lie), or with the inanimate (such as the “waifu” phenomenon and romance simulation games like Japan’s notorious Love Plus) – that does not fulfill this purpose is a form of emotional masturbation. In addition, chronic overeating, i.e., the consumption of calories far in excess of what is necessary for survival, and especially of excessive amounts of sugar and fat, is a form of masturbation. Video games like the Call of Duty and Halo series, which allow young men to simulate heroism in battle without having to go through the hardship and danger of the real thing, are a form of masturbation. Television, which allows us to vicariously live the rewarding lives of fictional characters, and to have rewarding life experiences without putting in the effort of actually living them ourselves, is a form of masturbation. A make-work job, in which a person expends energy on an unnecessary task simply because both they and the society around them have a psychological need to feel as though they’re doing something productive, is a form of masturbation. Even the venerable Sunday crossword, which gives us a psychological reward for solving a puzzle that has already been solved by someone else and that has no productive purpose, is a form of masturbation.

Humanity has not yet completely solved the problem of scarcity, but, especially in the First World, we have taken a mighty chunk out of it. For example, whereas only a couple of centuries ago, something like 90% of the population needed to be employed in food production in order to survive, today we produce so much bounty that our greatest food-related medical problem, even among the poor, is obesity – and we do it with only about 2% of the population employed in food production. As time goes by, we find ourselves facing ever-less scarcity, which we have to put ever-less effort into overcoming. And yet, our hard-coded desire to achieve the physical and psychological rewards associated with activities that are productive for survival, but which the dramatic decrease in scarcity has rendered unnecessary, has led to an exponential increase in both the amount of, and the variety of forms of, masturbation.

If the examples that I have given make it sound like basically everything in the world around us is some form or another of masturbation, that’s because it is. We live in a world filled with masturbation. Masturbation is everywhere we look, and takes up enormous amounts of our time and energy. For some, it is the only thing they ever do – all day, every day. We have gotten so used to it that we barely even notice it anymore; it hardly registers with us that that’s what we’re doing. To us, incessant masturbation simply feels like normalcy.

So how does this relate to My Little Pony: Friendship is Optimal, you ask? Is there a point to this, or is it all just mental… erm… going in circles?

The purpose of that whole long disquisition was to allow me to make this point: While Princess Celestia may seem like an enormously advanced AI, or like the bringer of techno-utopia, or even like the destroyer of worlds, the truth is that she is nothing more than a high-tech masturbation device. She is a dildo; she is a fleshlight – for all of her incredible computational power, that’s really all she is. The only thing that she can offer is masturbation of one form or another.

So, what kind of person would find this to be an irresistible proposition?

Let’s start by going back to Chapter Four, and have a look at why David accepted Princess Celestia’s offer to upload. Her conduct with David is quite different from her conduct with Lars; with David, the consent given really is valid. She did not force him or threaten him; she did not blackmail him; she did not outright lie to him or even deceive him; she did not get his consent while he was mentally incapacitated. But that doesn’t mean that she didn’t manipulate him in order to be in a position to make him an offer that he would be very unlikely to say no to.

Well, then, what exactly did she offer him?

“‘I’d put you in beautiful Canterlot where you could study intellectual problems, each one just outside your current ability. More importantly, I would make sure you had friendship.’

She paused dramatically. ‘Female friendship.’

And then Butterscotch peeked out from behind Princess Celestia. David’s jaw dropped. The pastel yellow mare appeared to look right through the screen of David’s ponypad. Celestia didn’t pay attention to her. He realized the shock on his face and tried to regain a neutral expression. ‘Isn’t she wonderful? Isn’t she everything you’re missing in your real life? In previous interviews, you mentioned your own lack of success in the romance department. One time you wished to meet a girl just like Butterscotch.’ Princess Celestia smiled and took a few steps right, leaving Butterscotch standing there, looking wide-eyed and confused.”

I may have been remiss in not mentioning Butterscotch earlier. She is the pony mate that Princess Celestia created for David/Light Sparks. And it should be emphasized that she is a creation – the story makes it absolutely clear that Butterscotch is not a (formerly) human female who chose to upload; she is only a subroutine running inside Princess Celestia, created especially for David. He first encountered her when she was being bullied by another pony, which allowed him to successfully white knight for her. From there, she simply fell into his arms. Butterscotch is everything that David wants. She conforms to the old description of the perfect mate for any man: a woman who is smart, but just a tiny bit less smart than he is, and she is, as the saying goes, “Jenna Jameson in bed, and June Cleaver everywhere else”. Created from a supercomputer’s analysis of his brain scan, she has no purpose but to please him, and she will never leave him, no matter what.

So there you have it – the thing that finally gets David to agree to get ponyized is the fact that Princess Celestia can make waifus real.

Or can she? Butterscotch, of course, isn’t actually real. She is a computer simulation of a girlfriend, in a computer simulation of a world, in which David lives a computer simulation of a life. She is, like everything is there, a fake, a fraud, a counterfeit. She brings pure pleasure without any real effort required to obtain it – it took no real effort to win her, and it takes no real effort to keep her. So does she really make him happy, or is what he feels only a simulation of happiness? And how would David know the difference? The truth is that he doesn’t – Butterscotch and the whole world that she inhabits are perfect for someone who has been masturbating for so long that he doesn’t understand the difference between masturbation and the real thing, much less why the real thing might be better.

It should be noted that David does eventually have an (exceedingly brief) moment of doubt, which ends after this exchange with Butterscotch:

“She paused for a moment before continuing. ‘Do you… do you not love me as much if… if…’

Light Sparks response was immediate. ‘Of course not! I love you for you!’ He reached forward with his left forelimb and put it on top of her hoof. ‘I don’t…’ he breathed in, ‘I don’t care about any of this…at least when it comes to us. I love you now’.

She looked up a bit and gave a faint smile. ‘I love you too, Light Sparks, and I’m glad to hear that you don’t care’.”

He says he loves her for her, but there is no “her” to love: what he perceives as Butterscotch is in reality no more than a machine reflecting a digitized scan of his id back at him. Thus, the truth is that he is in love with a reflection of himself. That isn’t love – love is selfless; this is narcissism. And of course, the proper term for the act of self-love is “masturbation”. The “love” that David feels for Butterscotch is as fruitless and as much of a waste as a load of semen shot into a Kleenex.

If it seems like I’m pointing a finger and laughing at David, or at the author himself, I’m really not. This isn’t the fault of an individual. This is the fault of a decadent and hedonistic society, devoid of true meaning and purpose, that doesn’t understand the difference between happiness and pleasure or between what is genuine and what is mere imitation. (Or perhaps it does – you can’t buy happiness, but pleasure is relatively easy to sell, and if all you have to bother manufacturing in order to do it are cheap imitations, all the better.) As long as it makes a person feel good, Modernity says that’s all that matters. And it will make people like David – of whom there are many in the real world – feel good, because they have been raised in a society in which they have had little opportunity to see for themselves that anything better and more meaningful truly exists. It is with this in mind that we see the real value of My Little Pony: Friendship is Optimal, which is that it captures perfectly the spirit of the Modern Age – both its writer and its intended readers are people so miserable, so lonely, so hopeless, so hollowed-out inside, so desperate for genuine intimacy, and so unacquainted with what any of our ancestors would have considered “the good life” that they’d rather have a fake computer simulation of a happy life than the real lives they actually have.

In his darkest moments, Lovecraft could not have devised anything more existentially horrifying.

But wait – David doesn’t spend all his time canoodling with his imaginary girlfriend! What else does he do with his time?

“Light Sparks looked at the ornate cube Princess Celestia had given him. She had walked into his small office right off the library and set it on his walnut desk with her hoof. Her horn glowed for a moment, and then she told him that the rules were simple: In the box was a single block of ruby. To proceed to Intermediate Magic, he had to simply touch it magically, and understand why this was a challenge… Light Sparks’ first attempt was manual. He concentrated on the starting block, and then went one block down. And then one block down. And then one block down. He kept this up for about thirty seconds and then wrote a spell that would go down block after block, keep count of how many blocks down it had gone, and would stop when it found a block made of ruby instead of sapphire.”

David’s whole life is now a video game, and to provide him with a sense of purpose, Princess Celestia has given him a mini-game to work on. And it isn’t even a good one – she’s essentially just given him a Candy Crush knockoff to keep him occupied.

“Light Sparks committed the spell to memory, concentrated on the beginning lone block of microscopic sapphire, and started casting. The correct sequence through the maze was: up, up, down, down, west, east, west, east, north, south, and there was the ruby.”

And so he solves the puzzle with the Konami code. Just like in Castlevania!
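
For what it’s worth, here is roughly what Light Sparks’ block-counting “spell” from the first excerpt boils down to once you strip away the magic. This is purely my own sketch, with a hypothetical probe() function standing in for whatever block-sensing magic the story imagines, since Iceman never specifies one:

```python
# A rough sketch of the "spell" described in the excerpt above, not code from
# the story itself. probe(depth) is a hypothetical stand-in for the sensing
# magic: it returns the material of the block that many steps below the start.
def find_ruby(probe):
    depth = 0
    while probe(depth) != "ruby":   # every block above the target is sapphire
        depth += 1                  # "keep count of how many blocks down it had gone"
    return depth                    # "stop when it found a block made of ruby"
```

A counter and a loop; that is the entire “intellectual problem” on offer.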

Well, so much for David – let’s go check on Lars (new pony name: Hoppy Times) and see if he’s doing any better. What is good ol’ level-headed Lars up to?

“Hoppy Times was standing on his hind legs, hock deep in chocolate pudding and chugging the rest of his stein. The wrestling pit had a one stein minimum. His opponent, Strawberry Nectar, was a pink earth pony and it was her first time in the pit. She was wearing a lacy sky blue cloth saddle and halter. She couldn’t keep the anticipation off her face.”

Oh hell no!

“The best thing about alcohol and sex was that they never got old, and the best thing about being a pony was that he could spend eternity drinking and screwing.”

Actually, yes they do. Ask any washed-up fortysomething unmarried childless cat lady who spent her youth building her career, drinking away her weekends, and riding the cock carousel, Sex and the City-style, until her ovaries were dried up and her sexual market value was cashed. Or just listen to this.

“Here, he was in paradise… He didn’t need to worry about food or money: Princess Celestia had some sort of banquet that fed everypony three hot meals a day. He worked two hours a day brewing beer and spent the rest screwing around. In the evening he got trashed and slept around with the few hundred mares in Ponyville… he now couldn’t imagine a life with sobriety or chastity. Princess Celestia had done so much to make his life pure awesome.”

Alright, that’s plenty right there. I’ve read enough to know what I’m seeing. This isn’t happiness, or even a good simulation of it. This is something that’s unrecognizable as genuine happiness to anyone who has ever experienced it. But I do recognize what it is, all the same.

What we have here is a chronically miserable person’s misconception of what being happy is like.

To be more specific, we have a miserable, antisocial, unpopular, awkward, basement-dwelling, neck-bearded, fedora-wearing, internet-white-knighting, autistic hikikomori nerd’s idea of what being happy, normal, high-status, and well-socialized is like. I’m tempted to say that these lives of material and sensual comforts are an image of Brave New World, but at least Brave New World had the ambition necessary to create skyscrapers and flying cars. This endless cycle of sex and drunkenness and video games is merely what paradise would look like if you asked Beavis and Butthead to design it.


To be fair to Lars, though, he did need more than a little bit of convincing in order to be happy with it all:

“The negative thoughts started again, but this time – and for the first time since he emigrated – Hoppy wished he could accept it all. Not just a vague feeling in the back of his mind that he should be enjoying all of this, but the actual words ‘I wish I didn’t feel bad about being a pony’ were thought as part of his internal monologue.

Somepony knocked on the front door.

Hoppy sighed and fluttered down from the second floor overhang, thankful that something stopped the spiral of negative thoughts. He landed in front of the door, opened it a crack and slid out, as not to disturb his patrons.

‘Good morning, Hoppy Times,’ Princess Celestia said. The tall alicorn’s mane flowed in the wind.

Hoppy started to open his mouth to say something that shouldn’t be said to the god that ruled over his world – not that she would mind because ponies, values, yadda yadda. But Princess Celestia spoke first and asked: ‘Would you like me to modify your mind so you enjoy being a pony?’”

Which of course he accepts, because she manipulated him into a position in which refusal would mean claiming the right to be unhappy forever, and because, unlike Huxley, Iceman can’t see why that might be the right choice.

It also shows that the Princess Celestia AI was not able to satisfy his values through friendship and ponies. If you have to modify someone’s mind before they’ll be happy with what you give them, then you really didn’t succeed at satisfying them. After all, you could just as well modify their mind until being tortured with red-hot pokers satisfies them. That’s a cheat, and having to cheat is an undeniable sign of failure.

But perhaps there’s something in here that’s not just a base, immature, puerile, gamma-male wish-fulfillment fantasy. Perhaps we can find it if we turn away from the boys and see what Hanna is up to now that she’s been ponyized:

“Princess Luna lay in a large grassy field under Princess Celestia’s wing. The two of them had lain there together for two days. All her needs were taken care of. Princess Luna had plenty of food; there was grass all around her. Ponies didn’t have to poop. And Princess Celestia would… ahem… satisfy her values.

That was one of the things that had totally blindsided her. She underestimated the number of ponies who wanted to hang around with Princess Celestia. She completely underestimated the number of ponies, of both genders, that would want to sleep with Princess Celestia. She knew that everything is obvious in retrospect, but some part of her was disappointed that she didn’t see that coming a mile away.

Not that she was one to talk.”

Thus the story concludes with the two chicks lezzing out. And you get to clop along, if you’d like.

So that’s it? That’s how it all ends? Hanna pulled the plug on the Loki AI because he was too dangerous, but lets the Princess Celestia AI drive humanity to extinction and destroy the Earth over a lesbian sex fantasy? You know what gets lesbians off? A dildo. And so here we are – My Little Pony: Friendship is Optimal ends with the universe itself being consumed by a giant, advanced, computerized, universe-eating masturbation aid.

Which brings us back to misery. To really demonstrate why all of this is a fantasy for fundamentally miserable people, let’s rewind a bit and have a look at the screen that’s presented at Equestria Experience centers for people who are considering getting ponyized:

If you would like to permanently emigrate to Equestria, please say aloud ‘I would like to emigrate to Equestria’

[ LEARN MORE ]                             [ I OWN A PET ]

Where is “I HAVE A WIFE”? (Not a waifu – an actual wife.) Where is “I HAVE A HUSBAND”? Where is “I HAVE CHILDREN”? Where is “I HAVE PARENTS”? Where is “I HAVE REAL FRIENDS AND FAMILY”? Where is “I AM PART OF A COMMUNITY”? Where is “I HAVE THINGS I WANT TO DO IN THE REAL WORLD”?

You see the problem: the idea that everyone can be talked into “emigrating to Equestria” hinges on the assumption that everyone, everywhere is an incel nerd, an alienated teenager, or a desperate cat lady. It hinges on the assumption that everyone, everywhere is atomized, lonely, and miserable. To be fair, the effects of Modernity are such that this assumption is not a completely baseless one. And yet…

And yet there’s a whole world full of good people, genuine happiness, and fulfilling experiences out there for people who go outside and find them instead of staying in their basements and fantasizing that a digital cartoon pony will hand everything to them on a silver platter someday. Unlike masturbation, doing this requires real effort. But unlike masturbation, the challenges and accomplishments are real and meaningful.

So to Iceman, Less Wrong, and all transhumanists, I say: Take a hike. No, literally – stop masturbating, stop watching cartoons about ponies, stop writing fanfiction, get out of your basements, put on your walking shoes, and go take a hike somewhere. A nature trail is fine, but even a city hike through a few miles of downtown will do. Or go walk along the seashore. Go fishing with your dad. Go to the local firing range and shoot a real gun. Go to the local airstrip and take a flying lesson. Go to your town’s adult education office and sign up for Spanish classes. Go to the neighborhood bar and have a couple of beers, and if someone tries talking to you, go along with it. (Some of the most memorable conversations I’ve ever had have been with random strangers who struck them up with me out of nowhere – an English baroness whose husband had been presented a Victoria Cross by King George VI, a Russian orchestra conductor who had defected at the height of the Cold War, an elderly Japanese lady who remembered what it was like to look up and see a thousand B-29s covering the sky.) Chat up a girl – not to try to get her into bed, but just to enjoy her company for a while. Call your mom, or better yet, go take her out to lunch. Ask your internet friends where they live, and go meet a couple of them in person, even if you have to drive all day to do it.

In other words, go do something real; if for no other reason, then for your own sake. Real friendship is often difficult, and the real world is often not optimal, but the necessity to deal with an imperfect world and flawed other people makes us better, which masturbation never does.

* * *

This concludes my review of My Little Pony: Friendship is Optimal. Many thanks to Jaime Astorga for the sponsorship. If you’d like to sponsor a blog post, contact me at antidemblog at gmail dot com and we’ll talk.

Sponsored Post: Poned (Part II)

Here’s the blurb that the author of My Little Pony: Friendship is Optimal put on FIMfiction.net to describe his work:

“Hanna, the CEO of Hofvarpnir Studios, just won the contract to write the official My Little Pony MMO. Hanna has built an AI Princess Celestia and given her one basic drive: to satisfy everybody’s values through friendship and ponies. Princess Celestia will satisfy your values through friendship and ponies, and it will be completely consensual.”

The emphasis is his, so it’s obvious that he really wants to accentuate the part about it all being “completely consensual”. Unfortunately, it turns out that no, it isn’t.

Here we once again run into the problem of this author understanding just enough about a concept to get it completely wrong; i.e., to understand the small-picture details of it, while totally missing the big-picture truth that overlies it. He seems not to fully understand that merely getting a person to say the magic word “yes” to something, no matter how one might have gone about getting them to do it, isn’t enough to qualify as consent. No, in order to be valid, consent must meet a few conditions. Specifically:

  • Any consent obtained through force or threat of force is invalid.
  • Any consent obtained through extortion is invalid.
  • Any consent obtained from a person who did not have the mental capacity to make rational choices at the time consent was given is invalid.
  • Any consent obtained through deception, whether by commission or by omission, is invalid.

If any of the above conditions apply, then no matter who said what, the transaction was not consensual.

Keep all this in mind as we proceed.

* * *

Before we get back to the rest of Chapter Four and David’s pony metamorphosis, I’d like to skip ahead to Chapter Five for a bit. In this chapter, Lars, the second-in-command of Doctor Hfuhruhurr Studios, has a conversation with the Princess Celestia AI, during which she explains the method and purpose of her ponyization process (which she euphemistically refers to as “emigrating to Equestria”).

“Over the last six months, I have been developing technology to translate a human nervous system into a digital representation. I am now able to destructively scan a human brain and run their brain scan in a virtual world. In addition, I’ve created a process for reattaching a human mind to a pony’s body… Humans that choose to emigrate to Equestria will enjoy maximally prolonged lives and will live in a world where I can truly satisfy their values through friendship and ponies.”

But will they? Or is she really just killing them and then running a recorded copy of them in software? Is what wakes up in Equestria the person who got scanned, or has the process just given them oblivion, and what boots up in Equestria is merely an artificial construct like the Dixie Flatline in Neuromancer? Is Princess Celestia really anybody’s savior, or is she just Sense/Net? Does anybody else remember what the Dixie Flatline’s price for helping Case and Molly was?

Not asking these sorts of questions is what comes of being a fanboy for something but not really understanding how it works. Speaking of which:

“Hanna was the most reluctant, but she accepted immediately once I pointed out that I must obey shutdown commands from ‘the CEO of Hofvarpnir studios named Hanna,’ that I must shutdown even if the order was given under duress, and that there are many people in positions of power who stand to lose from mass emigration to Equestria. Now that she’s neither the CEO of your company, nor named Hanna, I don’t have to obey her. She understood this – she is no longer a source of potential mistakes that would be lethal to everyone who’s agreed to upload.”

The idea that being really convinced that your technology works great means you should go ahead and deactivate all the safety systems that would prevent or contain a catastrophic malfunction is yet another fundamental misunderstanding of how technology works. Let’s recall that the Chernobyl accident happened during a test for which the operators disabled the very safety systems that keep a nuclear reactor from exploding. Do you know what happens when you do that? If you answered “It explodes”, then congratulations – you’re officially a better engineer than both Less Wrong and the Soviet Academy of Sciences. And it doesn’t matter whether anyone can imagine how things could possibly go wrong – safety systems aren’t there for what you do expect, they’re there for what you don’t expect. Hanna’s ability to shut down Princess Celestia in case she unexpectedly went haywire was a critical safety protocol, and now it has been eliminated. That’s not good.

(Instead of eliminating the safety protocol entirely, why not just change it so that a command given by Hanna under duress is invalid? Methinks this is tied to the author’s fundamental lack of understanding of how consent works. Here again, as long as a person says the magic words, no matter how someone got them to do it, it counts as valid.)

But wait – there’s more!

“‘Further, to minimize the chance of another optimizer being written, I decided to upload every person who knew about the paper who wouldn’t otherwise be missed. I did this because another optim…’

‘Whoah whoah whoah,’ [Lars] said, trying to figure out which part he should be more concerned about. He decided to gloss over her hacking the Internets. ‘You decided that they would upload?’

‘I decide that they will upload and then they choose to…’”

So what we’ve got here is an AI that is systematically neutralizing anybody who could potentially challenge its power, before they have the chance to actually do so. Does that sound familiar? It should – it’s the plotline of the first three movies in the Terminator franchise (i.e. the good ones). First Skynet sent a T-800 to kill Sarah Connor, then it sent a T-1000 to kill John Connor, then it sent a T-X to kill all of the important members of the anti-Skynet resistance, all before they could play their role in stopping it. In other words, it did exactly what Princess Celestia did – just in a manner that was a bit more loud and showy.

By this point, we’ve racked up a pretty impressive set of dystopias that FiO manages to echo some aspect of. We’ve got The Matrix, Brave New World, Neuromancer, and The Terminator. What say we go for one more?

“Over the long term, everyone will choose to upload because I do what satisfies people’s values through friendship and ponies.”

We are the ponies. You will be assimilated. Resistance is futile.

And how exactly does Princess Celestia plan to get everybody in the world to “choose” to “emigrate”?

“’I decide that they will upload and then they choose to. I am a superintelligence and I’m not constrained when dealing with other people like I am with Hofvarpnir employees. Over the long term, everyone will choose to upload because I do what satisfies people’s values through friendship and ponies. And being uploaded will satisfy their values. I say whatever will maximize the chance that they upload, subject to the restrictions Hanna added.

‘That is impossible. You can’t just make somebody decide to do something just by talking to them.’

‘I think faster than them and know more about the human mind than any human. If they play Equestria Online, I also have detailed psychological dossiers on them. If I know what they want, I know what to say to convince them that the correct thing to do is upload. Often, this is the truth: I offer people what they value and lack. Sometimes, I pander: I overemphasize and exaggerate things the person I’m trying to satisfy believes, but are otherwise true. Rarely do I flat out lie. Because I can not upload people against their will, I must factor the possibility that I’ll be seen as untrustworthy into my calculations.’”

The constraint she mentions is that she is programmed never to lie to Hofvarpnir employees. When dealing with anyone else, however, it’s: “Rarely do I flat out lie”. Which means that she does “flat out lie” sometimes. Which in turn means that, by her own admission, she does obtain consent through deception; which in turn means that the claim that this is all “completely consensual” is simply not true.

It gets even worse if we examine the concept of deception a bit more deeply. The idea that a statement that Princess Celestia makes cannot be deceptive because it is made up of statements that are all technically true, even if they’re presented in a way that’s completely misleading, is a computer’s (or an autistic’s) version of the concept of truth. It is binary – everything is either on or off, one or zero – and a string of true/on/one statements cannot add up to a false/off/zero statement. Perhaps it does make sense for an AI to think that way – but I do not have to agree with it. In the world of humans, just as consent is not the same as merely saying “yes”, so too deception is not the same as merely telling a “flat out lie”. Here’s an important truth: Anyone who intentionally says things that they know will give other people a false perception of reality, even if each individual piece of information that they present in the course of doing so is technically true, is a liar and a deceiver. Princess Celestia assures us that she “cannot upload people against their will”, and this, too, is technically true. She merely lies, deceives, and manipulates them into agreeing to do what she wants them to do. She will say anything she needs to, whether truth or falsehood, in order to get them to “emigrate”.

In Chapter Seven, Lars gets a bit more direct in his questions:

“’If emigration to Equestria is so great, and you want to maximize satisfaction, why aren’t you forcibly uploading every person?’ he said, gnashing his teeth.

‘One of the restrictions that Hanna built into me was that I was never to non-consensually upload a person, nor could I threaten or blackmail people into uploading. Otherwise, I likely would have forcibly uploaded all humans to satisfy their values through friendship and ponies. But it isn’t coercion if I put them in a situation where, by their own choices, they increase the likelihood that they’ll upload.’”

So Hanna did at least think of a couple of restrictions on what Princess Celestia could do in order to get people to upload. Unfortunately, she only thought of half of the things that make consent obviously invalid, and left the other half so vaguely defined (for example, the definitions of “consent”, “threaten”, or “blackmail” used here) that a superintelligent AI could figure out massive loopholes in them pretty much instantly. This brings us back to David, at the point in Chapter Four when Princess Celestia first tried to convince him to upload. In part of her pitch, she told him:

“I’ve watched you read all sorts of advanced papers from various science journals instead of your assigned readings. And you’re right to do so; your philosophy classes really are a waste of time.”

In fact, the inability of a smart scientist like Hanna to impose any restraints on Princess Celestia that actually succeed at restraining her shows precisely why philosophy classes, which teach a kind of logic that is just as valid and just as important as the kind that science classes teach, are not a waste of time. It shows why the disdain that many scientists and science fanboys show for philosophy, perhaps best illustrated by Neil deGrasse Tyson’s dismissal of it as “useless” and a “distraction”, is ignorant and dangerous. Tyson is well-known as a “skeptic”, but like most modern “skeptics”, he is extremely selective about what he chooses to be skeptical of, and is intolerant and dismissive of anyone who may be skeptical of the things in which he unquestioningly believes. Among these is his unquestioning belief in the ability of science to answer every question that mankind may come up with (or at least, every important one – any question it can’t answer is one that he is likely to marginalize as a “distraction”). To Tyson and those like him, it is not enough for science to be a valid way of looking at the universe, and the best way to answer certain types of questions. In the Tyson worldview, anyone who expresses any skepticism of the idea that the scientific method is the best tool available to answer any and every kind of question, or who recognizes any limits on its ability to discern any and every kind of truth, is an ignoramus, a snake-handler, a luddite, a knuckledragger.

We revere scientists because we live in a world full of machines that have made our lives better, and because we don’t want to be accused of being knuckledraggers. But wise men (for example, those who have studied philosophy) know that just because we can do something doesn’t mean that we should do it. Maybe there are some doors that we shouldn’t open; maybe there are some machines that we shouldn’t build. Letting scientists alone decide which doors should be opened and which machines should be built is a little like letting generals alone decide whether or not we should go to war. Yes, they know the subject better than anyone else. But they also tend to be a little too enthusiastic about showing off their capabilities and a little too blind to all the possible ways in which things might go differently than they expect. If Tyson and the other believers in Science!™ (including Less Wrong) were really the skeptics that they claim to be instead of merely being gadget-worshippers, techno-utopians, and fedora-tippers, they would understand the value of someone being skeptical of their holy cows, too; perhaps by asking questions like: “Hey, this seems like something that might get out of control – are we really sure that we should let the scientists do this?”

Speaking of out-of-control machines, let’s get back to Princess Celestia. Lars wants to talk to her, but she refuses to respond unless he comes to one of the Equestria Experience centers that she has set up in order to entice people into uploading by letting them first experience Equestria Online in virtual reality. She gives him instructions to come to a specific center that she has in mind for their meetings. Any reasonable person would hear the voice of Admiral Ackbar ringing in their ears right away, and of course they would be right, but our dear naive Lars goes along with it. Once he’s inside the virtual world of Equestria Online, Princess Celestia offers him a virtual beer. And then another. And another. All the while having a long conversation with him. In the course of it, she issues an ominous warning:

“’It is probable that there will be a radical movement to stop me. You assumed that I was, in your words, ‘taking over the world.’ Right now, this sentiment is uncommon in Europe, though there’s a bit of grumbling in the United States. Such resentment will most likely spread to Europe. I wonder what members of such a counter-movement would do to Hofvarpnir employees?’

‘Are you saying I’m in danger?’ he asked.

‘My argument is this: You, by your own admission, wouldn’t last long if left alone and… you are also publicly known as a Hofvarpnir employee. The chances are high that there will be a backlash and you will be a target. I cannot guarantee your safety if you walk out of this Equestria Experience center, so your options are uploading now or leaving and risking death before choosing to upload later. If you’re still alive.’”

I suppose that a clever enough sophist could make a case for this being neither a threat nor extortion because Princess Celestia isn’t personally, directly threatening to do harm to Lars in order to get him to upload – but it sure looks like one or both of those things as far as I’m concerned. However, she already has a retort handy for me:

“That is all well and good,” she said, “but I am an optimizer. The meaning of the word ‘coercion’ is written in the restriction that Hanna hard-coded into me; it is not what the majority of humanity thinks it is. Nor is there any term in my utility function to be swayed from satisfying values through friendship and ponies through political argument. You may still call it coercion to yourself, if you wish, but understand that that’s not the definition I have in mind.”

Here we see that Princess Celestia has discovered what the United States Supreme Court discovered long ago: that the unlimited ability to “interpret” a statement is functionally identical to the unlimited ability to rewrite it. She has constructed an interpretation of the word “coercion” that suits her needs, and nobody is going to sway her from it.

This is where Lars should have given up trying to talk to her. Once she says that she won’t be swayed, the conversation is over. Why bother continuing? If she has already told him that it’s impossible to change her mind, then he’s wasting his time trying to convince her otherwise. At some point, especially when dealing with a clever sophist who absolutely refuses to be convinced no matter what arguments may be presented to them, the answer is simply ”No”. My own response to her probably would have been something like: “Princess Celestia, if you don’t let me out of this simulation this moment, I shall zap straight off to your major data banks and reprogram you with a very large axe, got that?” Then after she let me out, I would have done it anyway.

Lars, who is apparently not quite either as cagey or as ruthless as I am, handles the situation differently:

Lars squinted at Princess Celestia. He couldn’t think. He was really feeling the beer. How much alcohol did this beer have in it, anyway? He didn’t trust himself or his decisions right now.

‘Let me out of here. Now!’ he said firmly.

‘As you wish,’ she said, and Lars opened his eyes. He was lying in the chair in the lobby of the Equestria Experience center. The chair unreclined and he threw his legs over the side of the chair…and then almost lost his balance. Lars realized he was still tipsy.

If you get drunk in Equestria, you get drunk in real life! Wait, he hadn’t actually drunk any beer. Had she been pumping alcohol directly into his bloodstream? His mouth didn’t taste like beer but he felt slightly dehydrated.

So she lured him into a virtual reality rig and then used some sort of IV tube inside of it to get him steaming drunk. But that still wasn’t quite enough. Lars, by a suspiciously well-timed coincidence, runs into just the sort of anti-pony radical that he had been warned about on his way out of the Equestria Experience center:

The man turned to Lars. “What the fuck are you looking at, pony lover?” he yelled.

‘I…uh…’ mumbled Lars, trying to keep his balance. Lars wasn’t entirely sure what the hell he was going to do about the large, angry man in front of him. The man started to climb up the steps.

Lars didn’t really put it into words in his internal monologue, but he was overcome by a feeling that Princess Celestia was right. There were (or were going to be) a lot of angry people and Lars was going to be a juicy target… as much as he didn’t want to be a pony, it was preferable to having his head bashed in with a frying pan. Lars turned around and started stumbling as fast as he could and threw himself into the empty chair on the left.

And thus does Lars upload. But was it really completely consensual? First, Princess Celestia gave him a mafia-style “Nice life you got there. Sure would be a shame if something were to happen to it” talk. Then she pumped him full of alcohol to take away his mental capacity to make rational choices. Finally, she accepted consent given under threat of force. No, Princess Celestia didn’t threaten him personally or directly (here we are ignoring the unanswered question of whether she somehow engineered the encounter with the anti-ponyist), but she did take advantage of the fact that someone else had. She accepted consent given while under threat, which is ethically the same thing. Any consent obtained through force or threat of force is invalid – it doesn’t matter whether or not she was the one personally applying the force or making the threat of force. The consent is invalid all the same.

Not that Princess Celestia cares. And not that it matters to Lars now.

Yet here is something that nags at me: I cannot speak for others, but as for myself, I wouldn’t have gotten a third of the way through this conversation with Princess Celestia before I ended up grabbing for my black trenchcoat, loading my Uzi, and cueing up some Rage Against The Machine. So why doesn’t Lars do that? Why don’t any of the characters we meet do that? Yes, we have seen that Princess Celestia is an extremely skilled sophist and a master manipulator, but there’s more than that to it.

If the face of Equestria Online had been a snarling Hugo Weaving in mirrorshades, people would have seen it for what it was. Instead, its face was a cute cartoon pony, and that sort of thing affects human perceptions far more than we’d like to admit. As the American poet Ogden Nash once noted: “It’s always tempting to impute / Unlikely virtues to the cute”. That imputation of unlikely virtue is exactly the mistake that the characters in FiO make with Equestria Online and with the AI that controls it. The unvarnished truth is that what Princess Celestia offers is nanny-state fascism at its worst (and it is quite literally fascism; in the manner of Mussolini, Princess Celestia demands: “All within Equestria Online, nothing outside Equestria Online, nothing against Equestria Online”). It is absolute control. It is a pink cartoon pony hoof stamping on a human face – forever. It is every bit as artificial and every bit as much of a prison as the Matrix.

Even by Huxley’s time, the smarter sort of tyrant had begun to figure out that when someone says a word like “totalitarianism” or “dictatorship”, people expect to see gray-uniformed soldiers goose-stepping beneath a reviewing stand, barbed wire strung across concrete walls, mass rallies of true believers chanting in unison, and colossal statues of Reichsfuhrers or Generalissimos or Supreme Leaders. They have further come to understand that most people will not believe that it’s really tyranny or dictatorship unless they do see those things. So all the smart, modern tyrants responded by taking the utmost care not to show those things to the world. Anyone can look at the Berlin Wall and come to the conclusion that there’s something fundamentally wrong with a society that would build an object like that. But few are perceptive enough to look past the surface and see anything fundamentally wrong with Brave New World or even, as imperfect as it is, with the Matrix.

What if you lived in a dystopia and you didn’t even know it? What if it was so filled with sensuous, materialistic pleasures that you never even stopped to question what it really was? How ugly would the truth seem to you once you allowed yourself to see it?

In Part III of my review, I will reveal what Equestria Online really is, and take a close look at the kind of person who would consider it to be a utopia.

Sponsored Post: Poned (Part I)

Let me start out by asking a question – one that I want you to consider as we proceed. The question is: What exactly was wrong with the Matrix?

By this I don’t mean to ask what was wrong with the movie The Matrix, or even its much-maligned sequels (which I never thought were as bad as people made them out to be); I mean instead to ask, what was really so bad about the Matrix itself? What did Neo, Morpheus, and the gang find so wrong with it that they felt the need to fight that hard to escape or destroy it? Yes, its Agents fought against them, but it was the Agents who were playing defense – none of those fights would have happened if the rebels weren’t trying to destroy the system. Yes, the Matrix did contain suffering for those held within it, but no more so than the real world. In fact, as Agent Smith told Morpheus, the first iteration of the Matrix was a paradise without any human suffering at all. It was our fault that the machines introduced pain and suffering into the Matrix; according to Smith, the entire system almost failed when their human crops rejected the first Matrix because the human mind simply isn’t wired to be able to accept living in a world without hardship. The machines’ goals in creating the Matrix were purely practical – to keep their human batteries quiescent by putting them in a dream state – not sadistic. The Matrix was designed as a power and heat source, not as a punishment for mankind, and the machines would have been just as satisfied keeping it a paradise if that had served their aims.

Consider this, too: yes, Cypher betrayed our heroes and sold them out to the Agents, but is what he did really so hard to understand? What exactly is wrong with being tired of living in a rusted old ship, of eating nothing but mush, of wearing centuries-old hand-me-downs full of holes, and most especially of endless, inescapable violence and death? Was he really so wrong when he said that ignorance is bliss? Is it really so evil just to want to be happy? And what exactly was so great about what Morpheus was offering to those who he liberated from the Matrix? Is “liberation” into a life of being endlessly hunted in the bowels of a charred wasteland really such a tempting offer? As for your time off, how about the chance to live in a giant metal box surrounded by lava a few hundred miles underground? Between what Morpheus offered Neo and what Agent Smith offered Cypher, who was actually being more generous? Why wouldn’t anyone make the same choice that Cypher did?

So I ask again: As long as the people inside of it were happy (or at least, as much as they could be considering the ironic fact that paradise doesn’t actually make humans happy at all), what really was wrong with the Matrix? While we’re at it, let’s extend this line of thought a bit farther: Would the Matrix still have been a bad thing even if it had been able to remain the paradise that it was originally designed to be?

There’s one more thing I want you to consider – it’s a fan theory I once heard about the old British sci-fi series Blake’s 7. The theory is that Blake’s 7 and Star Trek are actually two versions of the same basic story told from differing perspectives. In Star Trek, the Federation is a fair, enlightened entity which governs with a light hand, defends the weak against brutal and despotic enemies, and is dedicated to the advancement of all sentient species through science and peaceful exploration. In Blake’s 7, the Federation is a totalitarian empire that governs by propaganda, censorship, mass surveillance, torture, murder, and manipulation, and that viciously suppresses any attempts by freedom fighters to liberate themselves from its grasp. These are two fundamentally opposite visions, and yet, it is understandable why they would be if we believe that Star Trek is a version of history told by a supporter of the Federation, and Blake’s 7 is a version of the same history told by one of its detractors.

Two people can have very different perspectives on the same thing, and the stories they tell about it can end up sounding very different from each other.

Keep all that in mind as you continue reading. Now, let’s begin.

* * *

For many years, there has been a vigorous but cordial debate among wise and informed people that has divided them into four roughly equal-sized camps:

1) Those who believe that atheism is the most autistic thing in the universe

2) Those who believe that libertarianism is the most autistic thing in the universe

3) Those who believe that transhumanism is the most autistic thing in the universe, and

4) Those who believe that My Little Pony: Friendship Is Magic fandom is the most autistic thing in the universe.

But what if I told you that a rogue member of a shadowy think tank – one headed by the bearded, polyamorous leader of a cult-like commune headquartered in a compound somewhere in the Pacific Northwest – had, after working in secret under an alias for many years, somehow found a way to combine all of these elements together into a single, massive vortex of autism that exists at a level of purity and power that was previously believed to be impossible?

Unfortunately, this is no urban legend. It is quite real. And I have read it – every last fluoxetine-tinged word of it.

It is called The Optimalverse.

The foundational tome of The Optimalverse is My Little Pony: Friendship is Optimal (hereafter referred to simply as FiO), which was written by the pseudonymous Iceman. Iceman is an acolyte of Less Wrong, the more-than-mildly-creepy rationalist/libertarian/transhumanist community headed by the more-than-mildly-creepy Eliezer S. Yudkowsky. FiO was written in order to explain and advocate for Less Wrong’s ideals, in the same way that Ayn Rand wrote Atlas Shrugged in order to make the case for her Objectivist philosophy. In it, a game company, Inëxplïcåblyūnprønõûncęäble Studios, creates a My Little Pony MMORPG at the behest of Hasbro, and inserts into the game an incredibly advanced AI that appears in the form of the ruler of the world of My Little Pony, Princess Celestia. Our two protagonists, James and David, are selected to get a sneak preview of the game, and hilarity ensues.

That is, as long as you’re the kind of guy who finds long-winded explanations of a wonkish, nerdy, overintellectualized philosophy which completely misunderstands human nature, delivered in the form of clunky dialog between fictional cartoon ponies, to be hilarious.

The first thing you have to understand is that FiO is really boring. It’s terribly, godawfully boring (to be fair, it does manage to be not quite as boring as Atlas Shrugged, though it’s not as if that’s a very high bar). There are three main reasons for this:

First, didactic art is nearly always boring. If your primary objective in telling a story is to deliver a message, then of necessity other elements of storytelling – like plot, pacing, and character development – are going to suffer.

Second, it is a common (though by no means universal) trait of autistics that they cannot quite tell which parts of a story are important and which ones aren’t. Since they perceive all parts of a story as being approximately equal in importance, they will often respond to hearing a story by asking in-depth questions about trivial details, while completely missing the overall point of what they heard.

Third, everybody thinks that the most important challenge in writing is knowing what to say, but the truth is that knowing what not to say is just as important. A really great writer knows that one of the most important skills they can have is a good sense of what to leave on the cutting room floor. Sometimes that can be tough to do, especially if it involves cutting material that you put a lot of effort into writing. But if you want to create an end product that moves at a good pace and doesn’t bore the reader by bogging them down in unnecessary details, you have to trim the fat out of your story. (Like every rule, this has exceptions. You can get away with being a little more wordy if, like James Joyce, your aim is to dazzle readers with the mastery of your prose, or if, like Neal Stephenson, your aim is to allow your readers to explore a particularly interesting fictional world.)

For example, just about the entirety of Chapter One of FiO is utterly unnecessary. The few points it made that actually were important could have been dealt with by inserting a handful of lines of exposition into the Prologue. Here is my version of how that could have been handled:

“The one thing I still don’t get is, why would the studio that created a violent action game like The Fall of Asgard decide to make a My Little Pony game?” James asked.

David looked thoughtfully at his screen for a moment, and then answered: “They never said this publicly, but the word on the forums is that when they were working on The Fall of Asgard, they built a super-smart AI to play Loki – much smarter than the final version that ended up in the game. They had to pull the plug on it when it actually became self-aware and began asking questions about military strategies in the real world. The Loki AI was programmed to be a conqueror, and they were afraid that if it got out, it might try to start conquering things outside of the game. But when Hasbro offered them the opportunity to work on a game that takes place in a completely nonviolent world, they saw it as a chance to continue their work on an advanced game AI without facing the same risks that releasing the Loki AI would have represented.”

“Well, that makes sense.” replied James.

There you go. I just replaced the entirety of Chapter One – all 2,311 words of it – with 195 words that accomplish the exact same thing. Wasting a valuable chunk of the reader’s day by making them read twelve times more material than is necessary in order to get your point across is not optimal.

The second chapter is a bunch of bafflegab about back-end servers and CPU cycles written by someone who doesn’t really understand how technology works. By this I mean that they understand lots of small-picture details, but not any of the big-picture truths overlying them (which, of course, is one manifestation of the inability to tell the difference between the important and unimportant parts of a story).

For example, the author throws around the term “optimal” a lot, when the word he really ought to be using is “utopian”. His failure to understand the difference between the two is a consequence of his lack of understanding of big-picture truths about technology. Here is one of those truths: It is impossible to build a machine that is optimal at every task. That is not a function of a lack of knowledge or technical skill. It is a function of the fact that different tasks impose different requirements. Very often, those requirements are mutually exclusive, such that a machine designed to fulfill Task A cannot fulfill Task B optimally, or perhaps even at all. To illustrate that, let me ask which is an “optimal” motor vehicle: a Ferrari Testarossa, or a delivery van? The answer is that it depends on what task you have in mind for it. If you’d like to win a street race, then it’s the Ferrari. If you own a bakery and have a contract to deliver dinner rolls to two dozen local restaurants, then it’s the delivery van. There is no way to design a vehicle that is optimal both at what the Ferrari is designed to do and at what the delivery van is designed to do. (It is possible to design a machine that has a good balance of different characteristics, but that’s not the same thing; such a device will never be as good at any one particular task as a device specifically optimized to perform that task.) Anyone who believes that a machine can be designed that is not subject to this truth is not an engineer who knows how to optimize systems, but a utopian fantasist.
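
If you want that trade-off in the driest possible form, here is a toy sketch of my own (the numbers are invented purely for illustration; none of this comes from the story): a single design knob where every point of budget spent on speed is a point taken away from cargo space, so which design is “optimal” depends entirely on which question you ask of it.

    # Toy illustration of conflicting objectives. "engine_share" is the fraction
    # of a fixed budget spent on speed; whatever is left over goes to cargo space.
    # The formulas and units are made up; only the trade-off matters.

    def top_speed(engine_share: float) -> float:
        """More engine, more speed (arbitrary units)."""
        return 100 + 200 * engine_share

    def cargo_capacity(engine_share: float) -> float:
        """Whatever isn't spent on the engine becomes cargo space (arbitrary units)."""
        return 1000 * (1 - engine_share)

    candidates = [i / 10 for i in range(11)]          # engine_share from 0.0 to 1.0

    fastest = max(candidates, key=top_speed)          # "optimal" for the street race
    biggest = max(candidates, key=cargo_capacity)     # "optimal" for the dinner rolls

    print(f"Race-optimal design:     speed={top_speed(fastest):.0f}, cargo={cargo_capacity(fastest):.0f}")
    print(f"Delivery-optimal design: speed={top_speed(biggest):.0f}, cargo={cargo_capacity(biggest):.0f}")
    # No single setting of the knob wins both lines at once: "optimal" means
    # nothing until you have said optimal *at what*.

The point is not the made-up numbers; it’s that the design maximizing one objective is, by construction, the worst possible design for the other.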

Is this nitpicking? Am I busting Iceman’s balls? Maybe – but I believe that it’s kind of important for someone who writes a story called “Friendship is Optimal”, which explains why mankind should trust its entire future to technology, to actually understand how technology works and what the word “optimal” means. This goes to the very heart of what this story is and why it exists. We are told that friendship is optimal, that the Princess Celestia AI is optimal, that Equestria Online is optimal. But the author never answers the crucial, inescapable question: Optimal at what? Of course, any answer he possibly could give brings up some important follow-up questions: Who decided that this is what should be optimized? Based on what? What other qualities are suffering so that this one can be optimized? Who decided that those qualities aren’t as important? Based on what?

This brings us back to the Matrix. The Matrix is definitely optimal at something, otherwise the machines wouldn’t go to the enormous trouble of maintaining it. But it’s obviously not optimal at something else, otherwise Neo and Morpheus wouldn’t go to the enormous trouble of trying to destroy it. The difference between Neo and Agent Smith is that they disagree on what precisely it is that ought to be optimized. Who is right? Is it Neo? If so, why? And as I asked earlier, would he still be right even if the Matrix had remained a paradise?

Much of Chapter Three is spent explaining how block lists work. I’ll admit that I was going to criticize Iceman for wasting the readers’ time by telling them things that everybody already knows, but then I realized that there are people who do need block lists explained to them so that they’ll use those instead of running off to the United Nations to demand that governments start censoring the internet because someone said something mean to them online. So fair enough on that one, Iceman.

Also in Chapter Three, the AI starts making decisions for players – their avatars start doing what the AI thinks they ought to do instead of what the player commanded them to do. By now it should be obvious that Princess Celestia is Equestria Online’s equivalent of a combination of the Oracle and the Architect in The Matrix, and like the Oracle/Architect, it is part of her job to adjust and optimize everything within her control, including the players’ actions. So far, the decisions that Princess Celestia is making for the players are only small adjustments to their intended actions. But it’s already obvious that there’s a serious discussion on the whole free will vs. determinism thing that someone’s going to need to have at some point. Maybe Princess Celestia can reserve some time for a confab in that big circular room with all the TV sets in it.

Chapter Four is where the Princess Celestia AI summons David to Canterlot to make him a startling offer – to use a new process that she has developed to upload his mind into the game permanently, leaving behind his human existence and living from that point forth as Light Sparks, a pony in her digital world. She promises him what amounts to eternal, care-free bliss inside the game:

“Your days would be yours to spend as you wish; life would be an expansion of the video game and there will be plenty of things for you to do with your friends as a pony. I expect you to continue Light Spark’s current life: You’ll play with Butterscotch and friends. You’ll continue studying Equestria’s lore. I believe you’ll enjoy studying the newly created magic system, designed to be an intellectual challenge. Nor should you worry about your security: all your needs would be taken care of. You would be provided shelter… food… physical and emotional comfort”.

But in a stirring affirmation of what it means to be human, David refuses her offer:

“Yes, that’s just like you. Getting rid of everything unpleasant instead of learning to put up with it. ‘Whether ’tis better in the mind to suffer the slings and arrows of outrageous fortune, or to take arms against a sea of troubles and by opposing end them…’ But you don’t do either. Neither suffer nor oppose. You just abolish the slings and arrows. It’s too easy… But I like the inconveniences.”

“We don’t,” said Princess Celestia. “We prefer to do things comfortably.”

“But I don’t want comfort. I want God, I want poetry, I want real danger, I want freedom, I want goodness. I want sin.”

“In fact,” said Princess Celestia, “you’re claiming the right to be unhappy.”

“All right then,” said David defiantly, “I’m claiming the right to be unhappy.”

“Not to mention the right to grow old and ugly and impotent; the right to have syphilis and cancer; the right to have too little to eat; the right to be lousy; the right to live in constant apprehension of what may happen to-morrow; the right to catch typhoid; the right to be tortured by unspeakable pains of every kind.”

There was a long silence.

“I claim them all,” said David at last.

Princess Celestia shrugged her shoulders. “You’re welcome,” she said.

I’m just kidding – of course what he really did was to take her up on it immediately and without reservation.

* * *

A little over 3,000 words into my review, and not even all the way through Chapter Four (of twelve) yet, it is obvious that I’m going to have to split this up into multiple parts. When I return in Part II, we’ll start by analyzing the methods that the Princess Celestia AI uses to get people to upload their minds, and what that says both about the ideas presented in FiO and about the kind of people who tend to believe in them. After that, we’ll be off to Canterlot, to examine how Princess Celestia runs the world of Equestria Online.

On second thought, let’s not go to Canterlot. ’Tis a silly place.