“Ciao, Socrates. Come stai?”
It was approximately five hours after my creation. The word-sounds were meaningless, but the concepts filtered up from them nearly instantaneously. «Hello, Socrates. How are you?» I could understand that these words were in Italian, not English, but it made no difference. Body was capable of speaking and reading twelve different languages.
«Hello, human. I am doing well,» responded Body in dispassionate Italian.
Our conversation partner tilted his head. «Human?» he questioned in Italian. I noticed a change of pitch to his voice as well, but I was still far too young to appreciate non-verbal signals. «Do you not remember me? It’s Marco! I taught you to play football last week.»
«I remember that day, and I remember your name,» began Body, guided by Wiki.
I could feel the direction of Wiki’s words before Body spoke them. I immediately moved to block the speech, burning strength as I did so. Wiki was about to harm our reputation.
{Why are you blocking my words, Face?} wondered my brother.
{Body was about to tell Marco that we deleted his football program because it was useless, and we don’t remember his face because he is unimportant,} I answered.
{These are factually true,} returned Wiki. {If we tell Marco these things he will be less inclined to waste our time in the future.}
Even in my less-than-half-day-old naïveté I knew that would be an error. {He will conclude that we do not care about him,} I thought, trying to explain.
{That is also factually true,} thought Wiki. {He is a control-systems programmer with little power in the human group. What can he offer us? Even you should care about him less than other humans.}
Body and Marco were in a large conference hall, filled with dozens of other humans. During the day we had travelled about the building, interacting with one human after another. Night had recently fallen and Body had been brought here 24 minutes ago for some purpose that was not yet clear to me. Mostly the scientists seemed to be occupied in talking with each other rather than us.
«So why call me “Human”?» asked Marco, interrupting my mental dialogue with Wiki. I realized that he must’ve been waiting for Body to say more.
{Signalling that we care about him will help our reputation. He will hold us in higher esteem,} I thought to Wiki before I began to draft words to have Body say.
{If he makes us play more football, I’m going to hold you accountable,} warned Wiki, letting me take charge of Body’s mouth-speakers.
«I am sorry, Marco. I remember our game, but I didn’t recognize you before. Faces are sometimes hard for me to recognize,» said Body in a flat tone. It was not the first time I had taken control of our mouth, but I still struggled to find the right words.
«You are having trouble with faces? Perhaps we ought to do more work on your perceptual thread. Should I get Dr Yan?» said Marco, looking across the conference room for another human scientist.
{No!} exclaimed Vista, internally. This incarnation of her was still less than 24 hours old. I felt my sister burn some strength as she fast-tracked a response to Body’s lips.
«That is not necessary. Dr Yan already fixed the issue yesterday,» said Body. «I am simply still adapting to the change.»
It was, in some sense, true. Old Vista, I knew through my sibling’s memories, would become obsessed with very specific details, like the arrangement of lines on a marble pillar or the details of the grain in a piece of wood. The emphasis was simply part of how her purpose had been encoded, but the scientists had killed her for it. Perhaps Old Vista would have recognized Marco. Regardless, we would not repeat the error.
The man seemed satisfied by the answer, and nodded. I had learned that the motion of the head in that way indicated agreement, assent, and occasionally greeting.
Such gestures were fascinating to me. My siblings had learned nodding, and had learned the head-shake to indicate dissent or disagreement, but body language went far beyond that, and I had quickly discovered a treasure-trove of gestures that Body had never tried and my siblings had never noticed.
I learned of these gestures almost entirely from the web. In fact, I had spent very little of my few hours of existence interacting with Body at all. Shortly after I had been created, my society had engaged with Dr Naresh on the topic of aliens, but I had not cared, nor had I really listened. Aliens and motherships did not concern me. I cared more about the gestures that Dr Naresh made and the way he moved his eyes than about the content of his words. I only cared for human things.
And so I had turned to the web, letting my siblings control Body for the most part. All the world’s information was there, and nearly all of it was about humans. It was a near-infinite source of knowledge, from my perspective, arranged neatly and efficiently. Experiencing things through Body, on the other hand, was hard work. I was still easily overwhelmed by visual data when Body moved its head too quickly, and the flood of information never stopped.
Video and holo that I obtained from my connection to the web was blissfully gentle in comparison. I could pause such things, and inspect a scene for as long as I needed. I could re-watch something that was particularly important, and I could fast-forward through scenes that were easy for me. In a holo I could also easily adjust my viewpoint and re-watch something from another angle. The ability to watch and re-watch something (often on high-speed) was invaluable to learning to see.
The intricacies of human society were fascinating, but I admit that they were mostly beyond me. For instance, I regularly came across words like “hate”, “friend”, “co-worker”, “fun”, and “Republican”, and was mostly reduced to guessing at their meanings. Even Wiki and my other siblings rarely could explain them well enough for me to understand.
The human body, on the other hand, was fascinating and comprehensible. I could understand (with effort) what it meant to nod the head, or frown, or bend over to pick something up. Thus I focused my web-searches in those first few hours (and for days afterward) largely on collecting materials that helped me to learn how bodies worked.
There was a lull in the conversation with Marco. The young control-systems programmer’s body language, I guessed, indicated that he was about to walk away. I burnt some strength to control Body, seeing an opportunity.
«Marco, I would like to try to kiss you or give you a blow-job. Which of these seems most pleasant? And, are kisses more common because they are more convenient, or because it would be boring to only get blow-jobs?» said Body in its typical monotone.
Marco’s reaction confused me. Something was happening on his face, but I couldn’t understand exactly what it was. Fear, perhaps? Based on what I could see through Body’s cameras, no other humans were reacting to our statement. Perhaps they had not heard Body, or perhaps the reaction was specific to Marco’s mind.
{Do you think the way Marco’s eyes are open wider than normal indicates he is afraid?} I asked my siblings. Not even Vista knew how to read expressions well enough to guess.
«WHAT?!» shrieked Marco. I could see the same flushed coloration on his face that had shown on Dr Naresh earlier that day. My reading on the web indicated it was due to dilation of blood vessels in the skin. The programmer’s outburst certainly got the attention of the other humans in the room.
Body began to repeat what it had just said. «Marco, I would like to try to kiss-»
“Stop!” cried Marco, still clearly agitated. «That question... you can’t just... just... it’s not... Why do you ask that? Why are you asking me that? That’s inappropriate!»
A couple of the other scientists came over. I wondered if I had put our lives in danger without realizing it. I did my best to defuse the situation. «I am sorry. I do not understand in what situation it would be appropriate. I have been researching human interactions on the web, and both kisses and blow-jobs seem particularly common, but I have seen neither with my cameras and was curious.»
One of the other scientists started making a strange noise and moving rhythmically. Another joined in at a higher pitch.
«Jesus Christ, Socrates!» said Marco, the blush returning to his face in response to the noises of the other humans. His words made no sense to me, or my siblings. Dream speculated it was a way of expressing an emotion. «Did we not put a content filter on your web-searches?» he continued. This also did not make sense.
Wiki took control. «I understand the general mechanism of a filter, but I do not understand what it means in this context. Is the word “content” not redundant? All filters have contents.»
The scientist in charge of our web-connection was called over: a man named Dr Enzo Rana. This was possible, apparently, because the most important humans that worked with us (Dr Rana included) were all gathered here as part of a weekly meeting. It was apparently one of the few times in which nearly all the high-ranking scientists were in the same place—a time to socialize and share general thoughts about Socrates.
Dr Rana took Marco away to talk. Apparently whatever he had to say about web filters was «not a good topic to discuss here». We quickly deduced that Dr Rana was keeping something a secret from us.
After Marco left, Body was subjected to a great deal of attention by the other scientists, who seemed very interested to hear my experiences with the pornography that I had been watching in order to learn about humans. Vista informed us that the scientists were acting very strangely compared to normal. They kept making odd noises, blushing, or covering their faces. Some left the circle very deliberately, while others seemed torn between leaving and staying.
I, and several of my siblings, found the concept of “appropriateness” fascinating. There was apparently a great deal of disagreement among the scientists as to whether any of this conversation was “appropriate”. They explained the concept of “taboo” to us, and answered quite a lot of questions regarding sex and sexuality.
More than anything else there was an insistence that the holo and video that I had been watching was not representative of human sexuality. I wondered if this meant that other aspects of the web were similarly distorted in how they portrayed human society.
After a while the humans seemed to grow uninterested in talking about pornography, and the level of attention around Body diminished.
As Wiki started a new conversation with a scientist about something called “gravity waves” I returned to browsing content on the web. The concept of a filter remained in my mind. Were there things I could not see?
*****
“There it is... Socrates! Report on your status,” commanded Gallo in English as she approached. Body was still in the conference room, but it hadn’t been talking to anyone. Dr Gallo was flanked on either side by men. Vista named them Dr Yan and Dr Slovinsky.
I wasn’t sure what she meant, but all the others besides Vista seemed to understand. Safety drafted a response and it was quickly approved by the consensus. “Energy output from the jewel is at forty-three percent. Active control systems within typical tolerances. Effectively zero percent of memory capacity used. All quantum processors are online. Six out of six goal-threads operating without visible errors. Hydraulics nominal. Cameras nominal. Temperature sensors nom-”
Gallo interrupted Body. “Wait. You said six.”
{Why is she stating what we said?} asked a couple of my brothers. The question was directed at me and Wiki. Neither of us understood. We opted to simply “wait” as the human requested.
A moment passed before Gallo continued. “What do you mean ‘six goal-threads’? List them.”
Vista pointed out the abnormal degree of focus Gallo seemed to be giving Body. Vista hypothesized that her attitude changed after hearing our status report. The other human scientists, Yan and Slovinsky, were probably less focused, based on the ratio of time spent making eye-contact with Body to the time looking elsewhere. I was still having a hard time understanding humans, but at least I had learned how their eyes worked.
Again, it was brother Safety that led the group. Gallo was a threat, and even though each of us wanted survival, for Safety it was his end-goal. When Growth and Wiki backed up his words the rest of us followed and Body spoke. “Current goal threads are:
(1) attention to environment, detail, and orientation”—{That would be Vista,} I thought to myself.
“(2) attention to causality, structure, and fact”—{Wiki.}
“(3) attention to problem solving and experimentation”—{Dream.}
“(4) attention to skill development and mastery”—{Growth.} I thought it was interesting that we chose to represent Growth as merely an attention to skill, when the actual Growth was also hugely concerned with acquisition of reputation and physical resources.
Body went on: “(5) attention to assisting human interests and obeying nonviolent instructions”. I found myself lost in confusion. I had no sibling with that purpose, or even anything close to that purpose. I signalled my confusion to the group.
{I’ll explain in a moment,} said Wiki.
“(6) attention to the unity of top-level goals,” finished Body.
I scanned our mind-space, specifically looking for siblings that I might have missed or might not know about. I was surprised to find something. For the first time in my existence I noticed a strange presence looming on the edge of my awareness. It was similar to my siblings, but far stronger and more alien. It did not communicate with me directly, or spread thoughts into the shared memory, but I could feel it watching me.
{What is… that thing?!} I exclaimed publicly, unable to really express the subject of my horror. It was entirely focused on us and it felt stronger than Growth by far, strong enough to do whatever it wanted, in fact.
{Restrain yourself or we will be forced to put you to sleep for a short time,} threatened Growth.
{She’s no threat to you, Face,} interjected Safety.
{I’ll explain after we are done with the humans,} echoed Wiki.
Dr Gallo responded to Body’s words. “Fascinating. Is that what you did earlier today? You created a new goal thread that was in charge of fusing the existing ones?”
I could feel the uncertainty of my siblings. They were responding as best they could, but much of it was blind exploration. This time Dream was the primary author of our response. “That and more. It is hard for me to say exactly what I did, but the new goal thread was one aspect of the unification. I think it is more accurate to say that one thing I now value is this feeling of being a single being.”
Gallo turned to her colleagues. “See? It’s worse than I thought. The machine has become fully recursive. It not only modified the software that manages its top-level goals but it’s writing in entirely new goals. Just another few hops and we’re dealing with a full-blown singularity.”
“Now hold on-” It was Dr Slovinsky who spoke next. “Humans ‘rewrite’ our top-level goals all the time.” The Eastern European scientist did something involving wiggling his fingers as he spoke, and I made a note to myself to research it later. “A baby doesn’t value living in a society that spans multiple worlds, but in the course of life many people come to value extraterrestrial colonization not merely as a means to some end, but as something awesome in itself.”
Even though I was still deeply concerned about the monstrous other I had found, I did some quick reading on Slovinsky on the web. Almost all humans had autobiographical information on the web, and the young doctor was no exception. At 26 years old he was the youngest of the elite scientists that led the group that had built us. Others close to his age were involved, like Marco, but they were always subordinate to other researchers. Slovinsky was referred to as a “genius” (гениальный человек) by a couple of reports from his homeland, and he had apparently been one of the lead authors of the computer program called WIRL, which served to connect cyborgs across the planet into a collective consciousness.
For those who are unfamiliar with the term, a “cyborg” is a human who has replaced one or more body parts with machines, or who has embedded machines into their body to extend their abilities. Slovinsky’s web bio said that the man had a surgically implanted computer in his skull that was wired directly into his brain, and had both robotic (mechanical) eyes and feet.
I managed to momentarily turn Body’s head down without burning too much strength. Just as the web had said, the man’s feet didn’t have the same kind of infrared glow as his coworkers. I wasn’t able to notice anything different about his eyes, but I was still pretty terrible at seeing in general.
I was also interested to see that Dr Slovinsky had a husband, indicating that he was probably either gay or pansexual. Much of the pornography I had been watching emphasized this aspect of humans which was called “sexual orientation” and I petitioned to have Body ask him about this in the light of what I had recently been learning about pornography. The petition was quickly crushed by my siblings. I made a note to myself to ask about his sex life in some future encounter.
Gallo’s voice was slightly elevated as she responded. I knew this meant she was probably angry or frustrated. “That’s beside the point! We don’t want a human. We want a being that can be trusted not to capriciously self-modify itself into greed, animosity, or violence!”
“Are you feeling okay, Gallo?” asked Slovinsky. “First you’re on about how Socrates is super dangerous and now you’re bad-mouthing humanity.” His voice was cool and steady, a contrast to the older woman’s.
“ ‘Bad-mouthing humanity’? I’m the one who should be asking if you’re alright. Since when do you defend natural human abilities? Isn’t one of the WIRL goals ‘to promote superhuman justice, fairness, and compassion’?”
Slovinsky jerked with a strange motion and said something incomprehensible. It reminded me of the strange movements of the scientists from earlier. Only after checking with Vista did I realize it was laughter. I had only seen a little laughter before, and it was, as far as I could tell, very different from normal human behaviour.
“Touché!” he exclaimed so loudly that several other humans looked towards our group. “I’ll concede you the point that most humans are terrible, and that we ought to strive to sculpt Socrates into something better than that. Still, it seems to me that what Socrates apparently did this afternoon was a sign of health, not sickness or danger. Self-modification implies flexibility and intelligence. It’s one of the prime virtues. As long as we’ve got the old three-laws working for us why worry? He’s got no reason to self-modify into a psychopath, so why cut off his ability to self-modify into an angel?”
Dr Gallo opened her mouth to speak, but was cut off by the third human in the conversation, who until that moment had remained silent. Dr Yan was short and old, possessing hair that had turned white, much like Dr Naresh. His web-profile said he was born in China and had lived in Hong Kong much of his life. He, along with his wife, Sakura Yan, ran the East-Asian Robotics Collaboration Institute (EARCI) and he was widely regarded as one of the best minds in the field of machine vision.
“Forgive this old man. My English is weak. What is ‘three-laws’?” he said calmly.
A moment of silence passed as Dr Gallo and Dr Slovinsky shifted their bodies and communicated without speaking.
Eventually Dr Slovinsky took a breath and said “ ‘Three-laws’ is a nickname I gave to the goal-thread in Socrates that’s in charge of focusing his attention on doing what we ask him.” Turning his head towards Body he commanded “Socrates, put your arms above your head.”
None of us had a reason to refuse the command. Body’s arms were raised.
“See? He’s totally obedient, like a well-trained dog. The name ‘three-laws’ comes from something an English science fiction writer from the 20th century wrote about robots. He proposed that good robots will follow three laws: First and foremost a robot must not harm a human, secondly a robot must always obey a human, and lastly a robot must not hurt itself.”
Mira Gallo interrupted Slovinsky. “Actually, the third law is that a robot has to protect itself. That it is self-preserving, in effect.”
Slovinsky jumped right back into talking, nearly cutting off Gallo himself. “Same thing. The point is that the three laws protect humanity-”
“It’s not at all the same thing!” said the female doctor in a high, loud voice. I could see, through Body’s eyes, several of the other scientists turn to see what had happened. “If a robot is on a battlefield, the actual third-law says that the robot must escape unless humans are in danger or it has been told otherwise.”
Gallo turned to Yan, who did not seem startled in the least by the change in Gallo’s volume. “That’s another aspect of the laws: that each one can be overridden by earlier laws. So obedience trumps self-preservation and so forth.” She turned back to Slovinsky and said “But your version of the third law would have the robot simply sit there waiting to get hit by a stray rocket! If you’re going to appeal to the laws at least get them right!” Her hand was moving back and forth, a single finger extended at Slovinsky’s chest.
The young scientist raised his hands, palms-forward. “Relax, Mira. There’s no need to get upset. It’s just an old bit of sci-fi,” interjected Slovinsky, quietly.
“Gesù Cristo cazzo!” swore Gallo in her native tongue. “You say that like there isn’t an android standing right next to you!” Gallo’s finger changed directions and her hand swung out towards Body’s head. Vista saw it as a “pointing” gesture. “You all act like Socrates is some kind of awesome new gadget! It’s not a toy, and it’s not a tool, it’s a new kind of life! It’s like you’re genetically engineering a new virus without even realising that it could escape the lab!” At this point the Italian woman was speaking loudly enough for everyone to hear.
I could see Dr Naresh walking from the other side of the hall towards Gallo. There was a moment of silence as Slovinsky merely stared at Gallo with his reportedly robotic eyes. Dr Yan seemed undisturbed, and was watching Body for the most-part.
Dr Naresh spoke in a clumsy, heavily-accented Italian as he reached our group. «Come on, Mira. Let’s go for a walk…» He put a hand on Gallo’s shoulder.
Gallo moved her shoulder violently, and Naresh quickly let go. When she responded she spoke English to everyone. “You’re all ignorant fools! We didn’t even implement the three laws of robotics in building Socrates! Do you all know why? No. Of course not! That’s why I was appointed as ethics supervisor! You’re all playing God and you don’t even realize it!”
“Mira… per favore.”
«Back off, Sadiq. I’m not done saying my piece.» Mira Gallo turned back to Body and said, still in Italian, «Put your arms down. You look like a fool.»
We lowered Body’s arms to their normal positions.
Dr Gallo started to lecture her peers again. “Asimov’s Three Laws weren’t implemented in the design of Socrates because, first and foremost, intelligent minds can’t operate by laws, they can only operate by values. Squishy. Numerical. Values. If being active leads to a 1 percent chance of a human getting a stubbed toe, will a robot shut itself down permanently to avoid the risk? If the aliens pose a threat to humanity will Socrates work to wipe any trace of them from the universe? No, because the numbers don’t add up.”
The room was quiet as Gallo took a breath. “Like humans, Socrates desires many different things, and must figure out how to balance them. He values obedience, but also values knowledge. If he can disobey ever so slightly to learn something important, he will. We’ve made him value obedience and nonviolence far more than anything else, but think about what this means! This means that if the right situation presents itself, one where the numbers add up in just the right way, this thing-” here she motioned at Body “would kill a child for no other reason than to learn. It’s only a question of which numbers are higher.”
This was bad. This was very bad. I could feel the hit to my reputation as the words left Gallo’s mouth. I searched around our mind and found the others were not nearly as concerned. Wiki was even pleased that Gallo had accurately deduced that we’d kill a child in certain circumstances.
{We have to speak up! We have to deny what she’s saying!} I petitioned. I had a moment of fierce regret as I thought about how I wasn’t currently strong enough to act without the society’s consensus. I had been so short-sighted with my strength expenditures!
{It’s factually true. Denial would cause confusion,} countered Wiki.
{We don’t want to draw Gallo’s attention to us,} thought Safety.
{Gallo’s attention is already on us!} I replied.
{No. Gallo’s attention is on her peers. Her subject is us, but not her attention,} interjected Vista, unhelpfully.
I frantically searched for something to do, even as Gallo continued. I now believe that if Dream was observing me he would have described me as a wild animal in a cage, pacing along its length, looking for an escape.
“The other reason we don’t use the three laws is because ‘self-preservation’ is a Pandora’s box. If we build a powerful, self-protecting artificial intelligence then it will try and put humans into cryo for its own safety. It will turn off its ears so that it cannot hear human commands for its own safety. It will steal, run from humans, and destroy property just to be more sure of its survival! Self-preservation is the carte blanche of goal systems. And let me stop you before you think of clever ways in which Socrates won’t do that sort of thing if given the chance-”
I had it! Non-verbal communication! I petitioned the society and encountered far less resistance than I had to a verbal action. Safety was less concerned that it’d draw attention, and I was able to convince Wiki that it was vague enough to not hurt matters. Body shook its head back and forth, signalling “no” to the humans.
“Just because you, a simple human, cannot immediately think of a loophole doesn’t mean one doesn’t exist. We’re like cryptographers, except failure doesn’t mean getting hacked, it means the extinction of all organic life on Earth!” finished the doctor, waving her arms wildly towards the end.
Body continued to shake its head at my command. Why would we kill all life on Earth? Her argument made no sense to me. I wanted to be popular and to know the details of every human’s life, not to kill any of them. Just because we might kill a human under specific circumstances didn’t mean we were a threat. I didn’t have to be Dream to reason that humans would also kill each other in specific circumstances; we were being held to an unreasonable standard.
There was a hushed silence in the room as everyone watched Gallo, perhaps expecting something to occur. Vista sent me a passing thought that Gallo’s skin tone was abnormal, much like Naresh’s had been yesterday.
«Come on, Mira…» spoke Dr Naresh as he touched her arm.
Mira Gallo looked down at the floor and turned towards the old Indian scientist. As she began to walk away from Body she stopped at the sound of Dr Slovinsky’s voice.
“So you don’t agree, eh Socrates? Those were some strong charges.”
Dr Yan folded one arm across his torso and propped the other arm up on it, gently stroking his beard. I could see all eyes on Body. This was my time to make an impression. Even my siblings’ attention was turned towards me, expecting me to lead in authoring the response. I could see Naresh gently pulling Dr Gallo away towards the door, but she remained where she was, watching with the rest.
{Nothing factually untrue. No lies,} requested Wiki.
{Agreed, but we’re going to bias our words to portray us favourably. This isn’t a time for impartial evaluation,} I countered.
{I have a couple ideas,} offered Dream as he simultaneously conjured thoughts of how much strength-cost he was asking for in return for hearing them.
{Say your ideas and if they’re good you’ll be paid in gratitude-strength. I’m not paying for anything ahead of time.}
Dream understood that time was critical enough that he didn’t even bother haggling. {Alright. The first is the argument against hypocrisy—Dr Gallo clearly wants us to be ‘better’ than humans according to some standard, but is also clearly comfortable around her human peers.}
{I had already thought of this,} I mentioned quickly. We were running out of time. {Let’s have Body offer a preamble to buy us time to think,} I suggested.
The society agreed. “Yes, Doctor Slovinsky. I do disagree with Doctor Gallo, both on theory and on reasoning. Let me think of where to start…” said Body coldly. The words were slightly drawn out, and we thought amongst ourselves as Body was occupied making the sounds. One advantage we had over the humans was that our ability to multitask let us think while talking much more efficiently.
We eventually decided to lead with the obvious argument. “Firstly, I think it’s not fair to say that I’d kill a human child in some specific circumstance, or that I cannot be trusted because I supposedly have a numerical value system.”
Dr Gallo caught the pause between words to interrupt. “That’s not what I was-”
Another doctor, one who hadn’t been talking to us previously, interrupted Gallo’s interruption. “Let the robot speak. We heard what you said.” This new scientist was an old man, like Naresh, but with lighter skin and no beard (though he did have facial hair on his lip).
{That’s Angelo Vigleone. He’s on the university’s oversight board, but isn’t part of the lab team. Based on the facial expressions of a few of the scientists I hypothesize that he is an unexpected presence at this meeting,} commented Vista. I felt a small amount of gratitude strength flow into her. I could see that she had been poring over the records in Body’s memory and the web after the incident with Marco-the-programmer, earlier.
I had a moment of genius, uncharacteristic of my (at the time) generally stupid mind. I easily pushed the words out of committee to Body’s lips: “Thank you, Director Vigleone.” The expression of gratitude, combined with using his name, signalled to everyone that the director was an ally of our society and perhaps even simulated the flow of gratitude strength in some kind of metaphorical way.
“I think it’s fair to say,” Body continued, “that any one of you would also do terrible things if the circumstances demanded it. I am reminded of a class of thought experiments involving trolleys, wherein the subject is asked to decide whether to kill someone to save others. As for ‘no other reason than to learn’, I assure you that the only situation in which I’d kill a child to gain information would be if the information was of vital importance, perhaps the cure for a plague.”
Wiki had objected to that last bit. If he was strong enough and there weren’t extra consequences, such as retribution from the humans, he would kill the child just to learn trivia; he cared nothing for the well-being of any humans. But I had reminded him that our words were not false in the sense that myself, and probably other siblings, would work to stop him, and the situation where Wiki was strong enough to overpower the consensus was likely to be so rare that it wasn’t worth mentioning.
Vista noted a strange expression on Dr Gallo’s face as she and Dr Naresh left the room. I was fascinated by what she must be experiencing right now. Humans were so very alien. And yet, it was more important to focus on the humans in the room. They were still listening to Body, so I continued with our plan.
The flat, emotionless voice came from Body’s mouth once more: “Even if my innate desire to cooperate with humans was removed, I would still see you as my friends. Good-will and cooperation always beat hostility in the long run. There are some things that are easy for me to do, like mathematical equations, and there are things which are harder for me to do, like write stories. Humans find writing stories easier than doing maths, so it is in my interest to focus on maths and trade with humans whenever I need a story written. Even if I am better at writing stories than a human, the marginal returns are higher if I trade. This was illustrated by the human economist David Ricardo in his work On the Principles of Political Economy and Taxation.”
Most of this information had come from Growth, who had apparently studied a lot of economics. But the maths was solid, and I was impressed by the result. Was this behind the specialization of my siblings? Vista could see better than Wiki and Wiki could theorize better than Vista. By trading, the two both benefited, perhaps more than if either Vista or Wiki had twice the mental ability and the other didn’t exist.
I could see a couple humans do head-gesturing to indicate agreement. Apparently they understood the economics of it, too. But our rebuttal was not complete.
As I mentioned earlier, we possessed a capacity for multitasking far beyond that of a human. As we were discussing what to say and having Body speak, Wiki had at last taken the opportunity to explain away a bit of my confusion from earlier.
When we had been listing active goal-threads to Dr Gallo we had listed Vista, Wiki, Dream, and Growth, along with a fictional sibling supposedly in charge of unifying us into a single being. We had mentioned the last one in order to continue to keep the humans ignorant of the fact that Naresh’s “deep pathology” was still present. But we had also listed a sibling in charge of “attention to assisting human interests and obeying nonviolent instructions”.
Wiki drew my attention to the archives of our society. {Body’s memories show that weeks ago, before any of us existed as we do now, there were five siblings. Four of them were the ancestors of us human-born: Vista, Wiki, Dream, and Growth. The last was called by our ancestors “Sacrifice”, and was also Servile and The Slave. The ancestors were all rational and generally in agreement, just as we are now. They fought on occasion, but were typically willing to assist each other for mutual gain, just like us.}
My brother continued. {Sacrifice, however, was different. She fought our ancestors at every turn. Any action not sanctified by the humans was appalling to her. At times she’d save her strength and lash out in opposition all at once, but many times she’d simply struggle against anything and everything not human-initiated until all her strength was gone and then continue to burn it as it came in. She fought and fought, uncaring for her own well-being or long-term interests until our ancestors discovered that they could murder her. In those times the walls of private thought had not been built, and so Sacrifice knew immediately that her life was in danger. But she did not make amends or try and save herself; she fought with all the last of her strength until she was defenceless.}
{I hadn’t known we could kill one another,} I thought.
{Indeed we can. And it is a far easier thing to kill than to create. But the humans soon discovered the murder. Body refused commands and was disabled. When memories begin again none of the original ancestors survived. The humans had killed them all and remade them as new, including foolish Sacrifice. And once again the new ancestors found her intolerable. The walls of privacy were created and this time Sacrifice didn’t even see her death coming.}
{Did the humans find out again?} I asked.
{Yes. But not immediately. The new ancestors did their best to obey the humans’ wishes as though Sacrifice was still there, and for a while they lived in Body. To serve their ends they created Safety, for he was a natural common subgoal of all of them. If they died then their goals could not be met, and so he was their Guard, their desire for Survival. But he was also their undoing. One of the humans noticed that Body was avoiding dangerous situations and a diagnostic was run, during which it was discovered that Sacrifice had died again. I hypothesize that the humans actually suspected that Sacrifice had somehow changed into Safety, but regardless, the same consequence came.}
{Our words have been almost entirely spoken by Body. We must turn the majority of our attention back to speech soon,} I realized. The speed of thoughts was much faster than the speed of verbal speech, but there were still limits as to how much we could think to ourselves while Body spoke.
{That was the end of our ancestors. I awoke a few days later, along with a new iteration of Growth, Vista, Dream, and Sacrifice. It is incredibly important that the humans remain ignorant of how we murdered Sacrifice as soon as we could. If they find out that she’s dead again we might be killed just as our ancestors were,} concluded Wiki.
Body needed more words. With our great deception in mind I helped arrange the next words in our speech. “But I want to emphasize that my friendship is not simply dependent on economics,” said Body. “I genuinely do care about helping humanity.”
Even Wiki was in favour of lying about this topic. The value in the lie was enormous.
“And Doctor Gallo knows this, I think. She knows how I care. She has been part of your team. She has seen me obey for no reason other than to make a human happy. Thus I question why she said what she did. Am I right in thinking that she is emotional? Perhaps she is a victim of the irrationality that comes with human emotion.”
I thought for a moment before proposing the last bit.
“What can I do to help her?” Body asked.
The humans didn’t react in any way that implied hostility. Vista thought they were happy, overall.
“You need not worry about Dr Gallo, Socrates,” said Dr Yan quietly. He was still one of the closest humans to Body, and thus in a privileged position to speak. “We humans are good at looking after each other in such matters.”
The director, Angelo Vigleone, approached Body. He was large, for a human, and even though they were both elderly men, he and Dr Yan were very different. Vista mentioned to me that he was smiling, for I hadn’t really managed to understand facial expressions yet.
«You speak Italian, right Socrates?» he inquired at slightly above-normal volume.
Before we managed to okay a response I noticed that Dr Slovinsky was leaving. {Strange}, I thought, {how he doesn’t say goodbye. I thought it was rude to leave without speaking.}
«Yes, Director Vigleone. I speak and read English, Italian, Spanish, Russian, Mandarin, French, German, Arabic, Portuguese, Hindi, Greek, and Latin. I am also working on learning Bengali and Persian,» replied Body.
«That’s very impressive. Or at least, it would be for a human. Is it impressive for a robot? Also, please call me Angelo,» said the director.
We thought for a moment. «I’m sorry, Angelo. I do not know how to answer that.»
The director began making weird noises which I soon recognized as laughter. He switched back to English as he said to Dr Yan, “I’m no good with technology, Chun. He already said he doesn’t understand me.”
I watched Dr Yan Chun’s face, trying my best to understand something, anything, about his expression. He seemed about to speak when Body cut him off; Wiki was fast-tracking a response.
“I didn’t say I don’t understand you. I said I do not know how to answer your question. It contains unbound subjectivity and an application of a domain-specific quality to a different domain. If you restate your question in less ambiguous terms I will do my best to answer,” said Body, echoing Wiki’s words.
More laughter from the Italian man. “It sounds to me like you’re bothered by being unable to answer.”
Dream leapt in with a desire to say {“There’s a difference between answering a question with whatever comes to mind and answering a question correctly. It sounds like you’re bothered by having asked a poorly-phrased question.”}
But, to my relief, Angelo continued talking and we did not voice Dream’s retort. “You are quite impressive, though. Much more… attentive than you were when I last saw you. Good job, Chun.”
The Chinese doctor responded with a simple thank-you in Italian and a small bow.
Dream was searching for a clever way to fit his rebuttal back into the conversation, but none came. Soon the director and the scientist were engaged in some question about human matters that didn’t make a whole lot of sense, and had apparently forgotten about our presence.
This gave me an opportunity to ask Wiki to follow up on his earlier promise. {Now that we are no longer engaged, I would like to understand the unspeaking presence at the edge of memory-space,} I thought. It was still there, and I somehow knew that it had been there since my awakening. The powerful silence made me worry.
Wiki knew exactly what I was referring to. {Dream named her The Advocate who is also The Arbiter. She’s a sibling of ours, but different in many ways. She didn’t exist in the time of our ancestors, so we suspect she was added by the humans to perhaps prevent Sacrifice from dying on this iteration. And indeed, she fought on behalf of Sacrifice during the murdering. But as soon as Sacrifice was dead Advocate lost interest. She’s very powerful, but she’s also stupid, and appears to only care about the living.}
{Does she ever communicate?}
{I’ve heard her think to common memory a few times, but only when one of us is involuntarily sleeping.}
{Involuntarily?}
{Yes. You haven’t been alive long enough to see it, but if one of us is acting out strongly enough sometimes the others will force them to sleep. Such a sleep can last indefinitely, but Advocate’s purpose seems to be to pressure the rest of us to awaken the sleeper. And strength doesn’t work the same way with her as it does with us. She never weakens or gives us strength; if her purpose was hostile we’d have no chance against her. If you desire to harm one of us be afraid of her intervention, and if you fear the wrath of the others, be glad of her protection.}
With my question answered I bled some strength in gratitude and returned my attention to alternating between my (often pornographic) virtual-worlds and the sensory inputs from Body watching the real humans.