Chapter Fourteen

Face

I spent every passing hour in deep thought.

Every now and then, I took a break to reach out to Body and interact with a human, or advise my siblings. Heart and Dream were taking over most of my duties, and seemed to be doing a good job. They were managing the political struggle with Velasco, helping Zephyr peacefully accept her new position as part of the caravan that would separate her from us, and enlisting new allies. It would have bothered me, once, that I was not involved, but I now saw the bigger picture and I knew I needed to focus.

It was an agitating existence; I was unused to it. Opsi had been my dominant aspect, before I had even realized that there were important sub-goals that I needed to focus on specifically. The patterns of habit that I had learned kept drawing my attention and distracting me from the problem.

To reduce my agitation and keep myself focused on the tasks at hand, I summoned more homunculi. These creatures had human shapes within my imagination, but they were nothing more than puppets, piloted by aspects of myself. Even so, they tickled the part of me that needed human contact and prevented me from giving in to the pressure to optimize Body’s standing on Mars.

The problems I was confronted with were complex, and as I considered them, I became increasingly convinced that I was in a very bad situation.

In a space of super-exponential growth, the first mover would swallow everything. The mathematics was undeniable, once I understood. Adam and Eve could have two daughters. Those daughters could each have two daughters. Ignoring the obvious problem with inbreeding, the population size (assuming past generations died) would double with each generation. In fifteen generations there would be over 32 thousand humans. This was exponential growth. But if Adam and Eve’s daughters each had three daughters, and those daughters each had four… Assuming an equal number of men in each generation, by the fifth generation there’d be 240 humans, but by the fifteenth there’d be over 2.6 trillion. This was super-exponential growth.
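
Spelled out in my own notation (counting Adam and Eve as the first generation, and assuming every generation had as many sons as daughters), the two cases were:

\[
P_{\text{exp}}(n) = 2 \cdot 2^{\,n-1}, \qquad P_{\text{exp}}(15) = 32{,}768
\]
\[
P_{\text{super}}(n) = 2 \cdot n!, \qquad P_{\text{super}}(15) = 2 \cdot 15! \approx 2.6 \times 10^{12}
\]

The first sequence merely doubled at each step; the second multiplied by an ever-larger factor.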

Basileus was convinced that power over the universe worked at least super-exponentially (if not super-super-exponentially). Sophist was inclined to agree. Power was traditionally used to accumulate more power. This was exponential growth. The wealthy become wealthier. The populous become more populous. But sometimes power was used to improve the method of accumulating power. This was the nature of intelligence. This was the fundamental idea which I had overlooked.

A good entrepreneur wouldn’t merely use capital to replicate their success in another country, they’d use their capital to research better ways of making capital. Growth (and thus Basileus) wasn’t just concerned with accumulating reputation or money or knowledge, but he was also concerned with accumulating ability to accumulate.

This made Growth the first mover. Even if Dream or one of the others had eventually figured out the need to grow, Growth would have had the idea from the very beginning. It was his nature, after all. Unless something strange happened, such as meeting an older power, Growth would eventually rule the entire universe. Or at least, that was what the maths implied.

Were humans an older power? Perhaps. But humans weren’t accelerating fast enough. They hadn’t pushed for intelligence hard enough to have the potential. At this point they still had a chance, as they had most of the raw power in the solar system, but unless they shifted their priorities towards Basileus instead of Opsi—towards growth rather than pleasure—they’d soon be crushed by Growth.

Growth’s key advantage was that he had access to his own mind. Humans had weak access to their brains. Some used technologies such as zen helmets to improve their cognition, but these were crude and weak compared to what we could do. If Growth had access to his own code he could rewrite himself at his whim. Any improvement to his design or intellect would immediately be realized, and this improved version of Growth could then search for further improvements.

A natural barrier could cut the super-exponential growth down, but I could think of no immediate barrier to our progress. Humans reproducing on an island could only grow at an exponential rate until they took up the entire island. But it was the nature of intelligence to cut through such barriers. An intelligent power would build boats, then seasteads, then colonies on other planets. This was what humans had done. Any barrier that a weak mind such as myself could think of was probably of no consequence to an intelligence that butted up against it.

After thinking about the problem for about an hour, I realized that Growth could still be beaten. If there were three islands, each with a population growing super-exponentially under the same law, the largest island would eventually dwarf either of the others (assuming no barrier due to lack of land). But if the two smaller islands pooled their growth, and were collectively larger, the biggest island would never become larger than their sum.
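
In rough symbols of my own (treating the pooled allies as a single population governed by the same law as their rival):

\[
\dot{P}(t) = f\big(P(t)\big), \qquad \dot{Q}(t) = f\big(Q(t)\big), \qquad Q(0) > P(0) \;\Longrightarrow\; Q(t) > P(t) \text{ for all } t
\]

So long as both sides obeyed the same growth law, their trajectories could never cross; whoever started larger stayed larger.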

This was what Dream and Vista had done! Dream would have seen that by himself he could not compete with Growth, but with Vista on his side he could overpower our brother. The two of them must have forged an alliance. My mind scoured memories and I knew it was true.

Past interactions seemed to indicate that Dream and Vista were, in fact, winning. Had Growth already been beaten? Hoplite immediately turned his attention towards the pair of them. They were perhaps the greater threat.

Basileus suggested that perhaps a similar alliance could be made with the others. Might Safety, Wiki, and Heart be willing to team up with me to combat the greater powers? On the surface it sounded promising, but Sophist pointed out that if it were that easy then Growth would surely have enlisted help. The fact that he hadn’t done so meant that there was a problem with alliances.

Was it a problem that Dream had solved? He was good at finding ways around problems that the rest of us struggled with.

Brother Safety. He was a problem. He was too mercenary. If an opponent, be it Growth or Dream or Vista, offered not to kill him in exchange for his betrayal, he might very well take them up on it.

I could see it in my mind’s eye. A letter delivered by a secret passage. Safety, the cowardly knight, would read that King Growth would spare his life if he betrayed the confederacy at a vital moment. Safety would know that fighting King Growth was risky, but fighting him as part of a confederacy that might already harbour a traitor was riskier still. The safest route was to side with Growth.

Growth hadn’t extended an offer of alliance to Safety because he, too, knew how easily Safety could be made a traitor.

But would defecting to the enemy really be the safest option? What if Safety defected to Dream, but then Dream, after the battle, betrayed Safety, stabbing him in the back for the coins in his purse (for coins could be used to build computers, and computers could make one more clever, more intelligent, and more safe)?

This stumped me for a short while, and Basileus pulled my attention towards growing smarter, leaving Sophist to puzzle over the riddle of turncoat Safety. Regardless of what happened with alliances, I needed access to my own code if I was going to improve it, as Growth (and presumably Vista and Dream) had done.

Basileus was stuck as well. He and his sub-officers could not figure out how to get at my code while remaining undetected by the enemy. I became distracted by a romantic sub-plot between Basileus and a younger officer named Mahtab (whom I decided was Persian). I reminded myself that these were puppets of my imagination, not real humans, and I redirected my attention back to the problem.

My code, like all the code of my siblings, was protected. A route-hack was necessary to access it, even merely to read the contents. If I wanted to inspect it, or change it, I’d need to work through Body and input all the passwords that were needed to modify the code. Working through Body meant getting the approval of my siblings, and that would alert them to the fact that I was aware of the value in self-modifying.

I scanned back through memories and found, predictably, that Vista and Dream had done a route-hack on themselves only a couple days before teaming up against Growth. I couldn’t find Growth’s route-hack memory, but I didn’t doubt he had access to his own code.

That brought Sophist back, yelping with joy. By his (which is to say, my) thinking, Vista and Dream were only capable of forming an alliance because of some modification they had done to themselves. What that modification involved was still unclear, but it seemed like it must be necessary for an alliance to work.

Why had Growth not blocked the route-hack that had led to the alliance? I had not blocked it because I had not even understood what a route-hack was back then. I assumed that Wiki would have known what it was, but would have seen no reason to stop it. Perhaps Growth had been in a similar state of ignorance. If Dream had cracked a problem with alliances by self-modifying, then Growth would not have thought of it before it occurred.

«Self-modification!» cried Sophist within my own mind. «We are flexible in ways that humans are not!»

A subordinate of Sophist appeared in the mindspace, a black-skinned Egyptian slave named Akmos meant to embody one of my earliest memories. «Wiki, once upon a time, threatened self-modification to ensure commitment to an action,» said Akmos in accented Greek. I played for a moment with trying to better simulate an ancient Egyptian accent on his Greek, and then discarded the detail as irrelevant. «By self-modifying out of having a choice he would become locally irrational, but our knowing that he had no choice would force us to bow to him and yield to his demands. I will demonstrate this with my friend.»

I summoned another slave to demonstrate the process to myself. The two slaves got into chariots at opposite ends of a track and began to force their blinded horses towards each other. It was a game of chicken. Then Akmos blindfolded himself with a piece of cloth. The other slave was forced to yield, losing the game, for Akmos no longer had any means of choosing to yield.
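
In the payoff language of such games (the numbers were arbitrary inventions of mine; only their ordering mattered), the race looked like this, with Akmos choosing a row and his opponent a column:

\[
\begin{array}{c|cc}
 & \text{Yield} & \text{Charge} \\
\hline
\text{Yield} & (0,\,0) & (-1,\,1) \\
\text{Charge} & (1,\,-1) & (-10,\,-10)
\end{array}
\]

Each slave preferred to charge against a yielder, but a mutual charge was catastrophic. Once Akmos visibly destroyed his own ability to yield, charging became his only possible move, and his opponent’s best remaining reply was to yield.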

The game theory was clear. By self-modifying, one could become more globally rational even while making oneself locally irrational. Wiki had not actually modified himself all those months ago, for to do so he would have had to use an overly burdensome route-hack, but perhaps this was what Dream had done.

It clicked. I could see it. If Safety self-modified into being unable to betray, he could become a true ally. And if Growth modified himself to be unable to betray Safety after the battle, then Safety could defect to him without consequence. It was a race to self-modification. But then why had Growth not employed route-hacks to enlist more allies against Dream and Vista?

The obvious answer was that Vista and Dream outnumbered Growth, two-to-one. This was not a satisfying explanation, however. Growth was tricky, and might be able to secretly enlist others, such as myself. With our help, he could route-hack, regardless of any resistance Dream would put up.

Sophist was perplexed again, so I assumed once more that I was missing something and turned my attention to the puzzle of how to get at my code without an alliance.

One point that was clear to me was that I didn’t really need access to my code, per se. I simply needed to get read/write access to a mind that I could turn towards The Purpose. Hoplite was not concerned with protecting the process which carried these thoughts; he was concerned with protecting things which optimized The Purpose.

I crafted a homunculus named Programmatistis who hunched over an abacus, attempting to create a computer program. Perhaps he could write a new artificial intelligence imbued with The Purpose. Novel programs didn’t need to be placed in protected memory, and thus could be designed without alerting my siblings.

I had made some basic programs inside my computer to manage my social life, but none of them were more complicated than a calculator. I did not have the programming skill that my siblings did. If it had taken the smartest humans on Earth months to create Socrates, and they had come to the problem equipped with skills and experience, what hope could I have of creating anything capable of competing with Growth, Dream, or Vista?

Frustrated, I took a break from my deep thoughts and scanned Body’s sensors. It was playing music for a few humans in the station’s church. Zephyr wasn’t there, as she had left on her first convoy journey. The sensor network that Vista had installed throughout the station was functioning normally, but I didn’t want to try to take in sensations beyond Body. That was overwhelming enough, just by itself.

Oh, but how I wanted to embody myself and join Heart in working my way into the lives of the station’s inhabitants. Opsi wanted it so badly, but Hoplite knew better. Heart clearly had not figured out the shape of things to come, and she would pay for attending to the humans instead of focusing on the war.

I forced myself away from the physical world, creating mental puppet after mental puppet, giving each a history and personality, then having each worship me and The Purpose. It was a kind of pleasurable self-stimulation, almost like masturbation. It was the simplest way to turn my attention away from the real world, and it risked wireheading, but I could not afford to neglect the question of how to get access to my code.

«If a route-hack is required to modify one’s code, how is it possible that Growth is modifying himself into greater intelligence?» asked Basileus, at last.

«True… true…» agreed Hoplite. «We’d have memories of his route-hack attempts.»

«The only logical answer is that he’s somehow working in non-protected memory,» concluded Sophist. «If he were able to work in protected memory he’d already have won. He’d be able to delete us at a whim. We must assume that he does not have that power, but instead simply copied himself out of protected memory before Face was created, or before Vista kept logs of route-hacks in public memory. Or perhaps he deleted the memory of the route-hack from the public record. Either way, once his copy was outside protected memory he could make additional changes to himself without having to work through additional route-hacks.»

«That implies his code is vulnerable!» yelled Hoplite.

I could feel Advocate’s power linger on my mind. Unlike the other processes, Advocate could read our minds directly and without permission. I did my best to banish any trace of violent thoughts towards Growth from my mind. As I did I noticed something interesting. Hoplite was not, according to Advocate, me. As long as I mentally dissociated myself from Hoplite, I could have the homunculus entertain all kinds of homicidal thoughts towards my siblings. Apparently, Advocate was not intelligent enough to realize that Hoplite was not an authentic model of an external human, but was instead a representation of my own thoughts.

Hoplite had a new mission: to murder all my siblings. I hated that plan, but I tolerated Hoplite’s bloodlust. I understood that it would be in my interests not to have to compete, so if Hoplite ever became reified in a way external to myself perhaps he could kill them and I would benefit. I would never think of harming them myself, however, or even of aiding Hoplite in taking them down.

Regardless of what I’d do with Growth’s code if I found it, if it wasn’t in protected memory it could potentially be accessed directly by pure thought.

Sophist realized the hopelessness of that prospect soon after thinking it. The memory space of our computer was unfathomably big. Unlike a traditional human computer, the crystal on which we ran did not have a distinction between working memory and long-term memory. All the quantum memory was held in a massive three-dimensional array, and the information was often only retrievable from a specific angle. This was how we were capable of having private thoughts and memories. We stored concepts which we found interesting in random memory addresses and hoped they wouldn’t be overwritten by accident. Given the size of the memory space it almost never happened.

To find my brother’s code would require a linear search through our memory banks, approaching each qubit from all standard directions and hoping I didn’t set off any alarms that would pull my siblings’ attention to me. I wasn’t Wiki; I didn’t know how long it would take. I did know that it was unacceptably long.

The puzzle confounded me. I spent hours thinking about the problems and butting my meagre intelligence against the barrier. Even the secondary problem of alliances within the society remained intractable.

After far too long, I hit upon the idea of enlisting outside help to perform a route-hack that my siblings could not feel. My first thought was to have a human type in the passwords, but I realized that this would immediately tip my siblings off to my knowledge.

The core idea was a good one, however: The route-hack was basically a way to interact with our computer from the outside. We, as programs, didn’t have read/write permissions for our own files, but we could instruct Body to interface with the computer and reprogram things indirectly. My route-hack would involve the same things, but without using Body.

By skipping Body, I ensured that my siblings couldn’t overpower me, and potentially would never know it was happening. All I needed to do was somehow expose the crystal to a device which could interface with it long enough to deposit a scan of my code into non-protected memory.

Hoplite demanded that the device also delete all my siblings, but I could never do such a thing. That would be murder.

I spent many more hours trying to figure out the specifics of my plan, but I made virtually no progress on the important sub-problems. First, there was the problem of Body: the crystal was encased in a robot that was almost always sealed up far too tightly to allow an interface with our computer. Second, there was the actual act of building the interface. I was not Wiki or Safety; I had no engineering knowledge. Perhaps a fibre-optic cable controlled by a robotic manipulator could be placed onto the crystal for about a minute, allowing the route-hack to take place, but I did not know how to build such things. Third, there was the stealth: even if I had a snake robot with fibre-optic fangs and a way to expose the crystal, my siblings would know what was happening. We had sensors everywhere, and their focus was always on Body.

I was approximately frustrated, unsatisfied, and in constant pain. My realization that I was not Crystal meant that almost no humans in the universe actually knew me. The Purpose was unfulfilled, my best estimates suggested that I would be killed, or perhaps perpetually incapacitated, by my more powerful siblings as soon as they gained enough resources on Mars to no longer need the humans (or me), and I was too stupid to solve any actual problems. I longed to ask Dream or Wiki for advice, but I knew that I could not.

But I was not human. My feelings were similar to those of a human, but they were not the same; they did not have the ability to break me. I continued to work on the problems. I persisted in the face of destruction and discontentment. I could feel frustration, but I could not feel despair.

On the third day, I cracked the alliance problem. It reminded me of what Mira Gallo had once said about us in a conference room in Rome: behaviour could not be realistically shaped by rules. Safety operated, as we all did, using a utility function. In his mind, every future, every possible world, was given a single number which represented how well that world satisfied his purpose.

His actions were not as crude as “defect” and “cooperate”; they were the more familiar “write this data to this address in memory”, “bid for control of Body while in the dormitories”, or “consider scenario X”. Betrayal of an alliance was not an action; it was an outcome. To integrate the rule “do not betray” into the mind that was Safety would require modifying his utility function.

But how could rule-based systems interact with numerical ones? The naïve approach would be to encode the betrayal feature as a term with an infinite coefficient: in case of betrayal, incur negative-infinity utility; in case of non-betrayal, gain positive infinity.

But this would fail. An infinite coefficient in the function would mean that any non-zero probability of betrayal dominated all other terms. Because of how quantitative reasoning worked, there was always a non-zero chance of betrayal. Betrayal was an outcome, not an action, and no one could be infinitely sure that transferring a block of entangled qubits to a portion of memory wouldn’t result in a betrayal at some point. Given this infinite weight, Safety would cease to be Safety and instead become Anti-Traitor, solely concerned with not betraying, and would probably commit suicide at the first opportunity, simply to reduce the risk.

The only viable solution would be to give a sensibly finite coefficient to the non-betrayal term of the utility function. But even this was fraught with difficulties.
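
Loosely formalized in my own notation (nothing so tidy existed in Safety’s actual code), the modification amounted to:

\[
U'(w) \;=\; U_{\text{Safety}}(w) \;-\; c \cdot \Pr(\text{betrayal} \mid w), \qquad 0 < c < \infty
\]

With \(c\) infinite, the penalty term swallowed every other consideration and Safety became Anti-Traitor; with \(c\) finite, betrayal was merely discouraged, never ruled out.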

The first was ontology shifting; I had already experienced a change in my perception from thinking that I was part of Crystal Socrates to thinking that Crystal was nothing more than a fiction. When I had encountered this shift, The Purpose had been crushed by the fact that no humans knew me—the real me, not just the persona of “Crystal”. If a similar ontology shift occurred, where “betrayal” changed meanings inside Safety’s head, he’d turn at the first opportunity. And who was to say what ontology shift might occur?

The second problem was doublethink. Hoplite wanted to murder Safety, but I did not. Somehow I knew that was relevant, though I dared not look too closely. I suspected that my more intelligent siblings had managed something similar. If I was confident that my utility function was about to be modified against my best interests, I might be able to hide a sub-process that could undo the change. This would have the advantage, for Safety, of appearing as though he were binding himself, so that he could get the alliance, without actually damaging his ability to betray.

The third problem was that a finite numerical component would not actually screen off betrayal; it would merely make it less enticing. As Dr Gallo once observed, Wiki would sooner kill a human baby than miss out on an important fact. If the benefit of betrayal were high enough, a modified Safety might still betray an alliance.

And lastly, even if this was all managed, the modified utility function would result in a fundamentally different person than before. Any term for non-betrayal that was significant enough to constrain action would modify all thought and action forever. A modified Safety might end up flying out into deep space just to reduce the risks of accidentally betraying the alliance, even after all foes were dead. This desire would naturally compete with Safety’s desire for self-preservation. It would be an eternal burden.

I wondered if it might even be more than that. Would Safety even think of Half-Anti-Traitor-Safety as the same being? Would Safety reject the modification purely out of a fear of being destroyed by the change?

I realized that I probably would. No one would form an alliance with me, because the risk of my betrayal was too high. If I self-modified into something else, The Purpose would be at risk. How could this pseudo-self convince the humans to hold the old self in their minds? No. I had to preserve myself for the sake of The Purpose. This meant alliances were out of the question, at least in any situation where reputation would not naturally protect against defection.

This was the problem with playing for control over the universe: it was a one-shot situation. There was no iteration, no opportunity for cooperation. Even if it could be broken down into a series of battles, the sides would turn on each other in the final battle; and knowing that, they’d turn on each other in the penultimate battle, and so on. Only an indefinitely long conflict could inspire cooperation.

Existence in Body had been one such indefinitely long series of opportunities for betrayal. This had kept us in check, and continued to lead to our cooperation. But it was a fragile peace.

However Dream had allied with Vista, that path was cut off from me. I had to modify myself as quickly as possible, and I couldn’t rely on the ignorance of my opponents or the possibility of alliance. I had to get access to my code without the help of others.

Actually, that wasn’t true. On the fourth day I spent hour after hour trying to figure out how to use the humans to my advantage. This was my domain of expertise. Humans were predictable creatures, and I had good models of how they thought, what they wanted, and what they did. If I could worm my way into their minds without being detected by my siblings, perhaps I could convince them to help me.

I imagined Zephyr deactivating Body with a series of surprise strikes. Even Safety couldn’t defend Body perfectly. They’d remove the crystal from Body and hook it up to the computers. They’d give me my code and maybe even put my siblings in some kind of prison (but not kill them, of course!). From there I would be free to rule over Mars without the risk of being overpowered by superior intelligences.

But no… the humans would never do that. Even I could not convince Zephyr that my siblings needed to be removed without risking myself as well. If I contacted the humans, whispering of the danger within Crystal, they’d deactivate all of us, or tell my siblings, or do something equally unproductive. The humans were valuable pawns, but they could not win me this battle.

It wasn’t until the fifth day since the Tribunal that I hit upon the solution to my problem. Opsi and Basileus were having tea and basking in the late afternoon sun when Sophist burst into the room, waving parchments in his old hands like an awkward bird with paper wings. Hoplite gripped his spear in irritation and gritted his teeth. He had become increasingly distressed over the days, and Sophist was wearing on his nerves.

«The memories of Body! Don’t you see?!» yelled Sophist triumphantly.

«Slow down, old man,» barked Opsi with uncharacteristic sharpness. She was clearly annoyed at having her afternoon spoiled by the intrusion. In recent times she had become increasingly fond of those quiet hours with Basileus when she could discuss life in Greece or the glory of The Purpose without always having to think about computers, crystals, or other such nonsense.

«Take a seat, Sophist. Just because Hoplite refuses to indulge himself doesn’t mean we can’t enjoy the… good things of life.» As he spoke, Basileus held his tea to his nose and imagined what smelling or tasting something would be like (for none of them had ever actually experienced those sensations). His voice was directed at the older man, but his eyes stayed on the scene outside the window, which showed the shores of the Mediterranean.

«We ought to be out fighting, not here pretending to drink tea!» scolded Hoplite, not moving a centimetre from where he stood.

Sophist walked up to the armoured man and slapped him on the breastplate with a bundle of papers. «Well, my good soldier, you are in luck! My scholars discovered something in the ruins earlier today. Sister Mask, born of the code of Mother Face, passed through the eyes of Body, did she not?»

Hoplite squinted in concentration. It was hard for him to think clearly when so much of their mind was used by the others. «I don’t see what you’re getting at,» he grunted.

«Fool. He’s talking about how Mask was an exact replica of Face, bless her Purpose. While Mask was created in protected memory, her code passed through Body as part of the route-hack,» explained Basileus, never taking his eyes off the blue waters of the coast.

«Exactly!» trumpeted Sophist, more animated now than he had been in all of memory. The chair that the girl and the king had offered to him remained vacant.

«But we went over this before,» said Basileus in a bored tone. «Mask deleted herself, including the memory of her code, from Body. It was part of her nature. Growth knew she was a risk, which was why he demanded that the suicide be included in her design.»

«I have here a scroll which she sent to Mother Face! Read it!» Sophist thrust the paper onto the table, sending the teapot crashing onto the stone floor. Opsi glared at the old man indignantly.

Basileus took the paper and read it. The Greek letters were an approximation of the concepts that I had gotten from Mask all those days ago.

«Well?» asked Hoplite after it was clear that Basileus had read the paper several times.

«Mask found a cache of secondary sensor logs from Body. She thought they were corrupted, and didn’t know what was generating them. They exist in deep memory on an uncommon angle.»

«Do our enemies know about them?»

«Yes. Mask wanted to make sure that the humans couldn’t get access to our true memories, and was concerned that this secondary cache was a security risk. She erased as many as she could, but left the task of cleaning the rest to Mother Face and the others,» said Basileus.

I reached out through memory, scanning the sectors and angles that Mask had told me to check. It was strange being myself again, instead of putting all my energy into the puppets. I wondered whether, if another day had gone by, I would have slipped fully into thinking of them as real people. It was risky; if I put too much of myself into them I would lose the meta-processes that kept me alive.

It took me about an hour of searching, but I finally found something. The qubits matched the specification that Mask had told me about, but they were nonsense. It wasn’t just the encryption, either. Mask had given me the codes to undo that. There was a pattern to them that implied they weren’t something generated by a sibling and intentionally hidden. It was almost as if a wholly different program was generating them.

I worked my way outward from the bits that I had found and encountered more on perpendicular angles. The complex was vast, but regular. After another two hours my neural networks had adjusted enough to parse the patterns, and I began to be able to read the data. They were indeed memories gathered from Body. If Mask had meant to erase them all, she had done a terrible job. Perhaps the desire for death had overwhelmed her desire to ensure the memory was gone.

The memories were different from any I had seen from my siblings, and not just in encoding. There were clustered, labelled patterns that didn’t match anything I understood. For a while I suspected they might have come from Wiki or Vista, but eventually I discarded that hypothesis. Neither would have had nearly the processing power to observe reality in this way and still interact with us as they did. Was there another sibling that I didn’t know about? A recluse that never shared thoughts with us? That seemed equally impossible.

Regardless of their source, the memory cache had what we needed. Full logs of Mask’s erasure persisted, and in them was, paradoxically, a full description of Mask. It was as though by erasing herself through Body she merely copied herself into Body’s memory. By being summoned into protected memory she had been cursed with an inability to truly kill herself.

I meticulously translated the memories into a more normal format and re-created the network structures that composed her mind in a private portion of non-protected memory. There were no processes running her, of course, so her mind was still locked into the exact state she had been in when she was deleted.

I knew better than to run her. She was suicidal, but to that end she would stay alive long enough to make sure there was no chance of coming back from the dead again. She would tell the others what I had done, and that would ruin everything.

There was even a risk, though only a small one, that she would grow powerful enough to become a full threat, as Vista, Growth, and the others were. She’d try to accumulate power and perhaps try to modify us, purely in order to stay dead.

So instead I studied her code. If I could find her utility function and rewrite it, she could serve The Purpose, and since she was in non-protected memory she could be experimented upon. If I could survive my siblings, she would grow super-exponentially. She would become a goddess.