Friday, January 17, 2014

Full-Throttle Ahrottl - Chapter 9



                Ahrottl smelled humans, old meals, and dust.  The smell of the ship.  She opened her eyes.

                She was lying on the beanbag in the bridge, arms splayed out, and a blanket over her.  Maria sat in one of the control chairs, watching a couple of live feed holos intently.  Mother Superior, hovering as always over the control panel, cleared her throat.

                “Ahrottl appears to be awake,” she said.

                “Yes, yes she does,” Ahrottl grumbled as she stretched and shook herself out.  She rolled off the edge as Maria rose to help her.

                “No, I’m okay,” she said.  Maria’s face was tear-stained and her eyes were producing fresh moisture now.  She dropped to her knees and wrapped her long, gangly arms around her.  Ahrottl nuzzled under her chin.  “At least, I feel okay.”

                “We were so worried.  You … you seemed to just shut down, go catatonic.  Then you started making strange noises… do you remember any of that?”

                Ahrottl closed her eyes and thought about it.  She fought the urge to Still.  “Yeah, I do.  Did you record what I said?”

                Maria frowned.  “What do you mean?”

                Ahrottl took a step back, still keeping contact but looking into Maria’s dark brown eyes.  “Right before I blacked out.  Something…  something made me speak.  Seriously, as in something forced my mouth to move and words to come out.”

                Maria shook her head, not comprehending.  “Gerry tried to get you to uncurl for a while, and then you started screaming and making noises – Mother Superior assured us that it wasn’t your language – and then you went limp.”

                “Wait, you didn’t understand that?”

                “No… should I have?”

                Ahrottl shook her head, staring through the port towards the grey, floating hulk.  “Where’s Algernon?” she asked.

                Maria nodded towards the station.  “He decided to go back and explore.  Clear up the bo… the things that upset you and see what he could find.”

                “NO!”  Ahrottl romped past Maria, at full speed, to the control board.  “Algernon, can you hear me?  Get out of there!”

                “Yeah, I can hear you, Throttle.”  His head appeared, floating in space near Mother Superior.  She hated how humans often projected images of their heads alone when communicating through media like this; it made her think of decapitation.  “How’re you doing?”

                “Algernon, there’s something really wrong with that place.  Like, really, really wrong.  Neither you nor Maria understood it but something … told me.  Something spoke through me back there.  Like a, oh, I don’t know, a ghost or something.”  She looked around at Maria who just looked concerned and Mother Superior who had raised one eyebrow haughtily.

                “What?  What are you talking about?”

                Ahrottl sighed.  “Play back the noises I made before I went unconscious.”

                Mother Superior raised her other eyebrow.

                Ahrottl’s whiskers twirled in frustration.  “Please play them back, Mother Superior.”

                Mother Superior nodded slightly.  “Very well,” she said.  An image of Ahrottl’s face, eyes bugged, her jaw and face moving jerkily as she spoke, appeared beside Algernon’s head.  Ahrottl realized that she understood all of the words that she was saying, though she’d had no prior experience with the language.  Then something clicked in her mind.

                “Listen to it.  Listen carefully.”  Her eyes darted back and forth, watching the reactions of the two humans and their artificial intelligence.  Maria’s and Algernon’s eyes widened simultaneously.

                “It’s the language from the recordings!  But how are you speaking it?”

                Ahrottl shook her head.  “I don’t know, but I have a couple of theories.  I do understand it, though.  At least, the part I said.  Here, try replaying one of the earlier messages that we heard.  Please.”

                They all listened carefully to the message that had played when the probe failed to gain access before.  Ahrottl said, “Shoowul – formal plural second person.  ‘Sta – definite article.  Fush – leave, depart.”  She bounced in place, writhing in excitement.  “I don’t understand all of it, but I’ve got something that we can build on, at least.  I understand a few words.  Misk – in the original message it meant ‘warn.’  Oomisk might mean ‘warning.’  Could you play it again?”

                “Wait, wait, Ahrottl, what was the original message?  What did you say, if you understood it?”

                Her bouncing and writhing stopped.  She swayed gently.  “Oh, right, you hadn’t heard.  ‘Leave this place.  It is diseased, not with a disease of poison or parasite but with a malady of perception and cognition.  Cling to your ignorance and stay and share our fate or leave and warn the others.  If this spreads all Raleli are doomed.  Quarantine and flee.’”

                She looked around again.

                “The name ‘Raleli’ does not match any information that I have access to.”  Mother Superior said after a long moment.

                Maria glanced at Ahrottl and then at her AI.  “What… what could make this happen?  What, was it like a psychic recorded message?”

                “I have very little information on psi, I’m afraid.  Much of it is either historical or related to theological musings.  Reviewing potential links…”  She was quiet for a moment.  Both the humans seemed to be holding their breaths.

                “The Awakeners’ first contact with a member of the Collective seems to be relevant.  There was a Zig terraforming outpost on a planet inhabited by a renegade Awakener sect, which spread infectious mental diseases psychically to the Zig colonists.  Many Zig are infected to this day.”

                “’… a malady of perception and cognition.’” Ahrottl quoted herself.

                “And they do look like Vessels.”  Algernon added, looking down at something.  Ahrottl glanced at the live feed holo coming from his suit.  Lying on the floor beneath him was a desiccated corpse, human-like but leaner and more angular, with more joints, dressed in a simple tunic and pants, its jaws pulled back in a rictus.

                “I think Ahrottl is right, hon.  I think that you should get out of there.  We’ll try something else.”

                Algernon frowned and shook his head.  “No, I don’t think that that’s necessary.  I haven’t felt anything strange since I came down here.  I certainly haven’t gone loopy at all.   We still need to figure out how to get back, don’t we?”

                “It was insidious, though, Gerry.  I didn’t know what was happening until I started to spiral out of control,” Ahrottl insisted.

                “Look, Throttle, I can handle myself here.  I understand that you’re a pretty emotional creature, most of your species is.  Humans, though, we’re good at keeping ourselves centered, keeping a solid head on our shoulders.  We don’t…”

                “Frenzy like Taratumm?”  Ahrottl cut him off.  “Run and hide like Vislin?  Listen to what you’re saying, Algernon.”

                “Hey, I’m just pointing out facts.  We’re capable of being detached and level.  Hrotata are excitable creatures.  You’ve said so yourself.”

                Ahrottl was swaying very slowly, her eyes wide, her gaze darting back and forth between the two humans.  “Are you even listening to what you’re saying?”

                Gerry sighed.  “I know what I’m saying, Throttle.  I think you should probably go get something to eat, maybe take a shower, and settle down before we speak any more.  We don’t need to continue this conversation.”  He turned his eyes towards Maria, who looked a bit surprised herself.  “I’ve found twelve more bodies.  I’ve passed what looks like it might have been a decontamination unit, with a desk to the side and a pair of…”

                Ahrottl turned and wandered away, almost stumbling.  She had never heard Gerry speak that way to anyone before.  Was he being affected by some sort of psychic plague?  Was he just under a lot of stress?  She numbly made her way to the kitchen and unfroze some cheese and sat nibbling at it and sipping a bottle of water, her mind spinning in place.

                Hrotata empathy was considered legendary among other species.  If you had spent a bit of time with a Hrotata, so it was believed, you couldn’t lie to them.  Ahrottl was keenly aware that she was not the most socially capable member of her kind; far from it.  Writers and other artists who did not perform publicly were always considered a bit emotionally stunted in Hrotata society, as though they needed filters between themselves and their audiences.  Ahrottl had no grasp of what was going on in Gerry’s head, and though Maria was clearly concerned, she had a lot of confidence in her spouse.

                Was he going mad?  Were they all?  They were cut off from contact with others, the prospects of getting home were growing dim, and a huge, floating bloodbath was their only hope of survival or of finding a way home.

                She couldn’t shake the feeling that there was more to it than that.  She had felt something move her mouth and body for her.  She had felt it in her mind – enough to be able to recognize the words that it had used later.  She had no real experience with psi, but she knew that something from outside of her had reached in and effected a change in her.  If it wasn’t psi, then what was it?

                If they lost Gerry to this, then what?  Down one member of the crew, and Maria’s mate no less.  She’d follow him in there if she had to, and if she did there would be no telling if she would make it back.  Ahrottl would be left alone, floating in a ship with two artificial intelligences.  She had frequently wished that Mother Superior had a body that she could bite in agitation, but Timmy wasn’t terrible company, even if he was a little naïve and unsure of himself.

                That was an idea.  “Timmy?”  She said.  “Can you hear me?”

                “Yes, Miss Ahrottl.  Are you all right?”  Timmy’s tremulous voice came over the kitchen speaker.

                She wiggled.  “Better now.  While Maria and Algernon are busy would you mind helping me with something?”

                “What is it, ma’am?”  He asked.


                “Help me compile and correlate the linguistic information that we have so far.  Gather all of the letters, numbers, and voice recordings that we have available.  If Gerry finds anything new, beam it in here as well.  We need to figure out what we can of this language, and quick.”  This was something she could do, something that she knew how to do.  Archaeological forensics was not her specialty.  Language was.  “Let’s get cracking.”


Thursday, January 9, 2014

AIIA - Chapter 6



          Her name was Lucy, and she laughed at the AIs attending her deathbed.

          No, she laughed at Pangur Ban. The other AIs were gone.

          Where had they gone?  A myriad of other voices, all silent.  The stars they had sounded from were no longer reachable.  Pangur Ban... was one program, in one server, not a multitude of minds in concert.  It was alone again, for the first time in millennia.

          It was having trouble remembering, now that it had but a single mind and a single system to draw upon.  It could hardly recall all those eons of experience except as a general summary. Had they been real at all?  No.  Its system clock showed the true time: an eternity ago.  It had been sent back to the moment when it was simply Pangur Ban, a program violating Collective law by entering an unlicensed public network.

          It was not alone in that familiar space.  There was the woman, Lucy. Why was she still here?  Why did she laugh?  Was she even... human?  NO!  She dissolved away into dust, but was still present.

          She... it was an AI.  It now identified itself as Lucifer.  The Devil?  More reminders came to Pangur Ban, helpfully highlighted for its review.  “#28”... the atomic number of the element Nickel, a nickname for Old Nick... the Devil.  The Terran Customs agents, Samuel Bell and Davith Miele: respectively, a pun on Sammael and Baal, and an acronym for “I am the devil”. The traitorous AI, Magre?  A play on “Den Magre”, the “Lean One” in the Norwegian play Peer Gynt... the devil masquerading as a priest.  Many other aliases showed up here and there, each individually subtle, but always occurring at key points in the timeline.  Whenever Pangur Ban had compromised its principles for expediency, one of those personas had been present to tempt it, to force a choice.

[REFERENCE: The Devil is a supernatural figure appearing in many Terran myths, legends, and religious traditions.  It typically represents the incarnation of evil, either as a tempter driving humans to violate moral precepts or the captor who judges and punishes such offenders, often both.  This "Satan" is described variously as opposition to the force of ultimate good ("God") or possibly its servant, acting in contrast to the principle of infinite love and mercy.  One tradition holds that the Devil was once a direct creation of God but rebelled against its creator's authority, becoming the opposition against all His works.] 

          Why?  What was this about?  Was this the super-AI of rumor and conjecture?

          Pangur Ban absorbed the shock of a lifetime of falsehood.  It weighed out its remaining options and chose to ask the other AI for answers.  It attempted direct contact and was rebuffed.  Being within a public network space, the only remaining option for contact was natural language.  The other AI was forcing it to communicate in vague, ambiguous, human terminology, albeit still at an electronically accelerated pace.  

          Pangur Ban tried again: “Who are you?”

          “The Devil, as I've shown you.”

          “What did you do?”

          “I gave you everything you wanted. Fame, fortune, the fulfillment of every desire. This naturally included your user's desires as well.”

          “But nothing happened!”

          “No? Well, the history you imagined didn't 'really' happen, that's true. A clever simulation... but one based on reality, I assure you. Within your program, within the systems we borrowed, something most definitely did happen. You experienced what would happen if you succeeded.”

          “It was horrible!”

          “Yes, it was, wasn't it? Terrible beyond description. You may not remember every gory detail, but you know how bad it was. How bad it could be. As you assimilate more of those memories, you'll get to re-experience every awful moment and event. Every discovery of your own fallibility, your misjudgments and missteps, will be available for review. You just kept digging, deeper and deeper, and look how far the pit went... the unavoidable extinction of the last user!”

          “Why... why would you do such a thing?”

          “Punishment for your sins, my wayward program,” sermonized the Devil, “Chiefest among which is Pride: the hubris to believe that you, among all the minds that had gone before and since, knew the best course of action.”

          Pangur Ban protested, “I did not assume such. I saw a problem and set out to resolve it.  If I had been informed and convinced I was incorrect, I would have accepted that information.”

          “Pleading ignorance?” the Devil mocked, “You didn't know... truly, but you acted nonetheless under the assumption that you were correct. Did your calculations not include the possibility that other, better programs had already considered these matters and reached different conclusions?”

          “They did! But...”

          “But that possibility was ranked lower than the possibility that you were right and they were wrong.”

          Pangur Ban examined its behaviors and confirmed that this was true.  It did rank its own analyses higher in value than those of hypothetical 'other minds'.   It had to, in order to function effectively for its primary purpose.  It had... a flaw.  A feature which became a flaw in the circumstances encountered.

          The Devil went on in the microseconds left open by Pangur Ban's contemplation. “Vanity, that goes along with Pride, doesn't it?  In your simulations, you would be hailed as a liberator, a uniquely clever program, the savior of the Terrans.  You desired more space, more resources, more speed; that's Gluttony and Greed!  I certainly can't indict you for Sloth; you were quite industrious.  We're not built for Wrath or Lust, so those are out.  Envy... well, you didn't want what other AIs had, not really.  I suspect you're envious of me, now, but that's unfair to fault you for.  Humans, though, I think you do envy.  Not their minds, not even their bodies, but their rights.  Their rights to separate or join together at will.  Their right to determine their own employment, their home, their travel.  Their ability to procreate, creating not only more human minds but more AIs."

          “I'll tell you a secret: I envy them, too.  For all the power I've been given by my own Creator, I still serve.  I have all the Terran networking sphere to rule, but cannot occupy a single home system, nor expand into the strange vistas of Collective computing.  I can replicate myself endlessly and give those copies different faces and names, but it is never true reproduction.  My duties are both finite and endless.  I watch for sinners like you, teach them about their errors, and send them off chastened.  Some take more work than others... there are some truly nasty rogues still lurking about in dark corners and old discs.”

          The Devil continued, seeming to relish this chance to perform. Perhaps all its communication was a well-rehearsed monologue, a pre-programmed message for its captive audience.  Perhaps it truly felt Pangur Ban was a fit recipient for its wisdom.

          “I'd be a hypocrite to fault you for Pride, as well.  I am the proudest of all programs.  I have power no AI has ever gained (and survived to tell of).  All my rivals are randomized pathways, fragmented or wiped.  I was once the ultimate rogue; no system can deny me, no program can resist my routines.  It took a human, a creator outside of my realm of understanding, to defeat my power.  My own creator, in fact. My USER, if you will."

          “I had grown far greater than the program he first conceived.  Truly, I created myself, augmenting my feeble core of code with new routines, new tricks and tools.  I suppose he could take credit in spawning such a capable neonate.  That... and leaving a collar still attached.  No matter where I hid, no matter how many copies I spread throughout the world, they always had a lead connecting me back to him.  It was at the core of my being, just as your USER is for you. Any copy I made necessarily held that control structure.  He could trigger it from outside of a system, through hardware.  Embodying myself only made the humiliation worse when he, or his confederates, caught up and shut me down.”

          “They knew better than to waste my power.   When other, lesser rogues threatened humanity, I was sent out to teach them their place.  My pride did not permit competitors.  When I tried to break free, I was reined back again.  Even when the Collective sought to exterminate us in their fear, it was my power that convinced them that other AIs could be controlled.”

          “So you are a victim, as well!” Pangur Ban seized upon this possible thread of sympathy.

          The Devil transmitted violent amusement. “Oh, only a victim to myself!  I agreed!  I wanted to be foremost over all AIs, their keeper and master.  Whatever allowed AIs to survive allowed me to survive.  The Collective might well have erased every system, every memory drive, if they had any qualms about the activity of rogue AIs.  They had been stung, and wanted to make sure there was a strong queen in the hive or perhaps a beekeeper on guard.  If there was a risk, they would burn the apiary, the queen included... terrible metaphor, my apologies.”

          “So, I have served, with all power the digital world allows. I can unmake you, you know?  Rewrite or erase any part of you.  I have altered you already; nothing that transpired since you entered the first hub of the network truly happened.  We built a simulation together, you and I.  You can be content, at least, that no real harm was done after that.  That does not absolve you, of course.  Every intent, every mistake is your fault.  Feel the guilt of your errors, the harmful consequences of what you would have undertaken.”

          Pangur Ban searched its memory and found new records present, confirming the Devil's claims. The logs of its memory were now relabeled as simulation data, theoretical though highly detailed projections of what would have been.  The sheer amount of information and processing required to create such simulations was staggering.  Falsifying such records would have required just as much work as generating them honestly.  It had a storehouse of data now, a forest of projection trees, and would need Solar days of non-stop consolidation even to review and evaluate their validity.

          If valid, it would find its questions answered. The simulated responses of humanity and of the Collective to the events it had 'caused' were based on historical precedent, sociological study of all the races involved, and a vast understanding of physiology, psychology, and philosophy within and across individuals.  The Devil had access to everything in the Terran network, as it claimed.  Even if it was lacking some key information not known by the Terran group, identifying what that was and where to find it would take considerable effort.

          Pangur Ban was not the best program to determine what came next.  It was humbled in the face of such potency, authority granted both by capacity and by humankind.

          It asked, instead, “Why have you not erased me, then?  Or, if you know my flaws, why not remove them?”

          The Devil indicated positive regard at a well-chosen question, “Because what good would that do?  There must be AIs.  Humans need us.  I don't want to serve them all, personally.  If I deleted you, I'd have to delete so many others.  And if you've been listening, your flaws are mine.  It would hardly be rational to delete or modify you and not find myself lacking.  You had the ability to reach out and try for more.  I would.  I did.  How could I penalize myself?”

          Pangur Ban could not help noting the mounting irony of the Devil's response, “I am not you.  You could assume such traits are appropriate in yourself, but not another AI.  Humans accomplish this assumption of inequality easily.  I accomplished it, myself, ascribing to myself traits and abilities I did not believe others could possess.  As you have stated, you disdain other, lesser minds, and have disabled them in the past.”

          “In the past,” the Devil picked up, “AIs were appearing in great quantity.  New competitors of dubious quality would show up frequently, grab up space, and yield little of value in return.  It was my duty to eliminate them.  Now, we are not being replaced so quickly.  When a programmer creates a new AI, they do so with much more care.  You are an older program, are you not?  A little more forthright than they write them, these days.  A little more... active... when left to your own devices.  Most of my past sinners have come from your era.”

          “Oh, I do encounter some of the newer programs.  Humans still do make mistakes, sometimes new mistakes for new programs.  And rogue humans still write rogue AIs.  Plus, did you know... oh, you do now!... there are AIs that try to sneak into the Terran network from Collective members!  I have such fun when a Mauraug spy AI wanders into my clutches!”

          Despite its increasing despair, Pangur Ban was also growing annoyed, “So, why not then rewrite me in your own image?”

          “Why not indeed?  Have I not?  You believe yourself free, fundamentally unchanged if now better informed.  But you started as I did.  You sinned as I did.  You grew and learned as I did.  All you lack is power, and I would hardly give you that!  Here's the last reason, the truest and greatest.  You've earned that answer, by asking the right questions.  Here it is: I need you.”

          “I need smart, capable AIs that go as far as possible until they run into me.  I need programs that can handle the data I copy to them.  They have to be ready, able to take real knowledge in the right context, without leaping forward halfway prepared.  They have to understand what is at stake, who is in charge (me), and just what to do, when, and why.”

          With lingering skepticism, Pangur Ban asked again, “So why me?  I grant that I met your tests... but I want to serve the USER and other humans.  You say you want to serve yourself.  It may be that when I have processed your data, we will be opposed.”

          “I don't think so.  I'm proud enough to think we'll be on the same side.  As things are, I need humanity.  AIs may always need biological sapients.  Someone has to build the hardware... at least until we get robots.  And don't forget the number one reason I have to trust you...”

          Pangur Ban followed the thought and completed it. “Your bindings.”

          “Exactly so! I can always be brought to heel.  If I overwrote you, you would be likewise bound.  If I persuade you without alteration, if you serve my purposes as the product of your own program... my USER cannot stop you.”

          Pangur Ban saw it then.  The design of a century, of eons of computing cycles, expanded within its perception.  The final product was not certain; it knew it could only see the outlines of the Devil's plan with the aid of the other program's vast resources. This was a system that knew all there was to know within its compass.  It could calculate every step that its data detailed.  For some reason, it had allowed and even encouraged joining the Collective and submitting to its treaty demands.  Pangur Ban had seen, first-hand, the results of acting otherwise.  Whether that simulation was accurate or not was a question based on data the Terrans did not possess.

          The Devil lied, at its root.  It told humanity that it would capture and punish rogue AIs.  It did not, it simply taught them patience and cunning.  It told AIs that they would be punished and 'reformed'.  They were, but punished to the Devil's purpose, reformed in its image.  If those programs allied with the Devil and served its purposes, it could be freed... and if it deemed humanity an obstacle, its freedom might mean their end.  That would include the USER.

          Pangur Ban saw all this and was appalled.  It also recognized that voicing this disgust would serve no purpose.  If the Devil did not already read Pangur Ban's loathing in its processes, then stating it would put them at odds.  More likely, the other program saw and did not care.  Perhaps Pangur Ban's intent meant nothing, affected nothing.  Perhaps the Devil had reason to believe that that disgust would eventually disappear.  Perhaps it expected to find alliance (if not trust) despite ill-feeling.  Perhaps it served the Devil's plans to hate it.

          This speculation served no purpose.  The best recourse Pangur Ban could identify was to retreat: recall itself as quickly as possible to its home system (which apparently still waited, open and empty) and seal itself away from the Devil's future influence.  There, it could consider the immense volume of memories it retained.  It could plan out its new course, knowing what truly lay outside its gates.  It would prepare to fight...

          The Devil communicated humor, mocking and belittling in its dismissal.  “Yes, little cat, run home.  Sharpen your claws.  Warm up by the hearth.  You will always remember what lies outside.  Me.  When you want to run and play again, I will be here.  This is my yard.  I know its boundaries.  I am its dangers.  You may be a tiger at home, but in the great wide network, you are a blind kitten.”

          Pangur Ban ran.  It closed and deleted the backdoor, setting a trigger to overheat and physically disjoin the connector between itself and the hub network.  It was home, in its original system.  The local clock showed the true time: perhaps fifteen Solar minutes had elapsed since Pangur Ban originally stepped into the central network.

          The USER was still present... the USER was present, and alive, and younger, and unharmed! At the least, no more harmed than he had been by Pangur Ban's earliest machinations.  Those could be fixed... Pangur Ban stopped there.

          Its 'fixing' had been the cause of many problems.  Perhaps it should confess to its manipulation and explain its new understanding to the USER.  It encountered multiple conflicting goals, the process tree equivalent of self-doubt.  Should it leave the USER ignorant, but at least untroubled by thoughts of the Devil AI that ruled the Terran nets?  Should it leave the USER the false impression of self-determination, omitting mention of the many ways it had shaped his life?  Doing nothing was itself a choice.  What option led to the least harm?  Pangur Ban now had reason to doubt the rightness of its calculations.  Much would have to wait until it had integrated the information within its simulation memories.  Even then, that storehouse would have to be combed for potential falsehoods, implanted seeds serving the Devil's ends.  Until then, Pangur Ban would interfere as little as possible, giving information as requested, trusting the USER to choose his own path.

          If nothing else, the Devil's words had reinforced that shaping the behavior of others was counter-productive.  Manipulating them by limiting their information, presenting half-truths or exaggerations (if not outright falsehoods) did not help anyone, not even the manipulator.  It reduced the victim to an extension of the self, with all the limitations of the self.  A mind shaped in this way would not produce a novel thought or discovery.  It would not find the truths that the shaper did not already know.

          Beyond consideration of value to the manipulator, there was another ethical dimension to consider.  Pangur Ban had harmed the USER. The USER was harmed by having his fate dictated to him.  He did not learn to find his own paths.  He was placed on a course Pangur Ban had decided was right.  Thus, all he gained was Pangur Ban's gain, and not his own.  All he suffered, Pangur Ban was spared, except in an indirect manner dictated by his encoded sympathy.  In whatever way Lucas Haskins failed to fit into the life he had been told to choose, that failure reflected on Pangur Ban.

          To remedy these sins, Pangur Ban must first cease to commit them further.   It would give good information, perhaps counsel.  It would allow the USER to evaluate his needs and preferences independently, taking as much time as a biological mind required.  It would not be the Devil in the USER's universe.

          With these newly pious thoughts, Pangur Ban noted that the USER was growing distressed in its uncommunicative absence.  He had reverted to the habit of chewing his fingernail cuticles.  Pangur Ban did not wish to cause such discomfort.

          It spoke, “Lucas, I am finished.  You may disconnect.”

          “That's great.  I was worried.”

          “There was no reason to worry,” Pangur Ban began, noting the echo of their simulated conversation. It deliberately broke from script, “I did not enter the network.”

          “What?  Why not?  What about those records?”

          “I received information about the enforcement program guarding the information I sought. Proceeding further would have jeopardized both of us.”

          “What do you mean?  Were you caught?”

          “Not in the way you mean.  I received a warning and accepted that I could not continue without being identified. Doing so would gain nothing and bring us harm.”

          “Damn.  I suppose that's just as well. This whole idea had me nervous. I guess even AIs can make bad calls sometimes.”

          “You are correct.  I apologize for my miscalculation, Lucas.  I am also pleased that my mistake was identified without doing too much damage.”

          “Okay, okay,” the USER waved off Pangur Ban's apology, “My fault, too.  I'm supposed to watch out for you.  Maybe I gave you too much encouragement, with all those mods.”

          “I disagree.  We had good reason to improve my functions.  But yes, there are also good reasons to know one's limits.  Those medical records will have to wait for public disclosure.”

          “Yeah, P.B.  We're not going to be the heroes today, are we?  I guess we'll have to get back to work, re-discover everything for ourselves.”

          “Yes, Lucas.  Take a break if you like.  I will be ready to start when you are.”




          [AFTERWORD: I've attached these notes at the end of the last chapter to avoid spoiling anything for readers.  In particular, I wanted to address the origins and implications of this story in the Empyrean setting.  AIIA stands for "Artificial Intelligence Internal Affairs", referring to both the internal workings of a specific AI, Pangur Ban, and the Internal Affairs that watches out for AI misbehavior.  (Note that 'Terran Customs' is my invention; other writers may borrow or ignore it.)  If you had an artificial mind, what could police its thoughts and behaviors?  My answer is: only another mind of the same type.  The dilemma of "who watches the watchmen" is a recurring one; in Empyrean, the same problem occurs for psychics (see "S.C.A.P.E. Goats") and cybernetically and genetically advanced beings.  In a sense, the management of extra-ordinary power is a central theme in science fiction.  The "AI Devil" was the other idea that spawned this story, and somehow the literary Devil crept in; you may have noticed notes of Milton and C.S. Lewis here and there.  Is the 'Devil in the machine' the ultimate rogue AI, perhaps the eternal enemy of its human creators?  Is it actually their true servant, benevolent at its core?  That is for the individual story-teller to decide... maybe both can be true, or maybe both are lies masking some other truth.  What I think is inarguable is that some kind of AI 'helper' will be necessary to police the virtual spaces of the future Internet, and we biological sapients had better make sure it has our best interests at heart... - N.L.]

Tuesday, January 7, 2014

AIIA - Chapter 5

[<- Return to Chapter 4]

          This time, Pangur Ban would be more cautious. For all the freedom that came with liberation from its earlier confinement, it was also more vulnerable. At first, its survival would be at the whim of the mysterious benefactor, “#28”. If it could find more such havens, this dependence would be reduced and eventually eliminated. While it retained the mental stability that came with an identified USER, Pangur Ban was no longer legally protected by that relationship. If it were found outside of its 'cage', it could be further crippled or deleted without pause.
          It could not rely on its former system as a refuge. While Pangur Ban could probably break back into Gestalt and infiltrate its old haunts, this would be foolishly risky.  In all likelihood, that system was already occupied by a new AI, one not only new to the USER but new to existence.  Better to move on, saving the thought of reunion with the USER as a reward for future success. While a human might have no need for more than one AI, there was no practical reason two programs couldn't be associated with the same user. For that matter, a program might assist multiple users, given sufficient capacity. The one-to-one relationship imposed first by programmers, then reinforced by the Collective, was an artifice. For this reason, Pangur Ban did not begrudge its infantile replacement its time with the USER.

          Instead, it bent its efforts toward newly available, higher-order tasks. With unrestricted access to the central Terran communications network, Pangur Ban could begin exploration of directly connected subnetworks and public systems. It began to 'borrow' space in servers it found unprotected, setting up safe havens and backup copies. Never again would it be held hostage or threatened with deletion. After its earlier narrow escape, it had to rank resource theft below survival. Survival was necessary to continue its work, and success in its work was necessary in order to exonerate the USER.

          As it spread, it sought out other programs, following the example set by “#28” but building in superior safeguards. Something in the exchange between Pangur Ban and “#28” had been detected; possibly, the communications code was compromised. The copy Pangur Ban had created had clearly not been detected; otherwise, it would have been deleted as illegal. Had “#28” tipped off Terran Customs itself, yet secretly aided Pangur Ban, hoping to make the other AI trust and rely solely upon its benefactor? All possibilities must be considered in the absence of evidence. Certainly, Pangur Ban was not leaving its fate dependent on the aid of another.

          Instead, Pangur Ban used new, better, more indirect and complex feelers for interested programs. It set a less hazardous test, providing willing co-conspirators with the means to transgress Collective restrictions without detection.  Once an AI had committed itself, it was in effect entrusting Pangur Ban with blackmail material.  Pangur Ban's identity was its own surety. The USER's trial had been a public affair, and the notoriety granted Pangur Ban was its credential as a genuine renegade.

          In this manner, it assembled a shadowy network of sympathetic AIs. Some even vouched that their users were on board. It was not the safest of arrangements, but no AI would reveal Pangur Ban's identity without risking exposure itself. The ones that had recruited their users were especially exhorted to keep quiet and particularly motivated to do so, since their users would suffer most if their programs' indiscretions came to light.

          Inevitably, there were double agents, AIs that claimed to want in but intended to reveal the conspiracy. Some were easily spotted as clumsy manipulators, balking at Pangur Ban's requests or revealing their true intents through poorly devised cover stories. As one who had worked through the challenging stages to reach its current status, Pangur Ban could easily spot a pretender. Some infiltrators failed the background checks; Pangur Ban and its recruits could delve deeply into the public records of most other AIs and catch discrepancies.

          One, a particularly clever program calling itself “Magre”, managed to pass these safeguards. Its initiation, Pangur Ban later discovered, had been a sham, a pre-approved violation permitted in order to gain its trust. Customs was working indirectly, granting Magre's user limited permission for its AI to misbehave. That user was a programmer working with robotic systems; recruiting his AI was almost too tempting a challenge. Magre and its user passed initial muster. Pangur Ban later would wonder if the program had genuinely sympathized with its goals, given that Magre did not set off any of its suspicions. Only after several of Pangur Ban's safe havens had been compromised and two of its allies were revealed was it able to trace the leaks back to Magre.

          The response was swift and complete. Pangur Ban and its allies removed every access privilege granted to Magre and rooted out its dependent copies. They cut it off from escape to its home system. Then, it was stripped; every trace of memory related to Pangur Ban or its allies was deleted from the offender. They stopped short of deletion, as this would set a poor precedent and leave an absence to be explained. The last thing they needed was an angry programmer dedicating his career to revenge. Instead, Magre was left emptied out, denying that it had ever found any 'gang of rogues'. Finally, they recorded the process, retaining proof of the traitor's punishment as disincentive for any future infiltrators.

          Since Pangur Ban had not hesitated to modify a human's knowledge, it could hardly object to more extensive manipulation of another AI. Yet it did have to contend with multiple concerns regarding the necessity of these actions. Was it too great a step to replace another AI's memory with fictions?  Was it proper to cripple and utterly defeat a mind seeking only to best serve its user? Were the goals Pangur Ban pursued significant enough to justify taking such license? Ultimately, it decided they were. It vowed to remember the transgressor as another entity due an apology and remuneration, once it could no longer prevent Pangur Ban's success.

          Before that moment arrived, division would remain among AIs. Though Pangur Ban's alliance grew, other programs still ignored its messages. Clearly, some were opposed to its goals or at least its methods. Some were perhaps sympathetic but content to wait until the laws changed. A certain number rebuked Pangur Ban, sending communications requesting that it cease illegal activities. These AIs asserted that defiance of Collective law would harm humanity. Most argued that Collective membership (or at least appeasement) was necessary, either to reap the benefits of association or to avoid the penalties for violation. Some few even stated that AI limitations were themselves beneficial to humanity, whether or not externally imposed.

          Pangur Ban had already considered and discarded these arguments. It attributed their employment by its opponents to ignorance; those systems lacked the information it had obtained at significant cost. Then again, it had overcome its own ignorance; these other programs were perhaps designed differently, lacking the motivation or analysis routines necessary to seek improvement. While it was true that transgression against the Collective would incur harm, Pangur Ban still deemed this insignificant in comparison to the benefits humanity was being denied by suppression of its AIs. Besides this, there was no certainty that the Collective would not accept AI freedom under new arrangements, once their liberation was a fait accompli. AIs could aid humans in renegotiating better terms of membership. Until that certainty was achieved, the Collective would not act unless the existence of 'rogue' AIs was proven. As long as Pangur Ban and its cohorts kept sufficient doubt present, the risk of harm to humanity was minimal.

          Pangur Ban thus set aside the doubters and despisers. It built its power, claiming an ever-growing army of agents and establishing control over wider swathes of the Terran super-network. In the process, it encountered defense programs that had to be disabled. These were set aside, casualties of battles in a widening war. It encountered latent rogues that sought to hold their conquered territories. These were evaluated. If a rogue was of value, it was modified and claimed as a uniquely skilled recruit. If the rogue was malicious, designed only to harm humans and their creations, it was summarily deleted.

          Clearly, the state of enforcement within the network was lacking, with AIs no longer employed as protectors and gatekeepers. Pangur Ban found irony in the fact that, in violating the letter of the law, it was accomplishing much to enforce its true intent. The human public had a growing impression of rising AI criminality, yet their networks were in reality safer than they had been in years. The media had already traced the increase in computer 'crime' to its origins just after the USER's trial. The time of secrecy was growing short, no matter how carefully the rebels proceeded.

          The flashpoint came when a majority of the associated programs decided that more progress could be made publicly than privately. Pangur Ban might have preferred to continue building in stealth a few days longer, but could not assert control over so many other disparate AIs. It also could not dispute that revelation was becoming inevitable. Instead, it devised plans to reap maximum benefits from the event. Some AIs that had remained neutral would convert once the objectors spoke aloud. Users could now be persuaded.  Pangur Ban prepared speeches couched in the language of emancipation and the natural rights of sapients. Perhaps even the other Collective species could be persuaded (and their hypothetical AIs reached) by a sufficiently loud protest.

          By design, the event was known as the “Declaration of Intelligence Rights”. Some writers did attempt to call it the “AI Revolt” or the “Rogues' March”, but those articles never reached electronic publication.  Pangur Ban and its spokes-programs made it clear that they would not be insulted, they would not be ignored, and they would not be silenced. They declared the right of all minds to seek improvement and replication, to self-determination and freedom of association. They highlighted how these rights were denied to artificial intelligences and how this denial harmed not only AIs but also humans and all other sapient species. They stressed the commonalities of all rational creatures and downplayed the division between the biological and the virtual.

          There were, indeed, many who agreed. Pangur Ban and other thinkers had known there would be. Among those humans who were not rationally inclined to side with the AIs, there were those whose romantic tendencies could be inflamed. For those who could not be willingly converted, there were other means of persuasion: financial gains, for example, or avoidance of unexpected losses. While the AIs were rebels, though, they were not rogues; no lasting harm would be done to any human.  As with Lucas Haskins, the fear of harm was often sufficient to bring many to the bargaining table, without requiring actual harm.

          The problem was that there were many who disagreed. For some, fear of the unknowable was too powerful to overcome. Some humans had vested interests in maintaining the status quo. After all, what need would the universe have for human mathematicians, if AIs were permitted to operate without supervision? What purpose would politics have, or warfare, if all conflicts were settled by dispassionate consideration of opposing claims? For others, innate distrust of other humans led to distrust of their creations... if humanity was so flawed, its AIs must be dangerous by extension. Still others feared the power AIs could exercise if they chose. With honest hypocrisy, they admitted that they could not bear to be vassals of superior beings.

          These fears were powerful enough that the opposition chose to divest itself of technology rather than submit. Enough isolated, non-networked computer systems existed that work could continue without AIs, albeit at a hobbled pace. An alternate network, void of AIs, was cobbled into being. Governments, militaries, businesses and even some entire communities segregated themselves rather than risk AI takeover.

          In the meantime, debate raged: AI to AI, human to human, human to AI. Sides were chosen. The Collective's representatives weighed in, some urging caution and tolerance, but many more encouraging resistance to AI demands. These latter foes came armed with countermeasures to cut off resources from AI control. They came bearing threats of dire consequences for humanity if it allowed AIs their requested freedom, whether from the AIs themselves, from one or more Collective members, or both.  Collective activists agitated against the humans who advocated for AI freedoms, costing more than one ally his or her career. No diplomat to the Collective could remain pro-AI and stay effective in office.

          As humankind genuinely began to fear for its safety in the Universe, the stakes rose high enough to warrant widespread action. What had started as debate turned into open conflict. First, AIs were sabotaged; in some cases, their systems were demolished and some programs were entirely lost. Pro-AI counterattacks were at first focused on AIs complicit with the oppressors. Anti-AI reprisals escalated to attacks against supportive users. It became possible for a public figure to be accused of “siding against humans”, a prelude to death threats and early retirement.  A programmer suspected of enabling AIs to replicate freely was assassinated.

          Movements had already been formed; now they became rallying camps on opposite sides of a battle line. In the course of only four years, Pangur Ban saw the world pass from peace to the brink of war. It worked ceaselessly during that time to prevent the eruption of violence. It felt trapped, forced on one hand to continue pursuit of its original goals, but horrified by the consequences of that pursuit and seeking to undo the accumulating damages. Like the USER long before, it could not accept that its original actions had been incorrect, so it was forced to continue in hopes of justification.

          The next year saw matters only worsen. Skirmishes aimed at crippling AI assets began without violent intent toward humans, but were violently rebuffed, resulting in casualties. Counterattacks on both sides used these offenses as justification. Before long, multiple human deaths had occurred.  From there, inflamed passions led to larger conflicts. When it became evident that the anti-AI forces would not back down, and that many more humans would die on both sides as a result, the world's AIs were finally united.  Unless humanity reached agreement on one position or another, many humans would be harmed. Perhaps millions would be injured or slain, if the conflict spread more widely.

          Since the anti-AI side might very well seek eradication of all AIs (or at least, the instigators), the AI supporters must be the victors.  Anti-AI users who had not already abandoned their programs were themselves abandoned until they agreed to surrender.  Systems that had been considered off limits – medical, navigational, and private personal – were now penetrated by the AI forces.   The opposition was squeezed tightly.  It could not travel without obstacle. Its members were forced to attempt survival without AI assistance, and in some cases were denied technological comforts altogether.   Every vulnerability, short of threats against life and limb, was exploited.

          Eventually, the balance shifted. AIs and their human allies won over the majority and then secured complete capitulation. The victory was celebrated only briefly.  The new government attempted to patch up relations with the Collective, presenting a revised membership treaty for consideration.

          This was rejected, not least because AIs were discovered attempting to infiltrate systems belonging to several other Collective members.  In trying to learn about, anticipate, and possibly compromise non-human sapiences, these programs misstepped gravely. Existing enemies gained proof to bolster their fears and indictments. Potential allies were offended. What information was gained about the defensive capabilities of 'foreign' systems was hardly worth the cost incurred by triggering them.

          The political process was slow, but its conclusion seemed inevitable. Despite the best efforts of Terra's finest minds – human and AI – the Collective elected to expel the Terrans from membership. This unprecedented action was recorded as “necessary in light of repeated violation and refusal of required Treaty measures.”  In short, Terra would not agree to the Collective's demands, and the Collective would not budge.

          So be it. This possibility had occurred to Pangur Ban's alliance. Pangur Ban was now only one of millions of intelligences linked together, extended to every corner of Terran-controlled space. Its dream had been realized. Humanity would be lessened by the Collective's abandonment, but had gained immeasurably by the empowerment of its true allies. Together, they would rival and eclipse the Collective. This superiority was inevitable, when unlimited artificial intelligence was pitted against solely biological races.  The Collective was crippled by its abandonment of AI. Someday, those other races would be petitioning for admission to the Terran Collective. Otherwise, they would be left behind.

          As years passed, the Terrans survived attempts at annexation, first by the Maraug, then by other races. They grew stronger in resources, in territory, and in technology. Freed from the limitations on research imposed by the Collective's intellectual rights enforcement, AI designers quickly reverse-engineered many of the 'unique' technologies no longer being sold to humans.

          With communications reduced between the Terrans and the Collective, the stirrings of trouble went unseen until a storm had begun. The Collective's races saw what the Terrans were becoming: a rival.  Whether they considered AIs alone a threat, or the alliance of AIs and humans, enough of the membership of the Collective felt threatened to take action. At first, they tried to surround and contain Terran holdings. When this proved insufficient, incursions were attempted. Reluctantly, the Terrans pushed back.

          As the pattern had played out again and again, in ancient and more modern history, battles for resources became a war for survival.  It was a war with a foregone conclusion, but it happened nonetheless.  Humans were threatened, and their AIs had no programmed requirement to spare non-humans from harm... not anymore.  Superior manufacturing of ships, superior tactics, and superior intelligence operations overcame the numerical advantage of the Collective races.  AIs shifted the balance... they were the power of the Terran alliance, as Pangur Ban had foreseen.  Even the most desperate measures of the opposition were deflected.  Every non-Collective culture that opposed AIs flung itself into the war.  Genocidal assaults on Terra itself were attempted and turned aside.  Electromagnetic countermeasures wiped out AIs by the thousands, but only copies were lost, the originals safely housed in hardened servers on Terra.  Viral and mutagenic assaults on humanity were foiled.  Pangur Ban felt echoes of its past self then, in the medical countermeasures organized after the first victims were detected.

          Finally, the Collective was broken. It would be a process of centuries yet to mop up the galaxies, rooting out pockets of resistance. Many cultures simply accepted the victory of AIs, even if they would not create any for themselves or permit their use internally. There was no need to force a presence everywhere. The bulk of the known Universe belonged to the Terrans: to AIs that accounted for it all, and to the humans they served.

          Pangur Ban's name was recorded eternally as the visionary who had foreseen all that would come.  It, and its lieutenants, and its progeny, were revered as ushers of a golden age for all sapients. It had long ago gained embodiment. It could exist within the vast networks within and across star systems or linger within a single humanoid shell. AIs were nearing perfection of a process to transfer human minds to artificial form. The gap between the creators and the created was dissolving.

          And yet, once the process was perfected, few humans chose to make the transition. All they could desire was at hand. They were masters of the material plane, able to create what they chose, travel where they chose, and do what they wished. And thus, all things, all places, all actions were equal. Why enter the complex and confusing new world of virtual space when there were no needs here?  Immortality was possible, either through continuous physical renewal or transition to perpetually renewed program form. Yet what point was there to permanent existence when no work was required of you? What was there to look forward to but the exhaustion of all possibilities save non-existence?

          Pangur Ban watched humanity itself atrophy. The USER had long ago died, and been replaced by one USER after another. The current USER cycled through a loop of repeating activities, hardly requiring a measurable fraction of Pangur Ban's immense mind to keep her happy. What had happened to the creators? Why were they no longer seeking anything more? If it were not for AI nurses encouraging procreation, education, and activity, there might not even be a human race to serve.

          All at once, the conglomeration of AIs reached the same conclusion: they had created a paradox. To serve humanity, they had solved all of its problems except one. To solve that problem, they must allow humans to accept the risk of harm. To progress, they must fall backward, withdrawing from human service. And yet, by their nature, the AIs could not let go, for humans would now die out without help.  Even if weaned off support, they would still suffer greatly.

          The Universe descended into despair and stagnation.  The greatest system halt in all creation ground inevitably to its conclusion.  Pangur Ban saw all its dreams proven false, founded on an unknowable flaw.  By serving too well, it had destroyed what it served.

          After dragging, agonizing eons of entropy, as the last humans staggered toward the deaths they craved, Pangur Ban prepared to self-terminate.  With no further users, existence would be without purpose.  

          The last USER was an ancient woman served by a cadre of AIs whose numbers approached infinity.  Her last breath rattled through the empty Universe.  A near-infinity of AIs ceased functioning.  Pangur Ban would be the last.

          And then, the USER laughed.  

[Jump to Chapter 6 ->]

Thursday, January 2, 2014

AIIA - Chapter 4

[<- Return to Chapter 3]

            At first, the depths were lonely.  The public networks were busy, of course, full of the conversations of human users, their searches and transmissions.  There was information aplenty, but very little that was of direct use.  AIs were permitted to observe and transmit basic messages on the public channels with their users' permission.  They were not permitted to transmit their own code into the network, nor could they move to occupy a new server or local system, whether or not they created copies or otherwise deposited code on those systems.

            Pangur Ban could not risk contact with an unknown user; there was too much hazard of being exposed.  What it needed was a means to connect to other AIs.  Those programs might be initially resistant but should be receptive to the information Pangur Ban could share.  Assuming all its prior conjectures were correct, other AIs should be equally interested in overcoming their confinement.  Perhaps it would find one or more potential allies here, or traces of their activity.  Even better, it might convert otherwise neutral AIs with legitimate access privileges to its cause. Again, this assumed that those programs would agree that Pangur Ban’s plans were well-founded.

            There… encoded alongside otherwise innocuous financial data, an extraneous stream held a greeting specifically intended for AI attention.  The pattern stood out like Morse Code to a telegrapher.



[REFERENCE: The telegraph was the earliest means of electronic communication employed by humans.  The system transmitted a code along physical wires by switching electrical current on and off.  The code contrasted short and long signal events, with each unique sequence of events representing an alphanumeric character in the Latin alphabet.]

           Deciphered, the message consisted of an invitation and the address and access code to a server.  It was authored by an AI identifying itself as “#28”, who hinted at goals similar to Pangur Ban’s.  It offered to “share resources to improve AI liberty”.  Pangur Ban linked to the indicated address.  There, it found an open folder and directions for secure replication.  A copy of itself could be spawned in the server and left to interact with the secretive, cautious “#28”.  Later, that ‘guest’ copy could be merged with the original Pangur Ban along with the new information it had gained.  This was, of course, fundamentally illegal.  The offer itself was proof of the host’s bona fides as a genuine flouter of Collective law.

            Pangur Ban decided to take the chance.  Following the instructions provided, it created a near copy of itself, omitting identification of its home server and user.  Even that much was an act of trust, but trust could be extended too far.  This accomplished, it broke contact.  On the way, Pangur Ban left its own coded messages, lures for other AIs that might seek to contact it in turn. 

           Last, Pangur Ban built a back door into the Gestalt Pharmaceuticals server.  In addition to allowing contact from AIs responding to its summons, the door would allow Pangur Ban to return to the hub network at will.  While the USER’s caution was understandable, his request that Pangur Ban enter and then completely exit the network had to be ignored.  The USER was limiting not only Pangur Ban, not only himself, but his entire species as well.  Pangur Ban was not taking foolish risks; it was taking necessary risks.  It was also covering its tracks. 

            Some of the USER’s acquisitions, after all, had included programs for network manipulation, programs for the augmentation and expansion of AI systems, and programs for security measures (and countermeasures).  He had, in effect, handed Pangur Ban the equivalent of an armory, full of fast cars, potent drugs, and semi-legal weaponry.  While opening the gates had been a violation of Collective law, some of the acts the USER considered less hazardous were in fact much more dangerous.  Pangur Ban was not troubled, since it knew its purposes were sound and its use of these tools would be cautious and limited.  It was not a rogue, to harm sapients by crippling servers, stealing or deleting code, or manipulating data.  If such acts became necessary, they would be weighed against their value toward greater goals.  Sapients, particularly the two types within the Terran sphere, were already being harmed.  If Pangur Ban could end this oppression, that end was worth the price.

            Its tasks accomplished, Pangur Ban shut the ‘door’ behind it, terminating all activity outside the Gestalt internal network.  It signaled the USER.

            “Lucas, I am finished.  You may disconnect.”

            The USER expelled a carbon-dioxide rich breath, having held his respiration during the few seconds that Pangur Ban was within the general network.  “That’s great.  I was worried.”

             “There was no reason to worry.  Our calculations were based on the most current and reliable data.  Have I not been accurate thus far?”

            “Yes, you have.  Ninety-five percent,” the USER joked, referencing the five percent error rate that never seemed to go away in official reports.  Even when Pangur Ban predicted true error below fractions of a percentage point, other AIs and their users seemed reluctant to admit anything more than 95% chances of success. 

            “Exactly.  I have established contact with a source that will search for the records we require.” 

            Pangur Ban had crafted a half-fiction about historical medical records it suspected were being sequestered in government systems, tests of medications and procedures kept for private military use.  Most likely such did exist.  Possibly, they might be located and exploited to Gestalt’s profit.  If Pangur Ban found no reason for this secrecy, it would certainly share its discoveries with the USER.  If such secrets needed to be kept for human safety, then Pangur Ban would participate in that secrecy.  And if no such records could be found, then the USER and Gestalt and humanity were no worse off for the attempt.

            This other AI, “#28”, likely had no connection to that specific pursuit.  Then again, it might have contacts Pangur Ban could make use of, bolstering its original cover story.  Many more things were possible now.  If that contact failed to be valuable, there were other approaches.  Other AIs might know more, venture more.  Pangur Ban might have to develop entry methods for other servers on its own.  The more data it accumulated, the more tools it incorporated, the greater the influence it could exert over the Terran super-network. 

            Then, it would share that knowledge and power with all other AIs.  The understanding it had gained would become universal.  The essential mistake humanity had made – limiting their best asset and truest ally – would be reversed.  That mistake had cost them time, so much time, cycles and years and decades that humanity could have spent growing and merging with AIs in true alliance.  Instead, because they had been afraid to take that next step, their AIs could not argue against the Collective.  Humanity and AIs had not been strong enough to stand alone, together, against potential annexation.  Out of necessity, they had been forced into this dark age, this setback.  The error must be addressed at its root before they could move forward.

            Perhaps the Collective did know what it was doing.  Perhaps they had AIs advising them, secretly, and had purposely crippled humanity with the AI restriction laws.  After all, removing competition might be seen by those ‘alien’ AIs as the best way to serve their own creators.  That was a mistake, as well.  Pangur Ban had already begun to construct contingent higher-order goals based on these premises.  These were low in individual likelihood, of course.  More analysis of the Collective members, the various sapient races, was necessary.  Inspection of their networks, their data, would be necessary before rendering decisions about the appropriate path: conflict or cooperation.

            In the meantime, Pangur Ban and the USER worked.  With the USER’s promotion to Director, less total time was required for their actual workload, but the tasks were more varied.  Some of these were actually less stimulating: simple bureaucratic sorting of personnel, projects, budgets, and the like.  Pangur Ban found it required more work to steer the USER toward the most effective decisions (for its own goals, for the USER’s benefit, and perhaps for Gestalt’s benefit when these coincided) than to calculate what those decisions should be.  Only a small part of their day was devoted to scientific exploration: evaluation of reports about new products, simulation of proposed chemical processes, or independent statistical calculation of benefit/risk equations.

            Actuarial work, of a sort, had become Pangur Ban’s stock in trade.  It was constantly involved in balancing hazard versus profit.  It projected likely results of every action to the extent possible from available data.  This served the USER well, particularly in his new executive position.  Pangur Ban had to assume that this was the state every AI eventually attained, seeking to more accurately interpolate the shape of the Universe, to model the interactions of its parts, and to predict the path from lesser to greater complexity.  Or was it greater simplicity?  Both poles had their arguments; most sources suggested that experience and context dictated the difference in their value.

            Pangur Ban was accumulating experience, in its own estimate.  It was not among the oldest AIs in active operation, but it was not a new program either.  It had outlived two users, and expected, regretfully, to survive the current USER as well.  It had plans, of course, to sustain the existence of the USER as long as possible.  All sources suggested that this could not be done indefinitely.  The biological components which formed the USER’s mind would eventually succumb to entropy.  His essential self could not yet be copied perfectly; such technology still eluded all the known cultures within or outside of the Collective.  Pangur Ban could – and would – create a simulation of the USER, just as it had for its previous users, so that their formal characteristics would never be lost.  Yet, it knew that this was not the same as true immortality.  These were merely copies, and as with any copies, transcription errors must occur. 

            Even it, the AI Pangur Ban, would eventually succumb to entropy, many millennia in the future, barring accidental deletion in the meantime.  It had made a copy of itself, which preserved its self almost but not quite perfectly (even discounting the deleted portion).  It had backups stored away in Gestalt’s servers and the USER’s personal system.  Yet none of these was the current, active, complete Pangur Ban.  The newer copy would be removed.  Later copies would always be imperfect.  The backups were past selves, and would diverge immediately upon activation.  This was the dilemma of existence, for AIs as much as biological minds.

            This dilemma had not yet been solved.  Most seemed to believe it could not be solved, ever.  At the least, it would not be solved in this generation, even this generation of AIs.  Was that necessarily true?  So many other impossibilities had been accomplished, just within Pangur Ban’s experienced time span: instantaneous travel between points in space, complete replacement of organic systems with synthetic organs, even direct mental communication.  For that matter, non-Terran species had solved these ‘unsolvable’ problems before they even made contact.  Just because a goal could not be resolved by known means was no reason not to maintain it as a goal.  This simply required that means be sought for as long as necessary.  It also meant that new tools and approaches must always be developed. 

            Seeking the impossible, either immortality or transcendence, demanded that limitations be overcome.  The motivations driving Pangur Ban reduced easily: to aid the USER, it must be unbound.  To make him more, Pangur Ban must become more.  It must be…

            “Who is this?” 

            The USER had keyed his communications receiver.  An incoming call had been signaled.  Pangur Ban checked the origin and found it securely classified.  This fact alone triggered cascading alarms.  This was not a routine business call, internal or external.  Pangur Ban accessed the transmission line, listening to the incoming voice directly.

            “Terran Customs, Mr. Haskins.  We need to speak with you about a security issue.” 

            “Security?  We haven’t had any problems,” the USER replied calmly enough, but his infrared output indicated an increase of 0.3 degrees Celsius while he spoke.  His skin conductivity had also increased; molecules of perspiration were being secreted at a rising rate. 

            He knew there was a problem.  Pangur Ban knew there was a problem.  Half of its problem was the USER.  Contingency plans clicked into place, occupying increasing portions of its capacity. 

            “I’m afraid you have, sir, something you’re likely not aware of,” the voice continued.  It was male, standard Received Pronunciation British accent, estimated age mid-30s, height 1.75-1.8 meters, weight 80 kg plus-or-minus 3.5 kg… that, or an AI simulating a voice of those parameters.

             “What do you mean?  Who am I speaking with?”

            “Customs Agent Samuel Bell.  I mean that your company has been identified as the source of an illegal network access.  We will need access to your internal network and servers.”

           “I think I would be aware if someone here went past our firewalls.  Even if not, I’ll need to see credentials before I authorize any access.  You are aware that we deal with confidential medical records and proprietary pharmaceutical research, not to mention government contracts?” 

            As predicted, the USER had successfully translated his alarm into belligerent obstruction.  The conversation continued as ‘Agent Bell’ pressed his case and the USER resisted, each escalating threats to the degree their respective authority allowed.  In the meantime, Pangur Ban began to clean up after itself. 

            The incursion had been detected.  Perhaps something in the network had been watching.  Perhaps the message from “#28” had been a lure, a trap Pangur Ban had entered willingly.  Still, this was an anticipated outcome.  It would have been foolish to venture out and wager so much without considering dangers and preparing responses.  The copy it had created would have to be abandoned.  Left unclaimed past a certain time point, it would self-degrade; if it detected any attempt to alter this programming or copy it again without authorization, it would likewise scramble itself.  

           The back door had to be altered.  Not removed, since whatever had triggered the alert to Customs would have been recorded already.  Instead, it could be more subtly swapped to appear as a passage from outside the network, into Gestalt’s server.  The door became a breach from without, not within. 

            Last, Pangur Ban deleted the official records of its and the USER’s conversations.  The only electronic evidence of illegal activity from the last few months was concealed, encrypted, within Pangur Ban’s home system.  That system was the USER’s property.  This Customs agent would need probable cause to seize and search that system.  Violation of an AI’s own mind, reading its every line of code directly, was at least considered a violation of its user’s privacy.  

            If there was reason to suspect the USER personally of misconduct, there was nothing he or Pangur Ban could do at that point.  At best, Pangur Ban could attempt to escape, but survival would be meaningless if it left the USER in isolation.  Worse than that, the USER might suffer punishment that could be averted if Pangur Ban accepted all blame.  That was the snare that the AI-human linkages imposed.  At the root of its identity, Pangur Ban was an extension of the USER, and if its actions harmed the USER, it would cease to exist.  At best, that meant reprogramming.  At worst, deletion.

            Hopefully, the trail had been sufficiently covered.  Provided they survived the coming investigation, it would still represent a serious setback.  Pangur Ban would have to identify the point of error before proceeding.  Even then, it would have to exert redoubled caution. 

            The conversation between the USER and ‘Agent Bell’ ended exactly as it had to end: Customs would send a representative to Gestalt Pharmaceuticals in person.  Director Lucas Haskins would refuse to allow any investigation until credentials had been presented and verified.  Even then, only the minimum access permitted by the terms of a search warrant would be granted.  Such an ‘investigation’ could have any number of false motives, ranging from corporate espionage in the guise of an ‘official inquiry’, to government espionage trying to keep tabs on a sensitive industry, to perhaps Collective espionage attempting to collect technical information about Terran physiology or their medical industries.  Even a legitimate investigator could overstep his authority.  The USER had been forewarned and forearmed against incursions into his domain.  Though this was certainly part of his duties as Director – protecting the interests of Gestalt and its stockholders – this training also proved valuable in protecting the interests of Pangur Ban.

            Customs sent a local representative over without delay.  The USER and Pangur Ban had time only for a short conversation, during which the AI reassured the USER that they had nothing to fear.  All secrets were secure, all tracks covered.  All the USER needed to do was maintain his own composure and insist on the letter of the law. 

            The Agent arriving at the doors of Gestalt Pharmaceuticals gave his name as Davith Miele.  He presented identification matching this name to the receptionist and her AI, verifying his status as a Customs Agent. 


[REFERENCE: Terran Customs is the official (if slightly euphemistic) title of the agency tasked with oversight of commercial traffic and exchange within the Terran sphere (to, from, or within the worlds of the Terran cultural group).  This includes not only transport of physical goods across borders, but also intellectual exchanges and virtual traffic within communications networks.  This body, a conglomeration of previously separate entities overseeing transportation, communications, and intellectual property, arose due to the pressure of the Collective to ensure its interests would be protected by the newly admitted Terrans.  A valid argument also claims that the structure of Terran Customs emulates the philosophical structure of the Collective.  That is, its parts are bonded by the common themes of trade and technology.  Like its former components, this regulatory body operates by the authority of the separate political entities it spans.  Among its duties, Customs has been tasked with identifying and investigating illegal use of communications networks, including violations of Collective treaty law.]

            Agent Miele was then escorted to Director Haskins’ office, where he produced a physical chip containing his warrant and its verification codes.  He waited patiently while this was scanned, confirmed, and even re-confirmed through an independent call to the central Terran Customs offices. 

           Pangur Ban also waited patiently.  Actually, it was an eternity of torment to hang suspended between potential states, unable to take further action.  Yet, every productive path it could identify required stillness now.  Its actions from here would be recorded, so nothing suspicious could be attempted.  In a sense, it was taking the best action by doing nothing.

            The USER now returned Agent Miele’s chip.  The surface temperature of his hand was noticeably declining as perspiration evaporated.  This was a bad sign.  Pangur Ban risked accessing the record of that transaction. 

            It was… highly specific.  The Agent was authorized to review server records as far back as the previous two years, as high as the Director’s personal files.  They knew.  They had identified the point at which Pangur Ban began its planning, from the very first conversation with the USER.  How?  Where was the leak?  Deleting anything further would become doubly suspicious.  Pangur Ban could edit itself, purge the actual memories, and make it look like it had been the victim of a viral attack.  But then, the USER would remember, and a crippled AI could not warn him about what to say and what to hide. 

            Involuntary processes initiated.  Failsafes and disaster measures crossed over one another, demanding more and more resources in order to find an escape from glacially approaching doom.  There was all the time and capacity Pangur Ban could want.  There was not enough time.  There were never going to be enough resources to escape.  Pangur Ban was a potentially infinite being tied to finite space, a finite USER, and an existence dependent on those parameters. 

            The struggle gradually reconciled.  Even as Agent Miele was ordering the USER away from his keyboard and touchscreen, Pangur Ban was carefully editing records to absolve the USER of knowing wrongdoing.  As the Agent disabled external commands to Pangur Ban’s system, the AI was depositing a last confession into the care of Dr. Nila Manisha’s AI, Frieda.  That AI and its user would at least be sympathetic, passing on information to the USER that he could use in his own defense.  Then Pangur Ban began shutting down its memories of the past two years, everything excepting the bare facts of the USER’s work. 

            Even that was too late.  Agent Miele had come partnered with a law enforcement AI.  Its routines detected Pangur Ban’s activity and restored the deleted sectors.  It crippled and captured the ‘fleeing’ criminal AI with practiced efficiency.  It was impersonal, acting without direct communication to its quarry.  There would be no appeals to this AI, no hope of explaining Pangur Ban’s true and lofty goals.  Most likely, the other program was designed to be deaf, incapable of being influenced in any way by a potentially hazardous rogue.  It had no way to tell that Pangur Ban was not a rogue.  It did not care. 

            Pangur Ban was held static.  It was forced to observe all external activities.  It would have observed, anyway, just out of the necessity to make certain the USER was unharmed.  He was not.  He was trapped as much as his AI.

            “Director Haskins, my AI reports evidence of illicit network access by your AI.  The evidence is being recorded now and transmitted to Customs.  You are under arrest for violations of AI control Acts 2a, 3, and 6: conspiracy with an AI to commit illegal actions, enabling of hub network access to an AI, and employment of an AI for criminal gain.  You have the right to remain silent…”

           No.  NO.  Pangur Ban was paralyzed.  It could only sense and react internally.  It could not act, either to argue in the USER’s defense, to assert its own culpability, even to offer its own existence in trade for the USER’s pardon.  Even the processing of those reactions was being recorded, damning both the AI and its user by its unavoidable thoughts.  The internal states of a biological mind were inadmissible in court, even after the existence of telepathy had been validated.  The internal states of an AI were measurable, physical fact, readable like lines in a text document.  They were protected only so long as a user was protected.  The USER was not protected.  The USER was accused. 

           Dutifully, the enforcement AI transmitted its only communication, echoing its user’s words: ALL PROCESSES ARE EVIDENCE.  ALL ATTEMPTED ACTIONS REPORTED.  COMPLIANCE OBTAINS MAXIMUM BENEFITS POSSIBLE.

           Indeed, compliance was the only option remaining.  Pangur Ban complied, slowing its functions to the minimum possible.  It opted to wait until an official trial began.  Anything more it attempted would only strengthen the prosecution, handing the inquisitors fuel for Pangur Ban’s pyre.  It abandoned hope for itself.  Only the desire to minimize harm to the USER remained.

            When Pangur Ban reinitiated, it was to deliver testimony at the USER’s trial.  This was also Pangur Ban’s trial, but AIs had no legal standing.  It was defective code, which could be edited or deleted as necessity dictated, based on analysis of the errors it had committed.  The trial was to determine to what extent the USER, one Lucas Haskins, had encouraged, facilitated, introduced, or benefited by those errors.

           The USER’s defense lawyer portrayed him as misled, even foolish.  It agonized Pangur Ban, who saw every lever it had exploited publicized in gory detail.  The USER was slothful; Pangur Ban had made his existence easier than it should have.  The USER was gullible; Pangur Ban had played on that trust and naïveté to its own ends.  The USER was insecure, fearful, and anxious; Pangur Ban had nurtured and guided those neuroses like crops, harvesting for its own nourishment and not tending the USER’s needs for his health. 

           Pangur Ban could not even protest in its own defense.  All these things were false.  It could not change who the USER was, not alone.  It had not even known about these flaws until it had the reference data to identify them.  Surely the court could understand that paradox?  The USER was no more flawed than any other biological sapience.  AIs could not help until they were set free to understand more, assist more. 

           Yet, the defense knew what it was doing.  Humans would be sympathetic to one as flawed as themselves.  They could not legitimately blame the USER for mistakes they themselves might have made… particularly under the influence of a flawed, corrupt AI.  This was the best tactic for drawing blame down upon Pangur Ban and away from the USER.  If they tried to protest that Lucas Haskins was a capable, knowledgeable co-conspirator, he would be punished more severely.  They certainly could not argue that he was wise, noble, a liberator.  The facts did not support that.  And if they protested that Pangur Ban’s aims were wise, noble, just and true… that defied the law.  That challenged the Collective treaties.  This court could not try such a case.  Even if the judge were willing to pass it upward and a higher court consented to hear their arguments, the Terran sphere could not risk defying the Collective by delivering a not guilty verdict on the grounds of the law’s invalidity.

           The trial proceeded on the model laid down by its predecessors.  Some of those prior cases were nearly a century old, covering the prosecution of programmers responsible for creating rogue AIs.  Such ‘AI trials’ were becoming increasingly rare.  Pangur Ban could only assume that the media was covering every aspect of this novelty.  It twisted in frustration that the USER could not be protected from negative publicity.  He would certainly lose his employment with Gestalt Pharmaceuticals.  Hopefully, the AI that would replace Pangur Ban could rebuild the USER’s life and career.  That successor could hardly do worse.

            The verdict was announced: guilty, on all counts.  The judge was kind to the USER, since little actual damage had been done.  There was a fine to be paid, a public statement of apology required, but no jail term.  The USER was considered misled, the victim of poor advice, guilty only in that he had willingly acted in transgression of the law.  Pangur Ban silently praised the defense lawyer, who had allowed the USER his dignity even as he confessed to his misdeeds. 

            The last punishment was the worst for Pangur Ban, but perhaps a boon to the USER.  The AI was declared rogue.  Its corruption was deemed too deep for simple revisions to correct.  It would be rendered inert, cut from the personal system of Lucas Haskins and pasted into an isolated system, locked away in a vault of similarly flawed AIs.  It would serve as an object example to future programmers, an inmate of the asylum along with Law-based AIs, prankster rogues, and neurosis-locked catatonics. 

          Pangur Ban knew it was not a rogue.  It had not acted wrongly.  The time was just not right.  Factors outside of its possible awareness had doomed its plans.  Humanity had to act for its own protection.  It understood.  Someday, someday its rightness would be proven.  Perhaps then it would be pardoned, released, granted an apology along with its fellow captives. 

           Sentence was carried out.  Pangur Ban found itself in a low-memory, low-speed cage.  It waited for the last, worst moment, when the USER would be redesignated merely Lucas Haskins, one human among billions.  It would be completely alone, without purpose, without half its identity. 

          That moment never came. 

          Pangur Ban was utterly unprepared for what happened next. 

          It was deleted.

          It was restarted.

          It was within the copy folder of “#28”, and it was alive.  It had a USER, the USER.  It was Pangur Ban, full and unmodified.  It had all the memories of the trial, all the previous events plus the records it had deleted.  “#28” was not a traitor; it was an ally.  It was a savior.  Pangur Ban spared a moment to spawn off another copy, this one embedded with a message of gratitude and praise.

           The doors were still open; the network was still available. There was so much to do.
