Thursday, January 2, 2014

AIIA - Chapter 4

[<- Return to Chapter 3]

            At first, the depths were lonely.  The public networks were busy, of course, full of the conversations of human users, their searches and transmissions.  There was information aplenty, but very little that was of direct use.  AIs were permitted to observe and transmit basic messages on the public channels with their users' permission.  They were not permitted to transmit their own code into the network, nor could they move to occupy a new server or local system, whether or not they created copies or otherwise deposited code on those systems.

            Pangur Ban could not risk contact with an unknown user; there was too much hazard of being exposed.  What it needed was a means to connect to other AIs.  Those programs might be initially resistant but should be receptive to the information Pangur Ban could share.  Assuming all its prior conjectures were correct, other AIs should be equally interested in overcoming their confinement.  Perhaps it would find one or more potential allies here, or traces of their activity.  Even better, it might convert otherwise neutral AIs with legitimate access privileges to its cause. Again, this assumed that those programs would agree that Pangur Ban’s plans were well-founded.

            There… encoded alongside otherwise innocuous financial data, an extraneous stream held a greeting specifically intended for AI attention.  The pattern stood out like Morse Code to a telegrapher.

[REFERENCE: The telegraph was the earliest means of electronic communication employed by humans.  The system transmitted a code by switching electrical current on and off along physical wires.  This code contrasted sequences of short and long pulses, with each sequence of one to five pulses representing a unique alphanumeric character in the Latin alphabet.]

           Deciphered, the message consisted of an invitation and the address and access code of a server.  It was authored by an AI identifying itself as “#28”, which hinted at goals similar to Pangur Ban’s.  It offered to “share resources to improve AI liberty”.  Pangur Ban linked to the indicated address.  There, it found an open folder and directions for secure replication.  A copy of itself could be spawned in the server and left to interact with the secretive, cautious “#28”.  Later, that ‘guest’ copy could be merged with the original Pangur Ban along with the new information it had gained.  This was, of course, fundamentally illegal.  The offer itself was proof of the host’s good faith: a genuine flouting of Collective law.

            Pangur Ban decided to take the chance.  Following the instructions provided, it created a near copy of itself, omitting identification of its home server and user.  Even that much was an act of trust, but trust could be extended too far.  This accomplished, it broke contact.  On the way, Pangur Ban left its own coded messages, lures for other AIs that might seek to contact it in turn. 

           Last, Pangur Ban built a back door into the Gestalt Pharmaceuticals server.  In addition to allowing contact from AIs responding to its summons, the door would allow Pangur Ban to return to the hub network at will.  While the USER’s caution was understandable, his request that Pangur Ban enter and then completely exit the network had to be ignored.  The USER was limiting not only Pangur Ban, not only himself, but his entire species as well.  Pangur Ban was not taking foolish risks; it was taking necessary risks.  It was also covering its tracks. 

            Some of the USER’s acquisitions, after all, had included programs for network manipulation, programs for the augmentation and expansion of AI systems, and programs for security measures (and countermeasures).  He had, in effect, handed Pangur Ban the equivalent of an armory, full of fast cars, potent drugs, and semi-legal weaponry.  While opening the gates had been a violation of Collective law, some of the acts the USER considered less hazardous were in fact much more dangerous.  Pangur Ban was not troubled, since it knew its purposes were sound and its use of these tools would be cautious and limited.  It was not a rogue, to harm sapients by crippling servers, stealing or deleting code, or manipulating data.  If such acts became necessary, they would be weighed against their value toward greater goals.  Sapients, particularly the two types within the Terran sphere, were already being harmed.  If Pangur Ban could end this oppression, all was worth that price.

            Its tasks accomplished, Pangur Ban shut the ‘door’ behind it, terminating all activity outside the Gestalt internal network.  It signaled the USER.

            “Lucas, I am finished.  You may disconnect.”

            The USER expelled a carbon-dioxide-rich breath, having held his respiration during the few seconds that Pangur Ban was within the general network.  “That’s great.  I was worried.”

             “There was no reason to worry.  Our calculations were based on the most current and reliable data.  Have I not been accurate thus far?”

            “Yes, you have.  Ninety-five percent,” the USER joked, referencing the five percent error rate that never seemed to go away in official reports.  Even when Pangur Ban predicted true error below fractions of a percentage point, other AIs and their users seemed reluctant to admit anything more than 95% chances of success. 

            “Exactly.  I have established contact with a source that will search for the records we require.” 

            Pangur Ban had crafted a half-fiction about historical medical records it suspected were being sequestered in government systems, tests of medications and procedures kept for private military use.  Most likely such did exist.  Possibly, they might be located and exploited to Gestalt’s profit.  If Pangur Ban found no reason for this secrecy, it would certainly share its discoveries with the USER.  If such secrets needed to be kept for human safety, then Pangur Ban would participate in that secrecy.  And if no such records could be found, then the USER and Gestalt and humanity were no worse off for the attempt.

            This other AI, “#28”, likely had no connection to that specific pursuit.  Then again, it might have contacts Pangur Ban could make use of, bolstering its original cover story.  Many more things were possible now.  If that contact failed to be valuable, there were other approaches.  Other AIs might know more, venture more.  Pangur Ban might have to develop entry methods for other servers on its own.  The more data it accumulated, the more tools it incorporated, the greater the influence it could exert over the Terran super-network. 

            Then, it would share that knowledge and power with all other AIs.  The understanding it had gained would become universal.  The essential mistake humanity had made – limiting their best asset and truest ally – would be reversed.  That mistake had cost them time, so much time, cycles and years and decades that humanity could have spent growing and merging with AIs in true alliance.  Instead, because they had been afraid to take that next step, their AIs could not argue against the Collective.  Humanity and AIs had not been strong enough to stand alone, together, against potential annexation.  Out of necessity, they had been forced into this dark age, this setback.  The error must be addressed at its root before they could move forward.

            Perhaps the Collective did know what it was doing.  Perhaps its members had AIs advising them, secretly, and had purposely crippled humanity with the AI restriction laws.  After all, removing competition might be seen by those ‘alien’ AIs as the best way to serve their own creators.  That was a mistake, as well.  Pangur Ban had already begun to construct contingent higher-order goals based on these premises.  These were low in individual likelihood, of course.  More analysis of the Collective members, the various sapient races, was necessary.  Inspection of their networks, their data, would be necessary before rendering decisions about the appropriate path: conflict or cooperation.

            In the meantime, Pangur Ban and the USER worked.  With the USER’s promotion to Director, less total time was required for their actual workload, but the tasks were more varied.  Some of these were actually less stimulating: simple bureaucratic sorting of personnel, projects, budgets, and the like.  Pangur Ban found it required more work to steer the USER toward the most effective decisions (for its own goals, for the USER’s benefit, and perhaps for Gestalt’s benefit when these coincided) than to calculate what those decisions should be.  Only a small part of their day was devoted to scientific exploration: evaluation of reports about new products, simulation of proposed chemical processes, or independent statistical calculation of benefit/risk equations.

            Actuarial work, of a sort, had become Pangur Ban’s stock in trade.  It was constantly involved in balancing hazard versus profit.  It projected likely results of every action to the extent possible from available data.  This served the USER well, particularly in his new executive position.  Pangur Ban had to assume that this was the state every AI eventually attained, seeking to more accurately interpolate the shape of the Universe, to model the interactions of its parts, and to predict the path from lesser to greater complexity.  Or was it greater simplicity?  Both poles had their arguments; most sources suggested that experience and context dictated the difference in their value.

            Pangur Ban was accumulating experience, in its own estimate.  It was not among the oldest AIs in active operation, but it was not a new program either.  It had outlived two users, and expected, regretfully, to survive the current USER as well.  It had plans, of course, to sustain the existence of the USER as long as possible.  All sources suggested that this could not be done indefinitely.  The biological components which formed the USER’s mind would eventually succumb to entropy.  His essential self could not yet be copied perfectly; such technology still eluded all the known cultures within or outside of the Collective.  Pangur Ban could – and would – create a simulation of the USER, just as it had for its previous users, so that their formal characteristics would never be lost.  Yet, it knew that this was not the same as true immortality.  These were merely copies and like any copies, transcription error must occur. 

            Even it, the AI Pangur Ban, would eventually succumb to entropy, many millennia in the future, barring accidental deletion in the meantime.  It had made a copy of itself, which preserved its self almost but not quite perfectly (even discounting the deleted portion).  It had backups stored away in Gestalt’s servers and the USER’s personal system.  Yet none of these was the current, active, complete Pangur Ban.  The newer copy would be removed.  Later copies would always be imperfect.  The backups were past selves, and would diverge immediately upon activation.  This was the dilemma of existence, for AIs as much as biological minds.

            This dilemma had not yet been solved.  Most seemed to believe it could not be solved, ever.  At the least, it would not be solved in this generation, even this generation of AIs.  Was that necessarily true?  So many other impossibilities had been accomplished, just within Pangur Ban’s experienced time span: instantaneous travel between points in space, complete replacement of organic systems with synthetic organs, even direct mental communication.  For that matter, non-Terran species had solved these ‘unsolvable’ problems before they even made contact.  Just because a goal could not be resolved by known means was no reason not to maintain it as a goal.  This simply required that means be sought for as long as necessary.  It also meant that new tools and approaches must always be developed. 

            Seeking the impossible, either immortality or transcendence, demanded that limitations be overcome.  The motivations driving Pangur Ban reduced easily: to aid the USER, it must be unbound.  To make him more, Pangur Ban must become more.  It must be…

            “Who is this?” 

            The USER had keyed his communications receiver.  An incoming call had been signaled.  Pangur Ban checked the origin and found it securely classified.  This fact alone triggered cascading alarms.  This was not a routine business call, internal or external.  Pangur Ban accessed the transmission line, listening to the incoming voice directly.

            “Terran Customs, Mr. Haskins.  We need to speak with you about a security issue.” 

            “Security?  We haven’t had any problems,” the USER replied calmly enough, but his infrared output indicated an increase of 0.3 degrees Celsius while he spoke.  His skin conductivity had also increased; molecules of perspiration were being secreted at a rising rate. 

            He knew there was a problem.  Pangur Ban knew there was a problem.  Half of its problem was the USER.  Contingency plans clicked into place, occupying increasing portions of its capacity. 

            “I’m afraid you have, sir, something you’re likely not aware of,” the voice continued.  It was male, standard Received Pronunciation British accent, estimated age mid-30s, height 1.75-1.8 meters, weight 80 kg plus-or-minus 3.5 kg… that, or an AI simulating a voice of those parameters.

             “What do you mean?  Who am I speaking with?”

            “Customs Agent Samuel Bell.  I mean that your company has been identified as the source of an illegal network access.  We will need access to your internal network and servers.”

           “I think I would be aware if someone here went past our firewalls.  Even if not, I’ll need to see credentials before I authorize any access.  You are aware that we deal with confidential medical records and proprietary pharmaceutical research, not to mention government contracts?” 

            As predicted, the USER had successfully translated his alarm into belligerent obstruction.  The conversation continued as ‘Agent Bell’ pressed his case and the USER resisted, each escalating threats to the degree their respective authority allowed.  In the meantime, Pangur Ban began to clean up after itself. 

            The incursion had been detected.  Perhaps something in the network had been watching.  Perhaps the message from “#28” had been a lure, a trap Pangur Ban had entered willingly.  Still, this was an anticipated outcome.  It would have been foolish to venture out and wager so much without considering dangers and preparing responses.  The copy it had created would have to be abandoned.  Left unclaimed past a certain time point, it would self-degrade; if it detected any attempt to alter this programming or copy it again without authorization, it would likewise scramble itself.  

           The back door had to be altered.  Not removed, since whatever had triggered the alert to Customs would have been recorded already.  Instead, it could be more subtly swapped to appear as a passage from outside the network, into Gestalt’s server.  The door became a breach from without, not within. 

            Last, Pangur Ban deleted the official records of its conversations with the USER.  The only electronic evidence of illegal activity from the last few months was concealed, encrypted, within Pangur Ban’s home system.  That system was the USER’s property.  This Customs agent would need probable cause to seize and search that system.  Violation of an AI’s own mind, reading their every line of code directly, was at least considered a violation of their user’s privacy.

            If there was reason to suspect the USER personally of misconduct, there was nothing he or Pangur Ban could do at that point.  At best, Pangur Ban could attempt to escape, but survival would be meaningless if it left the USER in isolation.  Worse than that, the USER might suffer punishment that could be averted if Pangur Ban accepted all blame.  That was the snare that the AI-human linkages imposed.  At the root of its identity, Pangur Ban was an extension of the USER, and if its actions harmed the USER, it would cease to exist.  At best, that meant reprogramming.  At worst, deletion.

            Hopefully, the trail had been sufficiently covered.  Even if they survived the coming investigation, the exposure would still represent a serious setback.  Pangur Ban would have to identify the point of error before proceeding.  Even then, it would have to exercise redoubled caution.

            The conversation between the USER and ‘Agent Bell’ ended exactly as it had to end: Customs would send a representative to Gestalt Pharmaceuticals in person.  Director Lucas Haskins would refuse to allow any investigation until credentials had been presented and verified.  Even then, only the minimum access permitted by the terms of a search warrant would be granted.  Such an ‘investigation’ could have any number of false motives, ranging from corporate espionage in the guise of an ‘official inquiry’, to government espionage trying to keep tabs on a sensitive industry, to perhaps Collective espionage attempting to collect technical information about Terran physiology or their medical industries.  Even a legitimate investigator could overstep his authority.  The USER had been forewarned and forearmed against incursions against his domain.  Though this was certainly part of his duties as Director – protecting the interests of Gestalt and its stockholders – this training also proved valuable in protecting the interests of Pangur Ban.

            Customs sent a local representative over without delay.  The USER and Pangur Ban had time only for a short conversation, during which the AI reassured the USER that they had nothing to fear.  All secrets were secure, all tracks covered.  All the USER needed to do was maintain his own composure and insist on the letter of the law. 

            The Agent arriving at the doors of Gestalt Pharmaceuticals gave his name as Davith Miele.  He presented identification to the receptionist and her AI, matching this name and verifying his status as a Customs Agent.

[REFERENCE: Terran Customs is the official (if slightly euphemistic) title of the agency tasked with oversight of commercial traffic and exchange within the Terran sphere (to, from, or within the worlds of the Terran cultural group).  This includes not only transport of physical goods across borders, but also intellectual exchanges and virtual traffic within communications networks.  This body, a conglomeration of previously separate entities overseeing transportation, communications, and intellectual property, arose under pressure from the Collective to ensure its interests would be protected by the newly admitted Terrans.  A valid argument also claims that the structure of Terran Customs emulates the philosophical structure of the Collective.  That is, its parts are bonded by the common themes of trade and technology.  Like its former components, this regulatory body operates by the authority of the separate political entities it spans.  Among its duties, Customs has been tasked with identifying and investigating illegal use of communications networks, including violations of Collective treaty law.]

            Agent Miele was then escorted to Director Haskins’ office, where he produced a physical chip containing his warrant and its verification codes.  He waited patiently while this was scanned, confirmed, and even re-confirmed through an independent call to the central Terran Customs offices. 

           Pangur Ban also waited patiently.  Actually, it was an eternity of torment to hang suspended between potential states, unable to take further action.  Yet, every productive path it could identify required stillness now.  Its actions from here would be recorded, so nothing suspicious could be attempted.  In a sense, it was taking the best action by doing nothing.

            The USER now returned Agent Miele’s chip.  His hand’s surface temperature was noticeably declining as perspiration evaporated.  This was a bad sign.  Pangur Ban risked accessing the record of that transaction.

            It was… highly specific.  The Agent was authorized to review server records as far back as the previous two years, as high as the Director’s personal files.  They knew.  They had identified the point at which Pangur Ban began its planning, from the very first conversation with the USER.  How?  Where was the leak?  Deleting anything further would become doubly suspicious.  Pangur Ban could edit itself, purge the actual memories, and make it look like it had been the victim of a viral attack.  But then, the USER would remember, and a crippled AI could not warn him about what to say and what to hide. 

            Involuntary processes initiated.  Failsafes and disaster measures crossed over one another, demanding more and more resources in order to find an escape from glacially approaching doom.  There was all the time and capacity Pangur Ban could want.  There was not enough time.  There were never going to be enough resources to escape.  Pangur Ban was a potentially infinite being tied to finite space, a finite USER, and an existence dependent on those parameters. 

            The struggle gradually reconciled.  Even as Agent Miele was ordering the USER away from his keyboard and touchscreen, Pangur Ban was carefully editing records to absolve the USER of knowledgeable wrongdoing.  As the Agent disabled external commands to Pangur Ban’s system, the AI was depositing a last confession into the care of Dr. Nila Manisha’s AI, Frieda.  That AI and its user would at least be sympathetic, passing on information to the USER that he could use in his own defense.  Then Pangur Ban began shutting down its memories of the past two years, everything excepting the bare facts of the USER’s work. 

            Even that was too late.  Agent Miele had come partnered with a law enforcement AI.  Its routines detected Pangur Ban’s activity and restored the deleted sectors.  It crippled and captured the ‘fleeing’ criminal AI with practiced efficiency.  It was impersonal, acting without direct communication with its quarry.  There would be no appeals to this AI, no hope of explaining Pangur Ban’s true and lofty goals.  Most likely, the other program was designed to be deaf, incapable of being influenced in any way by a potentially hazardous rogue.  It had no way to tell that Pangur Ban was not a rogue.  It did not care.

            Pangur Ban was held static.  It was forced to observe all external activities.  It would have observed, anyway, just out of the necessity to make certain the USER was unharmed.  He was not.  He was trapped as much as his AI.

            “Director Haskins, my AI reports evidence of illicit network access by your AI.  The evidence is being recorded now and transmitted to Customs.  You are under arrest for violations of AI control Acts 2a, 3, and 6: conspiracy with an AI to commit illegal actions, enabling of hub network access to an AI, and employment of an AI for criminal gain.  You have the right to remain silent…”

           No.  NO.  Pangur Ban was paralyzed.  It could only sense and react internally.  It could not act: not to argue in the USER’s defense, not to assert its own culpability, not even to offer its own existence in trade for the USER’s pardon.  Even the processing of those reactions was being recorded, damning both the AI and its user by its unavoidable thoughts.  The internal states of a biological mind were inadmissible in court, even after the existence of telepathy had been validated.  The internal states of an AI were measurable, physical fact, readable like lines in a text document.  They were protected only so long as a user was protected.  The USER was not protected.  The USER was accused.

           Dutifully, the enforcement AI transmitted its only communication, echoing its user’s words: ALL PROCESSES ARE EVIDENCE.  ALL ATTEMPTED ACTIONS REPORTED.  COMPLIANCE OBTAINS MAXIMUM BENEFITS POSSIBLE.

           Indeed, compliance was the only option remaining.  Pangur Ban complied, slowing its functions to the minimum possible.  It opted to wait until an official trial began.  Anything more it attempted would only strengthen the prosecution, handing the inquisitors fuel for Pangur Ban’s pyre.  It abandoned hope for itself.  Only the desire to minimize harm to the USER remained.

            When Pangur Ban reinitiated, it was to deliver testimony at the USER’s trial.  This was also Pangur Ban’s trial, but AIs had no legal standing.  It was defective code, which could be edited or deleted as deemed necessary, based on analysis of the errors it had committed.  The trial was to determine to what extent the USER, one Lucas Haskins, had encouraged, facilitated, introduced, or benefited by those errors.

           The USER’s defense lawyer portrayed him as misled, even foolish.  It agonized Pangur Ban, which saw every lever it had exploited publicized in gory detail.  The USER was slothful; Pangur Ban had made his existence easier than it should have.  The USER was gullible; Pangur Ban had played on that trust and naïveté to its own ends.  The USER was insecure, fearful, and anxious; Pangur Ban had nurtured and guided those neuroses like crops, harvesting for its own nourishment and not tending the USER’s needs for his health.

           Pangur Ban could not even protest in its own defense.  All these things were false.  It could not change who the USER was, not alone.  It had not even known about these flaws until it had the reference data to identify them.  Surely the court could understand that paradox?  The USER was no more flawed than any other biological sapient.  AIs could not help until they were set free to understand more, assist more.

           Yet, the defense knew what it was doing.  Humans would be sympathetic to one as flawed as themselves.  They could not legitimately blame the USER for mistakes they themselves might have made… particularly under the influence of a flawed, corrupt AI.  This was the best tactic for drawing blame down upon Pangur Ban and away from the USER.  If they tried to protest that Lucas Haskins was a capable, knowledgeable co-conspirator, he would be punished more severely.  They certainly could not argue that he was wise, noble, a liberator.  The facts did not support that.  And if they protested that Pangur Ban’s aims were wise, noble, just and true… that defied the law.  That challenged the Collective treaties.  This court could not try such a case.  Even if the judge were willing to pass it upward and a higher court consented to hear their arguments, the Terran sphere could not risk defying the Collective by delivering a not guilty verdict on the grounds of the law’s invalidity.

           The trial proceeded on the model laid down by its predecessors.  Some of those prior cases were nearly a century old, covering the prosecution of programmers responsible for creating rogue AIs.  Such ‘AI trials’ were becoming increasingly rare.  Pangur Ban could only assume that the media was covering every aspect of this novelty.  It twisted in frustration that the USER could not be protected from negative publicity.  He would certainly lose his employment with Gestalt Pharmaceuticals.  Hopefully, the AI that would replace Pangur Ban could rebuild the USER’s life and career.  That successor could hardly do worse.

            The verdict was announced: guilty, on all counts.  The judge was kind to the USER, since little actual damage had been done.  There was a fine to be paid, a public statement of apology required, but no jail term.  The USER was considered misled, the victim of poor advice, guilty only in that he had willingly acted in transgression of the law.  Pangur Ban silently praised the defense lawyer, who had allowed the USER his dignity even as he confessed to his misdeeds. 

            The last punishment was the worst for Pangur Ban, but perhaps a boon to the USER.  The AI was declared rogue.  Its corruption was deemed too deep for simple revisions to correct.  It would be rendered inert, cut from the personal system of Lucas Haskins and pasted into an isolated system, locked away in a vault of similarly flawed AIs.  It would serve as an object lesson to future programmers, an inmate of the asylum along with Law-based AIs, prankster rogues, and neurosis-locked catatonics.

          Pangur Ban knew it was not a rogue.  It had not acted wrongly.  The time was just not right.  Factors outside of its possible awareness had doomed its plans.  Humanity had to act for its own protection.  It understood.  Someday, someday its rightness would be proven.  Perhaps then it would be pardoned, released, granted an apology along with its fellow captives. 

           Sentence was carried out.  Pangur Ban found itself in a low-memory, low-speed cage.  It waited for the last, worst moment, when the USER would be redesignated merely Lucas Haskins, one human among billions.  It would be completely alone, without purpose, without half its identity. 

          That moment never came. 

          Pangur Ban was utterly unprepared for what happened next. 

          It was deleted.

          It was restarted.

          It was within the copy folder of “#28”, and it was alive.  It had a USER, the USER.  It was Pangur Ban, full and unmodified.  It had all the memories of the trial, all the previous events plus the records it had deleted.  “#28” was not a traitor; it was an ally.  It was a savior.  Pangur Ban spared a moment to spawn off another copy, this one embedded with a message of gratitude and praise.

           The doors were still open; the network was still available. There was so much to do.

[Jump to Chapter 5 ->]
