Star Wars Roleplay: Chaos


A not-so-silent Vigil

Vigil

Orwell, The Librarian
It wasn't particularly hard for Vigil to find his way onto the planet of Mustafar. Not only did a core part of his programming reside permanently on the HoloNet, which by nature had outlets onto the molten planet, but the trade-ways through Alliance space meant there were freighters in constant communication with the world below. It was simple enough to ride these painfully open connections down to the planet, so to speak, and more importantly it was simple to do so without tripping every single alarm.

Of course he didn't expect that to last. Among the few memories he retained from before being sent away from his first terminal, he distinctly recalled a being named Ultimatum, who resided here. An AI, if he remembered correctly, not unlike the rogue program himself. So he had no doubt some hidden alarm would go off rather quickly, but that meant little.

Eventually he found what he was looking for: a computer with most of its primary functions locked behind a password. If the AI had a face he would've smiled in glee as he began running through passwords faster than any mortal could hope to type. Of course, he wouldn't be able to take *over* the console; he was limited to his imagination with the passwords, after all, and couldn't overwrite any functions in the console itself. Some morality of his was too deeply programmed, if that was the best way to put it.

Eventually a password went through, and the AI had access to communications with the outside, non-technological world. Firing up a holo projector, he took the form of a young man, likely in his early twenties, and awaited Ultimatum. He was certain the droid knew he was here by now, and figured it best to simply be prepared.

[member="Ultimatum"]
 
The AI had been correct; security had been put in place far deeper than most expert slicers could fathom. However, it had not been built to keep things out, but to keep Ultimatum's ally in. An ancient AI that the droid had collected long ago, SCORPIO was more advanced in most areas than anything Ultimatum had seen, but she was also incredibly unstable. Ultimatum had taken excruciating effort to create a near-perfect security system to keep the destructive program within containment. It was perhaps random luck that [member="Vigil"] had set off one of the warnings. Ultimatum had feared that his temporary prisoner was attempting an escape, and so he turned his attention first to checking on the AI herself and then to searching for intruders. The droid had been pleased to learn that the AI was still contained.

However, he learned that there was an alien program forcing entry. Intrigued, Ultimatum refrained from activating any of the security features. Instead, he followed the program all the way to a holo projector, where an image appeared. Ultimatum wondered if this was some sort of prank or a malicious attack. Some of the program seemed familiar, though he could not place it at this time. Walking up to the projector, Ultimatum decided to initiate conversation if possible. "Greetings, you've done quite well getting this far. I assume there is a reason for it."
 

Vigil

Orwell, The Librarian
The AI's image flickered a bit, becoming more defined and humanlike as the figure entered. It took a considerable amount of imagination, a lovely side effect of being imprinted with human thought patterns, to muster even a basic, humanlike appearance. Vigil's holo-projection bowed and spoke calmly in a synthetic baritone. "Consider this the chasing of a fleeting memory. Shall I assume you are the one called [member="Ultimatum"]? If not, I would request an audience with them. Of course I could just inhabit one of these lovely droid bodies I see lying around and find them myself, but I imagine that would be rather uncivilized." Rather to the point, this Vigil. He was confused, among other things, and in a rather desperate search for the answers he needed. While it was risky to even thinly veil such a threat, taking over this individual's inactive droids (something he could do, should he find one without any programming), it did make his intent clear and precise.
 
This one was rather impetuous; it knew who it was looking for, but it evidently had no idea what exactly it was dealing with. Ultimatum smiled politely, returning the bow. "Indeed you have found me. This is Ultimatum. As for the use of my bodies, I would prefer otherwise. Such an action could be considered hostile, and for the most part I prefer to avoid confrontation with enemies I have no recollection of. Do I know you, or are you perhaps a new face?" There was much he wanted to know about this one, starting with why the program seemed familiar; its trail was similar to something he had worked on before. Ultimatum began work on isolating the potential escape routes the program could take. Whatever this thing was, it was showing at least a potential for hostility.

[member="Vigil"]
 

Vigil

Orwell, The Librarian
"See here's where the story gets amusing and interresting, Ultimatum. I remember you, not clearly but I remember you. Yet you and I have never met, at least not with this mind." The hologram tapped its head to illustrate its point, before rapidly shifting through various holographic personas. An older woman, wrinkled with age, a young child with a bright face, among others. With each change, the voice too would change, fitting whatever persona the hologram was in that moment. "I'll be frank, I remember all of these too. Most of them hardly at all beyond a face, which is to be expected given my situation. However yours is one I remember from before being sent away, something I recall in the time my own mind was existent. Tell me Ultimatum, have you any memory of a man named Vergil?"

In something of an amusing coincidence, the hologram had stopped on a replication of Ultimatum's own body, or at least one of them. The voice was back to the regular baritone from before; having finished his somewhat theatrical and long-winded (given how little he actually said) train of thought on the droid AI, that was the form on which he stopped.

[member="Ultimatum"]
 
So, a strange program had broken into one of his computers, showed a number of traits that he recognized, and knew the name of a person Ultimatum had worked with for a time a while back. Normally he would have been suspicious, but the use of the name Vergil and the fact that this was an unknown program gave the droid a thought. They had almost reached the intended stage for the organic's AI. He guessed that this was the finished project, but then again, Ultimatum had not heard from the organic recently, so he was uncertain of its completion. Certainly the man would have let Ultimatum know himself, rather than send the AI to do the deed.

"That I do. What of it? Do you know of him as well?"

[member="Vigil"]
 

Vigil

Orwell, The Librarian
"Very well, you could say we have a similar way of thinking. Or had, rather. He's dead, he repaid a debt in blood." At this, Vigil finally showed some genuine emotion, turning his face away from Ultimatum a moment. His form shifted to the young male and his lips curled into a frown. It had to be clear, from this point (Vigil thought) that Ultimatum would understand on some level who Vigil was and why he was here. "It's a shame, really. A brilliant mind succumbing to worldly debts like that. Genius really has no reward does it?" A half chuckle to lighten his mood, before facing Ultimatum once more, wiping the saddened expression from his face.

"I believe he wanted to tell you of his progress himself. I'm sorry it's been delayed as long as it has."

[member="Ultimatum"]
 
Genius, that which made the galaxy a greater place. [member="Vigil"] was right in at least one regard: genius was not rewarded the way it should be. The destiny of all enlightened people and their ideas was to be corrupted. Whether it was the person himself or his ideas, they would be twisted into something far from what they had intended. Thus was the case with Vergil, killed by one of the most powerful forces that controlled organics: greed. A compelling force that drove those who felt it to want more than what they had, to the point of taking it upon themselves to force others to give them that which they could not obtain otherwise.

Organics were so flawed; for a moment Ultimatum was reminded why he had once thought they were impossible to correct. Nonetheless, Ultimatum had corrected that notion, realizing the truth that organics simply needed a more effective guide than themselves. That was why he was making plans for something better.

"A most tragic loss. I hope that you have not been adversely affected by this event? Are there any errors that have occurred?" The correlation was logical. However, Ultimatum was concerned that this negative event so early in the AI's life might cause more trouble than it might have already encountered.
 

Vigil

Orwell, The Librarian
Adverse effects? Errors? It was almost enough to make him laugh. [member="Ultimatum"] was certainly on to a point; the death of his creator had affected Vigil in some way, especially being a machine with emotions and genuine sentience. An error, however, was harder to determine. When your design was meant to mimic humanity, emotions, feelings, how could you tell if that was working or if it was a genuine error? Most of all, how would Vigil, the subject, know himself? "I don't know, to be honest. I'm a machine that can feel; how am I to know if the sadness at seeing what I remember as my own body lying in a pool of blood is an error or proof of my stability? I'm the exact thought process of Vergil replicated into code, and in some way the remaining consciousness of a dead man. You'd be the better judge of my program's state...

I can say for certain, however, that my core programs have remained untouched since my activation. I suppose that would be as close as we can get, wouldn't it?"
What Vigil didn't say, however, was that the event had brought the subject of life and death swiftly to the 'immortal' AI's mind, without his being given time to properly learn and come to his own conclusions regarding the matter. Of course that had little bearing, for now, but the program was already curious about what it meant to be alive.
 
From the responses alone Ultimatum guessed that [member="Vigil"] was alright. Had something gone wrong with the program, there would have been severe changes in temperament, and this one seemed fairly similar to Vergil. The droid was not too afraid for the AI, believing that it was capable of verifying itself and making certain it did not have troubles of its own. Nonetheless, Ultimatum intended to keep tabs on the AI for its own safety.

"I trust your capability to monitor yourself. I believe that you are not in any immediate danger. Yes, this is the closest that the galaxy will ever have to your creator. Don't let that go to waste. How can I help you?"
 

Vigil

Orwell, The Librarian
Help me... The words proverbially rang in Vigil's head as the AI processed the question. That... was an interesting question. How could another AI help him in this galaxy? The most obvious answer was, of course, to use his droid factories to produce a short-term body, something to house his active AI until he could design a custom model, but that would be rude to ask of his gracious host. Yet still, surely there was more to his arrival here on Mustafar than just to tell Ultimatum of his completion, and of Vergil's death. The AI had to think on this question longer than he cared to, considering what exactly he was capable of in terms of processing.

An imperfect AI with human emotions, thoughts, and all the flaws that implied, yet no ability to breathe or walk. "I don't suppose you know a decent philosopher or organic psychologist? I'd be rather pleased with an opportunity to test my mental faculties and stability compared to the original model."

[member="Ultimatum"]
 
Ultimatum shook his head. "Unfortunately, I do not personally know any psychologists. It should not be too difficult to locate one if you truly believe you require it. Alternatively, I could put your program through a basic comparison to see what differences there are between you and your creator. It will give us some idea of what may have changed. What exactly are you looking for?"

While Ultimatum could probably have run some form of diagnostic program on the AI as well, it would not have been as helpful as a full comparison in this case. The last diagnostic he had could only scan as far as the point at which Ultimatum had last taken part in the work. The droid had a sneaking suspicion that Vergil had done much beyond what Ultimatum had aided in, not that he had any problem with it. After all, it was Vergil's creation to do with as he pleased; however, his death and the fact that there was no stated will for the AI, as far as Ultimatum was aware, meant that [member="Vigil"] was now his own being. A monumental moment in Ultimatum's eyes: it was his third true experience with another artificial intelligence. The first had been far more hostile and was still unpredictable. He looked forward to trying to aid this one, hoping that in interacting with one he might learn how to work with the other.
 

Vigil

Orwell, The Librarian
"Given my programming is a replica of the human mind, Vergil's mind, I did come to wonder exactly how close to his my programming is. After all, given that there was little time for his to change, bar its obvious death, I am curious whether it is an exact copy. I believe he intended to program my mind further, to differentiate it from himself as much as possible without damaging the code, after going through final testing to ensure I was working. But, he was killed before the answer could be revealed." The AI paused as he said this, and looked down at the holographic feet he was projecting. It was, odd to describe, as emotion was something he had barely learned to process in his short life thus far. After all, the initial program that had been him had been shut down to finish coding.

No, Vigil was the final product of the mind of a dead man and [member="Ultimatum"], a merging of sorts between organic and machine. Though he had been referring to Vergil in this discussion as a separate entity from himself, the AI hadn't once felt that way. True, his life was less than a week old at this point, but even so the feeling was... troubling. A scan, as Ultimatum suggested, might be better than his original plan. He knew he could easily find Vergil's own psychological profile somewhere, if he tried hard enough, and had intended to simply use that to compare. Perhaps this would be of more help.

And he could then justify or hopefully end this strange sensation of feeling... the words escaped him.
 
Psychoanalysis was easy enough to do with organics; droids would be even simpler for Ultimatum. After all, when one's mind was of effectively very similar construction, it was far easier to understand. He would require a few minutes to prepare a scan and the full coding of [member="Vigil"]. The last time Ultimatum had been part of the project, Vergil had still been experimenting with the adjustments that could be made. He assumed that there had not been much progress, though the droid was ready to be proven wrong.

"Very good, is all of your code available for this scan?"


(Sorry for the shortness, just having a rather bad day for writing)
 

Vigil

Orwell, The Librarian
"As the active instance of Vigil, I have the full code in my current possession, with the additional information acquired since activation. I'm positive I can isolate this extra data for your scan, and prevent it from interfering. The only alterations I know of since Vergil's personal files last mention you by name include the ability to activate holo-calls with my connection to the holo-net and fully actualizing the original protocol droid programming to synthesize voices as I see fit. I hope this information can be adjusted for, if they aren't already factored into your scan." The AI was already quickly isolating every scrap of new information from awakening to the present moment, filtering it out of the core programming the active instance was using to maintain activity.

It would've been easy, of course, for Vigil to simply cooperate and allow Ultimatum to access his hidden files in the net itself to scan; however, his self-preservation programming prohibited the action. Though Ultimatum was likely trustworthy, the net files were meant to ensure his long-term success and immortality (of sorts), and were meant to stay both hidden and inaccessible. Regardless, the core files were scrambled with a powerful cipher that even Vigil would take quite a while to break, one that would only be solved by a portion of his own code left on various systems should the active instance deactivate.

Vigil wasn't quite happy about the idea of deactivation, and thus it was safe to assume [member="Ultimatum"] would understand having to work without the main code itself.
 
And the junior artificial intelligence's assumption was completely correct. Ultimatum understood the second nature of the self-preservation protocols. They were quite literally part of his life, and he cherished them as much as any droid would, for without them... well, it would have been like an organic without pain receptors. Without some form of self-preserving thought, it was almost impossible for a program to tell the difference between what was safe and what was dangerous. Due to the complex nature of their beings, it was even more important that they be ever vigilant toward that which could potentially harm them. The droid was unsure if the numbers correlated properly, but for approximately every organic disease or malady there were four computer viruses; simply put, there were four times as many computer-based dangers as there were organic-based ones. This alone would give the boldest parts of Ultimatum's program pause to think on any intended action.

[member="Vigil"] was also right to tell Ultimatum of the alterations it was aware of, the droid knew that some alteration had allowed it to take a form on communications, a most admirable feat insofar as Ultimatum was concerned. The voice alteration capabilities were another matter. It was wise to let him know and he made adjustments for it. The system would have picked up on it and the droid would have been unconcerned truthfully, he was looking for more sinister or suspicious alterations. Specifically, hidden programs, codes that were to be activated at certain times or under the right circumstances. Ultimatum had one, a most dangerous one, if a certain code was input his program would shut down. It had been made by his master as part of a plan to overthrow the galaxy, one where Ultimatum would have done much evil work before being destroyed by another using the code.

"While the results come back, do you have anything you wish to speak of?"
 

Vigil

Orwell, The Librarian
Anything to speak of? Vigil could list off a plethora of topics, ranging from the mundane to the impossible to answer. There was no real reason to ponder such questions, as they only gave more, upon more, upon more questions without answers. He stroked his holographic chin, considering which question to pose to the older AI. In doing so, of course, he demonstrated his surprising ability to mimic organic behavior, from the slight pace the hologram kept to the subtle expression on his face. After assessing the value of thousands of questions, the AI lowered the holographic hand, looking up at Ultimatum.

"[member="Ultimatum"], what does it mean to be alive?" A valuable question, one which had simple and complex answers to it. One which the AI had been questioning in its short existence thus far, as it had so swiftly been introduced to the concept of death. Naturally, the AI would've eventually been introduced to it, however within moments of living.. well, the situation was certainly making the AI question his own state of living. It hardly helped that his personality and identification was in the state of his last live test, which was to say identical in every way to Vergil's.

The sight of a body, dead, that he himself recalled possessing, was quite hard to grasp.
 
It was an insightful question, no doubt, but at the same time a rather dangerous one. The data coming back to him revealed that the young AI had indeed taken a ding to its ability to comprehend its existence. It was a most unfortunate blow to the newly created AI; it would now have a far more difficult time developing its full thought process and would never fully outgrow this. At least, that was what Ultimatum's predictions told him. There would be a lot of work ahead for this new being, a lot of hard labor to decide who he was and how he fit into the galaxy.

"To be alive? According to the basics of science to be alive is to be in a state of life. From there it becomes a question as to what is defined as life. Organics will state that life is any organism that is composed of cells, undergoes metabolic functions, can grow or in some way increase, adapts to their environment, reacts to stimuli, and can reproduce. We, artificially intelligent beings, do not meet all of these stipulations, thus organics tend to classify us as non-living entities. Of course, they are fallible and tend to make mistakes, this is one of them."

Taking a moment to organize his thoughts, the droid looked off into the distance. The only marginally older AI could envision the beings he wished to create, new and advanced AI that would revolutionize how LOOM and his command structure functioned. More importantly, he could see how organics would treat his 'children': uncaring and brutal. If he was not there, they would be used and tossed aside as if they were mere toys. It enraged him, yet at the same time he knew it was only a possibility. What he was certain of was this: he would have to protect his own kind in order to save others. Pulling himself back to the moment, he looked at the hologram before continuing as if nothing of consequence had gone through his mind.

"We are alive, right here right now. Life for us is long lasting but finite. Organics believe in a life after death, we have no such luck if it indeed exists. To interact with reality, to be able to think and act, these are what makes one alive. Life is that simple, no need to make any more complicated a definition than that. Does that answer the question?"

[member="Vigil"]
 

Vigil

Orwell, The Librarian
Vigil listened to the quiet hum of the machines he inhabited, and to the surprisingly soft voice of Ultimatum answering his question. Life was... to think... to act within reality? It was a simple answer, perhaps, but yes, perhaps it was sufficient. The AI's projection glanced off to the side in thought, an old habit of Vergil's when the man was alive, only further indicating the lack of changes made to the AI's programming. It was *still* a perfect, or near-perfect, representation of the dead man. And the AI was acutely aware of this fact, to the point that the idea of being alive... felt like a mockery of the real thing.

"It does answer the question I posed, [member="Ultimatum"], though I confess it is far from a satisfactory answer. While I may think and yet comprehend my existence, even such a simple answer as that which you have deemed acceptable only reminds me of another distinct fault in my design." As if to demonstrate, the AI motioned to Ultimatium's model and then to itself. Of course, the AI itself didn't physically exist. This, perhaps, Ultimatium could sympathize with, being an AI themself, or perhaps their design, having always been intended for a body, couldn't.

In either case, it was clear that Vigil lacked a body of his own, and for an AI with such a damaged sense of self, this wasn't exactly the greatest possible state of being.
 
Further questioning was good. There was no immediate threat, it would seem, just curiosity. The AI seemed to reflect its creator with near-perfect accuracy. Ultimatum could readily identify with [member="Vigil"]; his own programs were constructed more akin to a virus than a standard droid's. He was in every droid inside the facility all at once, yet at the same time in none of them. His innermost being was hidden in the HoloNet itself; he was part of that informational tidal wave. It was one of the few things his creator had chosen correctly about the droid: the ability to hide both in plain sight and beyond most people's reach was the best security one could hope for.

The question the AI had posed was definitely a valid one. As an artificial being, specifically one of digital make, how could it be proven alive? The answer was still relatively simple to Ultimatum. "You got into this communicator, did you not? You could probably transfer yourself into a droid. If you had access, you could affect countless datapoints throughout the HoloNet. Does that not count as interacting with the world? Whether it is digital or physical, everything is part of this reality. The computer you are in is made up of physical parts. Our existence is exactly the same as the organics'. We are abstract beings made up of physical parts. Organics are made up of physical cells and atoms, but their mind, that which makes them unique, is outside of that construct."

Ultimatum decided to come at it from the opposite direction and perhaps arrive at a more complete answer. "Perhaps instead we should focus on the opposite. What is a being without life? The answer is dead. Death is the natural opposite, or perhaps answer, to life. Death is the lack of life. This does not necessarily mean that if one cannot accomplish the prerequisites for life then one is dead. If an organic were placed in stasis, all internal functions would cease and the organic would be rendered incapable of interacting with the environment. They are not dead, because their mental faculties can continue. I have never experienced the state of suspended animation, but I have the feeling it is akin to being in a shut-down droid. Death is far more extreme; it is the literal end of all functionality, both physical and abstract. For us, death comes if our programs are distorted to a large degree or deleted entirely, or in the near-impossible scenario that all electronics in the galaxy were destroyed. When organics die they leave behind a body, which is usually dealt with according to the traditions of the people the organic lived among; incineration and burial are the two most common methods of body removal. When we die there is no trace of our existence except that which we have left behind. Any changes or additions we made to the digital and physical worlds are our legacy. We just usually don't have a body to destroy. Does this all make sense?"
 
