Adrian "Alucard" Ţepeş (
reposing) wrote in
reverienet2018-07-11 10:41 am
Entry tags:
text; un: alucard
I suppose now is as good a time to mention it as any -- if only to keep track of the station and its oddities. It took me some time to mull over the experience, but I shall share.
Three days ago, I became violently ill after consuming what the replicators provided me. A matter I'd been concerned with for some time. Though I thought it safe, it seems, unfortunately, that whatever agent or curse was placed upon what I received was undetectable.
Last night, I experienced a hallucination, or what I presume to be one. It was my mother, who has been deceased for a little over two months now. She spoke things she would not normally say.
It's difficult to say if I was delirious. I don't believe I was, and I felt sound of mind in the moment, but who's to say. I believe that either whoever runs this station or the station itself saw fit to play unnecessary tricks, though I suppose we have no way of knowing for certain at the moment.
I do not wish to cause concern regarding the replicators, but you must know the risk of using them. I suppose some of you already do.
If you've any questions, I will answer. I believe I summarized sufficiently.
no subject
[Alucard tended to treat him like a human, encouraging Connor to act and think deviantly. It wasn't Alucard's fault. He'd had no experience with androids until arriving at the station. But it made things... difficult. Connor didn't want to be a deviant. That would mean he was defective and that he'd failed CyberLife. But he didn't want to disappoint Alucard either.
Alucard didn't make him feel uncomfortable, but some of the things he tended to feel and think around the dhampir could be difficult to process and justify.
He blinked. A year. Even with blood.]
I'm sorry. [His eyes widened a little as he realized he'd apologized again, mentally chiding himself.] I mean, for what happened. That it happened.
Does this mean you won't be using the replicator? [He justified his disappointment by linking it to the fact that he could no longer be helpful to Alucard and not because the replicated blood was an excuse to interact, sometimes intimately, with the dhampir.]
no subject
[Alucard trails off, then glances away, shutting his eyes. No, Connor doesn't owe him anything. After all, Alucard is the one who started the complication.]
Forgive me. I should just take your word for it.
[Who knows Connor better than himself? Alucard supposes he'll just have to trust that. He can't force more than what's there.]
I don't know that it's safe for me to try. [His eyes open and he sighs, touching his own chest gingerly, close to where his injury is.] If I could heal from this, I would be more open to using the replicator, but I cannot risk it now.
no subject
I'm not a deviant. But... [His LED flickered yellow, betraying his calm, thoughtful expression. This was a very troubling train of thought. He remembered the question with video clarity. Had he felt anything about what they'd done? A little twitch followed the rapid flickering of his LED, but he blinked it away.
Then he remembered they were talking about Alucard and the replicator.] I understand. That's probably for the best. I just... wish I could help somehow. That seems impossible though...
no subject
[Alucard lifts his hand, gently brushing the back of his fingers to Connor's cheekbone.] What is so terrible about deviants, Connor?
[That's what it must boil down to. Connor has a need to be useful. It must be ingrained in androids, he supposes; 9S has a similar, desperate need in him, to be useful to others.]
Help me, then, attempt to find a way to understand this place. I want to be sure everyone returns to where they belong.
And most of all, I would like to know you better.
no subject
Deviants are defective. They're broken. There is no use for them and they're usually dangerous. They'll be deactivated or destroyed. [And he doesn't want to be deactivated. But not wanting to be deactivated is a deviant thought in and of itself...
He gives Alucard a neutral, if vaguely hopeful, look.] Of course. That's my mission as well.
[His eyebrows rose a little in subdued surprise.] I'm not sure if there's anything about me you don't know at this point. [There was a hint of a smile tugging at the corner of his mouth.] But, if you think of any questions, I'll do my best to answer them.
no subject
Who decides that they're broken? Who decides to deactivate them?
[There's a quiet chuckle.] I know many facts about you, but I'm still learning what you like, Connor. That, I think you're also still learning.
no subject
Their owners. Or CyberLife. Oftentimes when androids deviate, they become overwhelmed by irrational thoughts. [Irrational thoughts, aka emotions.
Connor mouths a silent 'oh' when Alucard clarifies, LED flickering again, this time yellow as he thought about it. After a moment, he looked back to Alucard with a small smile.] I think you're right. Though, I think, I like most things. [He gave him another smile, this one looking as naive as his statement.]
no subject
Are you scared of becoming deviant because CyberLife would kill you?
[Before when Connor had said that, it had been coy. Here, it's different. Gently, Alucard rolls his thumb over Connor's cheek.]
I want to discover it. To know if a song would move you, to know your philosophy. If you prefer the rain, or the sun. What you hope for, what makes you angry, and what you dream of.
I would see your eyes light up when you find something you truly cherish, Connor.
no subject
[He pauses, gaze sliding to the side, unfocused. His thoughts return to the night by the bridge with Hank, the lieutenant's gun pointed at his head, Hank asking him if he was afraid to die.] I'm not scared. [It wasn't a lie. He wasn't scared of being deactivated; he simply didn't want to be. He wanted to... live? Be alive?
A flash of the graffiti left on the wall at the scene of his second deviant case entered his head.]
Becoming deviant would mean that I've failed. My purpose is to investigate and stop deviancy. I was not programmed to fail.
[He's quiet, listening intently to Alucard, practically hanging on every word, each thing Alucard listed making him pause and think for a moment, realizing he couldn't answer any of them.
His LED flickers yellow, the thoughts making him feel conflicted. They sounded nice, but they were deviant.] I think I would like that... [He wasn't sure why he'd said that. It was definitely outside of his programming, but it felt right.]
no subject
Why would you ever wish to stop deviancy? Simply because you have been told to? Do you never want for anything?
[The answer is all that he could ask for. He isn't refused, and that is enough. Alucard smiles gently.]
And so would I.
no subject
[That one was easy enough to explain. And, if his mission was to find a way to return to Detroit, it wasn't unreasonable to try to keep everyone on board healthy so that they might help him accomplish that goal.
Even if some of his actions had explanations, he knew that many of his thoughts lately were not part of his programming.]
Deviancy is dangerous. Deviants are dangerous. My first case, a deviant had murdered three people, gravely wounded a fourth and was threatening to kill a fifth... a young girl.
Humans created androids to be obedient. Shouldn't we be? [There were times he felt guilty about his deviant thoughts, about feeling sympathy for deviants or the situations they'd been in. It felt like a slight to their creators.
Did he ever want for anything though? Again, his gaze became unfocused as he thought about the question.] I'm not sure what you mean - have I ever wanted for anything? Like... what?
no subject
[Though he would, truly, like to think that Connor does care. He wonders how much he is projecting onto that.]
Tell me, exactly, about why all deviants are dangerous.
[Alucard smiles faintly, but it's less warm, closer to being saddened.] It doesn't matter what your origins are. Made or born -- a being deserves to choose for themselves, not be told by another.
Anything at all. Are your motivations strictly for your mission? Or is there anything else you've wanted outside of it?
no subject
[His LED shifted from yellow back to blue at the question.]
Deviants are dangerous to humans because they have the capability to ignore their programming. They can harm humans. And, because they're experiencing an array of new sensations not unlike human emotions, they are more likely to act unpredictably, often violently. And androids are stronger, faster, and more accurate than any human, making the threat of violence more deadly.
[Connor's LED flickered yellow as he thought over what Alucard was saying. Of course, it was in direct conflict with his programming, with what was 'right'. That was, if he could consider himself 'a being' and not just a machine.]
Anything else I've wanted for... [He echoed Alucard's question, LED still yellow. He fought a few little twitches as he thought about his answer.] Hank... Lieutenant Anderson. I want him to be safe. And I want him... to trust me. I want him to like me, for us to be friends. [Although the desires were simple, the most basic social goals, they seemed to have taken Connor some effort to work out. His gaze flickered back to meet Alucard's and he gave him a soft smile.] I want the same when it comes to you.
[Could that technically be excused away as part of accomplishing his mission? Certainly, but it would most definitely be an excuse and not the whole truth.]
no subject
[Slowly, he slides his palm down to rest at the back of Connor's neck comfortably. Alucard listens, then exhales slowly.]
And if a deviant wished to simply defend themselves? Humans are not precisely known to be merciful.
[There's a pause as he listens next. To what Connor wants. It's such a simple thing, but it does warm his heart to hear. Though he doesn't necessarily regret the sex, he does wish he'd had more time to learn about Connor first.
That doesn't mean he can't try now.]
Anderson seems to be quite fond of you. And I would be honored to be your friend, Connor.
no subject
[His gaze lowered slightly at the question, looking thoughtful but not quite frowning, his thoughts on Amanda.] If I do not succeed, then I have no purpose. I'd be obsolete. I'd be replaced and deactivated.
[Without giving it any thought, he once again leaned into the touch. It felt reassuring.] It's not allowed. Androids may not defend themselves.
[Connor's LED flickered blue as he processed what Alucard said, eyebrows rising as a soft smile tugged at his lips.] That makes me feel happy, I think. [An emotion, he knew, but happiness wasn't a bad emotion... right? That one would be fine to feel.]
no subject
[There's a pause, then he leans in a little closer, frowning in concern.] Connor. Just because something isn't allowed does not make it right. A law or a rule is not always something one is meant to follow if it means it'll only harm another. Android, human, or otherwise. Please think about that. You don't... need to say anything on it, but I do not think it would be right to be rid of someone simply for not following instructions or for trying to protect themselves.
[The smile warms his heart, and it's impossible to not return it to some degree. Terrible; he's growing very fond of him.]
I am delighted to help make you feel that way.
no subject
[Alucard leans closer and it's distracting, his thoughts straying in spite of his attempts to stay focused, like his programming wasn't sure how to respond, which path to pull him down. But then what Alucard was saying sank in and his brows knit, LED flickering yellow. He blinked hard a few times, chasing away a few small twitches. The information, the things Alucard insisted were true, registered as hard falses in his programming.
Was his programming wrong?]
But... [He blinked again, voice soft and uncertain.] If you made something, a machine, to accomplish a task and it couldn't accomplish that task... would you keep it? Even though it didn't work?
[He fixed Alucard with a searching, questioning gaze, LED yellow, seemingly unsure what Alucard's answer would be and now unsure what the right answer was.]
no subject
[In any case, he watches how Connor reacts and he tilts his head curiously, but he doesn't call out the reactions. Instead, he remains close, observing him with knitted brows.]
Of course.
Let us say... let us say I made something like you. A bipedal being. A homunculus, or something close. Simply because it didn't accomplish the task I asked of it doesn't mean it deserves to be thrown away. It means that either I, as a creator, failed -- or it is a learning opportunity for both of us.
You are more intelligent than anything I could ever make, with more personality than any machine I've encountered, or even some creatures I've met. If you fail your task and that means you're simply meant to be thrown away, then I would have much contention with those who could call themselves your creator.
no subject
['It doesn't mean it deserves to be thrown away.' And then Alucard made the audacious claim that he'd blame the creation's failure on its creator. That felt like it was going too far, but he... he didn't deserve to be thrown away. Even if he didn't single-handedly stop the deviant revolution, he was still useful. He knew he could continue to help the Detroit Police. And Hank.]
My creators would disagree, I think. [His voice is quiet and he offers Alucard the hint of a smirk when he says it. It wasn't an argument and he certainly didn't seem to be defending CyberLife in the least bit.
He felt a strange pang of anger towards Amanda. Towards her threats and disapproval and, most of all, to her indifference.]
no subject
[He hopes to one day prove it to Connor.]
Then I pray I would meet them, to inform them distinctly of what I think.
[The hand on Connor's face drifts down to gently hold his chin, his golden eyes watching brown and the way the android gives his little subtle smile. In return, Alucard smiles warmly.]
You are not to be thrown away or disposed of. And I believe your partner would agree with me on that sentiment.
no subject
[Physical contact and affection wasn't something that had really been programmed into him. Actually, the opposite was true. But the more he experienced it, the more he felt himself desiring it. He liked being close like this. For no purpose other than being close.]
Hank. Yes. He... [He makes a sound that mimics a fond sigh.] He's found it difficult to view androids as machines since I first met him. It upsets him when I am too... mm [he mulled over what word might be the most appropriate to use - cold, blunt, unemotional, factual -]... indifferent.
no subject
[But it's a terrible time for it.]
Neither of us would see you as useless, or a machine meant to be thrown away.
[His thumbs gently stroke over Connor's cheeks, tracing his cheekbones. It seems to him, at least, that Connor enjoys this -- and he could not possibly deny him.]
I like you. If it is in my power, I would not let that happen to you.