moriendum: (eric nectar)
moriendum ([personal profile] moriendum) wrote2026-03-03 10:23 pm
save me, secunit. secunit, save me

some passages I liked from the murderbot books:



all systems red

Ratthi said, “The one where the colony’s solicitor killed the terraforming supervisor who was the secondary donor for her implanted baby?” Again, I couldn’t help it. I said, “She didn’t kill him, that’s a fucking lie.” Ratthi turned to Mensah. “It’s watching it.”


It’s wrong to think of a construct as half bot, half human. It makes it sound like the halves are discrete, like the bot half should want to obey orders and do its job and the human half should want to protect itself and get the hell out of here. As opposed to the reality, which was that I was one whole confused entity, with no idea what I wanted to do. What I should do. What I needed to do.


I hate having emotions about reality; I’d much rather have them about Sanctuary Moon. (!!!!!!!!!!)


This is why I’m glad I’m not human. They come up with stuff like this. I said, “No. That’s a human thing to do. Constructs aren’t that stupid.” What was I supposed to do, kill all humans because the ones in charge of constructs in the company were callous? Granted, I liked the imaginary people on the entertainment feed way more than I liked real ones, but you can’t have one without the other.


artificial condition

When constructs were first developed, they were originally supposed to have a pre-sentient level of intelligence, like the dumber variety of bot. But you can’t put something as dumb as a hauler bot in charge of security for anything without spending even more money for expensive company-employed human supervisors. So they made us smarter. The anxiety and depression were side effects.


But there weren’t any depictions of SecUnits in books, either. I guess you can’t tell a story from the point of view of something that you don’t think has a point of view.


“Sometimes people do things to you that you can’t do anything about. You just have to survive it and go on.”


I wish being a construct made me less irrational than the average human but you may have noticed this is not the case.


exit strategy

(Possibly I was overthinking this. I do that; it’s the anxiety that comes with being a part-organic murderbot. The upside was paranoid attention to detail. The downside was also paranoid attention to detail.)


I was having an emotion, and I hate that. I’d rather have nice safe emotions about shows on the entertainment media; having them about things real-life humans said and did just led to stupid decisions like coming to TranRollinHyfa. (!!!!!!!!!)


Disinformation, which is the same as lying but for some reason has a different name, is the top tactic in corporate negotiation/warfare.


I hate having emotions about real humans instead of fake ones, it just leads to stupid moments like this.


rogue protocol

(I know that’s actually not a permanent solution and pretending bad things aren’t happening is not a great survival strategy in the long run, but there was nothing I could do about it now.)


network effect

“What is that thing?” Target Leader demanded. “What are you? You’re a bot?” Thiago said, “It’s a security unit. A bot/human construct.” Target Leader didn’t seem to believe him. “Why does it look like a person?” I said, “I ask myself that sometimes.”


Ugh, emotions.


I was getting tired of being told what to do. Self-determination was a pain in the ass sometimes but it beat the alternative by a lot.


Overse added, “Just remember you’re not alone here.” I never know what to say to that. I am actually alone in my head, and that’s where 90 plus percent of my problems are.


“Because change is terrifying. Choices are terrifying. But having a thing in your head that kills you if you make a mistake is more terrifying.”


fugitive telemetry

All I wanted to do was watch media and not exist. I said, You know I don’t like fun.


system collapse

You’re stalling, ART-drone said. I am not. I can stand here and be useless without any ulterior motives, thanks.


(I asked because the humans would bug me for the information; I was as indifferent to human gender as it was possible to be without being unconscious.)


Would it have been kinder to kill you, before you disabled your governor module? I said, Yes. ART-drone said, You know I am not kind.


It was obvious that media could change emotions, change opinions. Visual, audio, or text media could actually rewrite organic neural processes. Bharadwaj had said that was what I’d done with Sanctuary Moon: I’d used it to reconfigure the organic part of my brain. That it could and did have similar effects on humans.


I felt HostileSecUnit1 go into shutdown mode. It wasn’t dead, it was just catastrophically damaged. (I know, who isn’t?)


home

It’s about being treated as a thing, isn’t it. Whether that thing is a hostage of conditional value, or a very expensively designed and equipped enslaved machine/organic intelligence. You’re a thing, and there is no safety.


And if someone else was in her position, she would tell them how unhelpful comparisons like that are, that fear is fear.


rapport

“Oh, thanks,” Tarik said, appearing in the hatchway. “You’re supposed to assume it’s kidding and be lulled into a false sense of security,” Matteo explained. Tarik told them, “My sense of security is always false.”
