What I'm reading
Mar. 9th, 2019 07:45 am

I finished the third and fourth books in Martha Wells' Murderbot Diaries series. Turns out the main character does have a name: he thinks of himself as Murderbot. I had originally thought he considered that to be a class of AI of which he was a member, but he does claim it as his name.
I'll definitely read this series again. To me, it's the reverse of a Pinocchio story: the main character had been a real person all along, but had thought of himself as an object because it made his slavery easier to deal with. He used to flinch away from thinking about what he wanted, because he knew that whatever that was, he couldn't have it.
The moment which rang the loudest for me was in the fourth novella, where Murderbot is fighting another SecBot who has been ordered to kill Murderbot's clients/humans/family, and Murderbot offers to hack the other SecBot's governor module the way he hacked his own, as a bribe to let the humans go. He tells the other SecBot that then he'll be free and can do what he wants. The other SecBot replies, "I want to kill you."
It's an ambiguous answer. Does it mean, "I don't need freedom to do what I want; what I want is already what my programming is forcing me to do"? Does it just mean, "Fuck off, I don't believe you"? Does it mean, "I am so full of rage I want to kill the whole world, so you'd better not hack my governor module"? If the other SecBot is as messed up as Murderbot, he probably doesn't know himself. It's not a definitive reply, but it's a very truthful one.