"WHY?" ASKED SIMCOR Beddle, and Caliban did not have to ask him to explain the question. He knew what the man wanted to know.
The aircar moved through space, traveling in a synchronous orbit of the planet. Down below, twelve angry red wounds on the planet were beginning to cool, their color fading away. Neither man nor robot could tear his eyes away from the incredible and terrifying sight.
"I did not save you for your own sake," said Caliban. "Nor simply because you are a human. I came after you for the reasons I explained in front of Prospero. Sooner or later, others would have deduced what I deduced: that a mad New Law robot had found a loophole in the New Laws, and invented a way to kill humans. There would not have been a New Law robot left alive thirty hours later, and I expect there would have been attempts on my life as well. The news of what Prospero attempted will still get out, of course-but you are not dead, while the mad robot in question is."
"But there was that moment," Beddle protested. "I admit that I was not thinking clearly at the time, but there was that moment when Prospero suddenly presented the situation as a choice between the two of us, between Prospero and myself. You chose me. Why? Why did you choose a human enemy over a robot friend? You could have killed me without any risk of legal detection. Why didn't you?"
"It was clear that I could not bring both of you out alive. I did not wish to kill you both. I am no butcher. I had to choose. But there was not much to choose between the two of you," Caliban said. "I don't believe that Prospero actually could have survived if you had died through his actions, in any event. Even the New First Law would have imposed fatal stress. It was a severe strain for him to believe that he was not violating the New First Law. If he had actually accomplished his goal, I believe the strain would have been too much. He would have gone utterly mad and died. But that was almost incidental. You are quite right. When Prospero framed it as a choice between the two of you, I had to have some basis for choosing, some criterion. And then I thought of the robots, Three-Law and New Law, that Prospero had killed for no greater crime than simply getting in his way. That is what decided me."
"I see," said Beddle. He hesitated for a moment. "I am about to speak with more frankness than wisdom, I suppose, but be that as it may. I have to understand this. It has to make sense to me now, today. Otherwise some part of me will spend the rest of time wondering why Caliban, the No Law robot, didn't kill me when he had the chance. Surely you must know that I have destroyed robots many times, whenever it suited my convenience. So what difference is there?"
"A slender one," said Caliban, "a difference so slight it is barely there. You were willing to kill robots, and he was willing to kill humans. That was a rough balance of evil. But Prospero was willing to kill robots, even New Law robots, his own kind, for gain. It was humans like you who showed him that society did not really care if robots were killed capriciously. He learned his lesson well, and committed many awful crimes against robots. There is no doubt about that. You bear some responsibility for that. But what it finally came down to was this: I had no evidence that you were willing to slaughter humans for gain."
Simcor Beddle turned and looked at Caliban, his face silhouetted by the fires burning on Inferno. Caliban had judged him to be marginally less loathsome, and to have slightly more right to live, than a mass murderer who would probably have died anyway. And yet Caliban had gone to great lengths, and taken great risks, in order to save him.
A thought came to Simcor Beddle, a very humbling one in some ways, and yet, strangely enough, one that filled him with pride.
Caliban was not willing to admit it to the likes of Simcor Beddle, but surely his actions said, quite loudly and clearly, that Caliban had learned, somewhere along the line, that the life of a human being, even an enemy human being, had value. Tremendous value.
Perhaps, he thought, that was the message everyone was supposed to read into the original Three Laws of Robotics.