"Well, then," the roboticist continued, "if a mechanism is capable of learning, how do you keep it from becoming dangerous or destroying itself?
"That was the problem that faced us when we built Snook.u.ms.
"So we decided to apply the famous Three Laws of Robotics propounded over a century ago by a brilliant American biochemist and philosopher.
"Here they are:
""_One: A robot may not injure a human being, nor, through inaction, allow a human being to come to harm._"
""_Two: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law._"
""_Three: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law._""
Fitzhugh paused to let his words sink in, then: "Those are the ideal laws, of course. Even their propounder pointed out that they would be extremely difficult to put into practice. A robot is a logical machine, but it becomes somewhat of a problem even to define a human being. Is a five-year-old competent to give orders to a robot?
"If you define him as a human being, then he can give orders that might wreck an expensive machine. On the other hand, if you don"t define the five-year-old as human, then the robot is under no compulsion to refrain from harming the child."
He began delving into his pockets for smoking materials as he went on.
"We took the easy way out. We solved that problem by keeping Snook.u.ms isolated. He has never met any animal except adult human beings. It would take an awful lot of explaining to make him understand the difference between, say, a chimpanzee and a man. Why should a hairy pelt and a relatively low intelligence make a chimp non-human? After all, some men are pretty hairy, and some are moronic.
"Present company excepted."
More laughter. Mike"s opinion of Fitzhugh was beginning to go up. The man knew when to break pedantry with humor.
"Finally," Fitzhugh said, when the laughter had subsided, "we must ask what is meant by "protecting his own existence." Frankly, we"ve been driven frantic by that one. The little humanoid, caterpillar-track mechanism that we all tend to think of as Snook.u.ms isn"t really Snook.u.ms, any more than a human being is a hand or an eye. Snook.u.ms wouldn"t actually be threatening his own existence unless his brain--now in the hold of the _William Branch.e.l.l_--is destroyed."
As Dr. Fitzhugh continued, Mike the Angel listened with about half an ear. His attention--and the attention of every man in the place--had been distracted by the entrance of Leda Crannon. She stepped in through a side door, walked over to Dr. Fitzhugh, and whispered something in his ear. He nodded, and she left again.
Fitzhugh, when he resumed his speech, was rather more hurried in his delivery.
"The whole thing can be summed up rather quickly.
"Point One: Snook.u.ms" brain contains the information that eight years of hard work have laboriously put into it. That information is more valuable than the whole cost of the _William Branch.e.l.l_; it"s worth billions. So the robot can"t be disa.s.sembled, or the information would be lost.
"Point Two: Snook.u.ms" mind is a strictly logical one, but it is operating in a more than logical universe. Consequently, it is unstable.
"Point Three: Snook.u.ms was built to conduct his own experiments. To forbid him to do that would be similar to beating a child for acting like a child; it would do serious harm to the mind. In Snook.u.ms" case, the randomity of the brain would exceed optimum, and the robot would become insane.
"Point Four: Emotion is not logical. Snook.u.ms can"t handle it, except in a very limited way."
Fitzhugh had been making his points by tapping them off on his fingers with the stem of his unlighted pipe. Now he shoved the pipe back in his pocket and clasped his hands behind his back.
"It all adds up to this: Snook.u.ms _must_ be allowed the freedom of the ship. At the same time, every one of us must be careful not to ... to push the wrong b.u.t.tons, as it were.
"So here are a few _don"ts_. Don"t get angry with Snook.u.ms. That would be as silly as getting sore at a phonograph because it was playing music you didn"t happen to like.
"Don"t lie to Snook.u.ms. If your lies don"t fit in with what he knows to be true--and they won"t, believe me--he will reject the data. But it would confuse him, because he knows that humans don"t lie.
"If Snook.u.ms asks you for data, qualify it--even if you know it to be true. Say: "There may be an error in my knowledge of this data, but to the best of my knowledge...."
"Then go ahead and tell him.
"But if you absolutely don"t know the answer, tell him so. Say: "I don"t have that data, Snook.u.ms."
"Don"t, unless you are...."
He went on, but it was obvious that the officers and crew of the _William Branch.e.l.l_ weren"t paying the attention they should. Every one of them was thinking dark gray thoughts. It was bad enough that they had to take out a ship like the _Brainchild_, untested and jerry-built as she was. Was it necessary to have an eight-hundred-pound, moron-genius child-machine running loose, too?
Evidently, it was.
"To wind it up," Fitzhugh said, "I imagine you are wondering why it"s necessary to take Snook.u.ms off Earth. I can only tell you this: Snook.u.ms knows too much about nuclear energy."
Mike the Angel smiled grimly to himself. Ensign Vaneski had been right; Snook.u.ms was dangerous--not only to individuals, but to the whole planet.
Snook.u.ms, too, was a juvenile delinquent.
10
The _Brainchild_ lifted from Antarctica at exactly 2100 hours, Greenwich time. For three days the officers and men of the ship had worked as though they were the robots instead of their passenger--or cargo, depending on your point of view.
Supplies were loaded, and the great engine-generators checked and rechecked. The ship was ready to go less than two hours before take-off time.
The last passenger aboard was Snookums, although, in a more proper sense, he had always been aboard. The little robot rolled up to the elevator on his treads and was lifted into the body of the ship. Miss Crannon was waiting for him at the air lock, and Mike the Angel was standing by. Not that he had any particular interest in watching Snookums come aboard, but he did have a definite interest in Leda Crannon.
"Hello, honey," said Miss Crannon as Snookums rolled into the air lock.
"Ready for your ride?"
"Yes, Leda," said Snook.u.ms in his contralto voice. He rolled up to her and took her hand. "Where is my room?"
"Come along; I"ll show you in a minute. Do you remember Commander Gabriel?"
Snook.u.ms swiveled his head and regarded Mike.
"Oh yes. He tried to help me."
"Did you need help?" Mike growled in spite of himself.
"Yes. For my experiment. And you offered help. That was very nice. Leda says it is nice to help people."
Mike the Angel carefully refrained from asking Snookums if he thought he was people. For all Mike knew, he did.
Mike followed Snookums and Leda Crannon down the companionway.
"What did you do today, honey?" asked Leda.