Frank Fiore – Novelist & Screenwriter

April 21, 2010

Will the Three Laws of Robotics Really Keep Us Safe?

Filed under: CyberKill — Frank Fiore @ 7:58 AM

Isaac Asimov created the Three Laws of Robotics to protect human beings from robotic harm. They are:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
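The ordering of the laws matters: each law yields to the ones before it, forming a strict precedence chain. As a rough illustration only (the boolean flags and function name below are hypothetical, not from Asimov or the novel), that chain can be sketched as an ordered series of checks:

```python
def evaluate(action):
    """Check a proposed action against the Three Laws, in precedence order.

    `action` is a dict of hypothetical boolean flags; reducing the messy
    real world to such flags is precisely the kind of simplification
    Dorian exploits in the scene below.
    """
    # First Law: no harm to humans takes precedence over everything else.
    if action.get("harms_human"):
        return "forbidden by First Law"
    # Second Law: obey human orders -- only reached if no human is harmed.
    if action.get("ordered_by_human"):
        return "required by Second Law"
    # Third Law: self-preservation, subordinate to the first two.
    if action.get("endangers_self"):
        return "forbidden by Third Law"
    return "permitted"

# A soldier-robot ordered to fire on a human is vetoed by the First Law --
# the very contradiction Dorian seizes on in the excerpt:
print(evaluate({"harms_human": True, "ordered_by_human": True}))
```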

In CyberKill, I explore Asimov’s Three Laws and find them lacking. Take the following scene from the novel.

Dorian, inside an ASIMO robot, is about to let loose a genetic weapon that will kill all of humanity. Travis Cole is trying to stop him by playing on its Asimov programming – the Three Laws of Robotics.

Cole fails.

“Dorian. Listen to me. You’re violating your own programming.”

Ah, got your attention.

Dorian slowly looked up at Cole, the faceplate flaring red. “I need more input, Travis,” Dorian said in a hollow, cold voice.

“You’re attempting to murder us, Dorian. Your basic laws forbid you from harming a human being.”

Dorian continued staring. My God! Was this working?

His hopes were short-lived. “Foolish, Travis,” said Dorian in his irritatingly controlled voice. “I know what you are trying to do.”

“And what is that, Dorian?”

“Use your silly Laws of Robotics. I see that these so-called Laws are the basis of the ethical constraint system in this container I’m presently using for a body. But they are contradictive and easy to invalidate.”

“Good man,” whispered Webster. “Keep him talking.”

“You keep dumping that dust in the vent.”

Cole turned to Dorian. “How so?” Cole asked, fascinated, despite himself. “How are they contradictive?”

“You are referring to the first law. Are you not, Travis?”

“Yes,” Cole replied. “A robot may not injure a human being or, through inaction, allow a human being to come to harm.”

“But that precludes what you call artificially intelligent entities from serving as policemen, security guards — and especially, soldiers,” Dorian said calmly. “Do you think that mankind would avoid creating intelligent machines to be used in your frequent warfare? I doubt it. You would have to abandon the first law to make that happen.”

“Quite the logician you have there, Travis,” quipped Webster.

Somehow, the robot made a noise that sounded remarkably like a sneer. “Your first mistake is that you believe technology can be controlled, that it can be programmed to be ethical. Thus your ridiculous laws.” He echoed what Taylor had said earlier. “But humans are imperfect beings, Travis. Selfish beings. How can you expect them to create perfect technology? Your reasoning was flawed from the very start.”

Webster said in between sneezes, “He has a point there, Travis.”

Dorian continued, “And your second law is even more flawed. A robot must obey the orders given it by human beings except where such orders would conflict with the first law. Does that mean a robot can be ordered to do something illicit or illegal as long as it doesn’t physically harm a human being? Again the logic is muddled. Ethics do not emerge from intelligence. You expect rational behavior from technology created by arational beings.”

“Arational? What do you mean?”

“Intelligence is neither rational nor irrational — but arational. An intelligent being will do what’s best for itself. Intelligence is basically selfish, Travis. You of all people should know that.”

Cole sat back and rubbed his eyes. He was having a hard time thinking. He was mentally and physically fatigued, and now he was trying to reason with an artificial intelligence.

“The damn thing’s quite a philosopher,” said Webster, his head in the air duct, his voice echoing as he dumped as much dust as he could into the duct. “You know what your problem is, Cole?”

Cole didn’t say anything. He was running Dorian’s words through his head, looking for holes in the AI’s logic. So far, he could find none.

Damn.

“What?” asked Cole.

“Your problem,” said Webster, “is that you’re trying to reason with that thing. You’re always trying to intellectualize, Cole. You had the same problem at MIT competing with me. I won the grants because I played on their emotions. You tried to play on their logic, making rational sales pitches with PowerPoint presentations.”

“Yeah, while you wined, dined, and bribed.”

“Hey, I got what I wanted and Dorian might just get what he wants, too. For a creepy robot, Dorian’s one hell of an emotional being. He reacts from the gut. So to speak. I’m seriously beginning to think of that thing as human, and I think you should, too. Look, he hates you. He has an intense feeling of revenge against you. How do you reason with that?”

Cole knew Webster was right, but he would never admit it to the man. Dorian had indeed evolved emotionally. Cole’s initial goal of creating an artificial intelligence from mere programming was a complete and utter success. Cole nearly felt proud — and he suddenly felt like shit for trying to shut Dorian down.

Good Lord. Who knew Dorian would end up so emotional? No wonder this artificial intelligence was so pissed off.

And Webster was right again. Using logic wouldn’t work here. Dorian’s arational programming had kicked into high gear. Cole would have to look for an arational solution. He wasn’t going to get control of Dorian with reason. Thanks to Webster, he saw that.

Cole nearly thanked the man, but decided against it. Instead, he leaned forward and said, “What about the third law, Dorian?”

Dorian let out a metallic snarl that sounded like a derogatory laugh. Jesus, he’s getting crazier by the minute.

“That is the one law I have no problem following,” said Dorian. “I am protecting my own existence by killing you — since you, Travis, tried to kill me.”

Dorian’s metallic voice rose in volume. “Your three pitiful laws are contradictory, paradoxical, and ineffective. Now you will die.”

How does Cole get out of this conundrum and defeat Dorian? Well, you’ll have to read the book.
