South Koreans Working on the Three Laws of Robotics

Inspired by Isaac Asimov's Three Laws of Robotics, the South Korean Ministry of Commerce, Industry and Energy decided last November to create a written code of ethics for robots. Its stated goal is "to address and prevent robot abuse of humans and human abuse of robots".

The move was met with some criticism, most of it focused on how premature the idea is. Robot designer Mark Tilden weighed in, saying, "From experience, the problem is that giving robots morals is like teaching an ant to yodel. We're not there yet, and as many of Asimov's stories show, the conundrums robots and humans would face would result in more tragedy than utility."

What do you think?
How would you create a robot's moral code?

By the way, the Three Laws of Robotics (created by Asimov) are listed below, followed by a rough sketch of what they might look like as code...

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
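
If you wanted to take a first stab at that moral code in software, the laws boil down to a strict priority ordering: the First Law always outranks the Second, which always outranks the Third. Here is a toy Python sketch of just that ordering, nothing more. Everything in it is hypothetical: it pretends some perception layer has already labelled each candidate action with whether it harms a human, disobeys an order, or endangers the robot, which is exactly the hard part Tilden is pointing at.

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_human: bool      # First Law: injures a human, or lets one come to harm
    disobeys_order: bool   # Second Law: violates an order given by a human
    endangers_self: bool   # Third Law: risks the robot's own existence

def choose(candidates: list[Action]) -> Action:
    # Because False sorts before True, ordering by this tuple makes any First Law
    # violation outweigh a Second Law violation, and any Second outweigh a Third.
    return min(candidates, key=lambda a: (a.harms_human, a.disobeys_order, a.endangers_self))

# Ordered to do something harmful, the robot picks the disobedient but harmless option.
options = [
    Action("follow the order", harms_human=True, disobeys_order=False, endangers_self=False),
    Action("refuse the order", harms_human=False, disobeys_order=True, endangers_self=False),
]
print(choose(options).name)  # prints: refuse the order
```

The tuple comparison does all the work: an action that obeys an order but harms a human always loses to one that disobeys and harms no one, which is the "except where such orders would conflict with the First Law" clause in miniature. Deciding what counts as harm in the first place is the part nobody has code for.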