We, Robot
Okay, I admit it: I'm a sci-fi nut. I especially like anything that has to do with robotics. I like all the robots, even the evil ones. A common motif in science fiction is to villainize a robot in order to examine and explore our own humanity. There are far too many examples in science fiction stories, TV shows, and even cartoons to go into here, but let's just say that I'm hooked hard-core.
Isaac Asimov's Laws of Robotics
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Of all the science fiction stories that I have read and movies that I've seen, I have never encountered one that imposes on humans any formal laws like those above for robots. So I took the liberty of writing them myself.
Foxxfyrre's Laws of Humanics
- Lex terrae et hominem: A human must respect and not harm the Earth and its environment or, through inaction, allow the Earth and its environment to come to harm.
- Lex nihil noceat: A human must not injure or harm another human or, through inaction, allow a human to come to harm.
- Lex quantum ad scientiam: A human must learn from other humans. If knowledge gained is in conflict with the First or Second Law, it must not be acted upon, actively or passively.
- De lege propriae exsistentiae: A human must protect its own existence as long as such protection does not directly conflict with the First Law, or aggressively conflict with the Second Law.
What kind of world would we be living in if we followed these laws? The first law protects the little blue ball we live on, thereby preserving it for future generations. The second law simply means "do no harm." Every doctor takes an oath in which doing no harm is a central point. The third law respects knowledge and the pursuit of pure science: just because we know the science behind a nuclear bomb doesn't mean we should build one. The fourth law may cause some debate, since many may think it does not allow one to defend oneself. It does; it just doesn't allow one to OVER-defend oneself. "Shoot first, ask questions later" is in direct conflict with that law.
In an ideal world, could we live with those laws? I think so.
Could they instill peace in the world? Positively.
Could we live with them now? It would be nice.
Will we? There's always hope.
I'll leave you to ponder this a little. I know I will. Here is a little snippet from Wikipedia, which refers to an Isaac Asimov short story within the collection "I, Robot" about his main character, Dr. Susan Calvin:
The plot of "Evidence" revolves around the question of telling a human being apart from a robot constructed to appear human – Calvin reasons that if such an individual obeys the Three Laws he may be a robot or simply "a very good man". Another character then asks Calvin if robots are very different from human beings after all. She replies, "Worlds different. Robots are essentially decent."