I was going to mention the AI theme as well.
Originally Posted by catfish_head
It has become old hat in sci-fi: the archetypal artificial intelligence that turns on its maker. 2001, The Matrix, I, Robot, etc. The basic concept is that if you create an entity with consciousness, free will, and the ability to learn, it is inevitable that you will lose control over it (if you ever had control in the first place).

Some may want to argue that the original idea is seeded in Mary Shelley's "Frankenstein; Or, The Modern Prometheus". But I think it is no coincidence that the concept closely parallels the story of God and Man. God (the creator of the intelligent species, Man) loses control of Man when Man gains the knowledge of Good and Evil by partaking of the fruit in the Garden of Eden, thereby gaining both the ability to learn and the free will to disobey his master. The story of Original Sin kicked off an eternal struggle between Man exercising his free will (sometimes to his peril) and God attempting to instill "morals" in him. It may also seem a familiar story if you have ever met a teenager.
The authors of the various incarnations of the archetypal story have usually wanted to impart a moral to the reader, and it often took one of two shapes: either mankind is evil and this higher intelligence sees it, or the cold, godless intelligence of the entity is inherently evil and wants to rule the world (a comment on Old World Communism, perhaps... maybe another thread). Of course, the story has to be thrilling and gripping, so some form of struggle ensues, and the result is that the machine must be destroyed or it will destroy.
In his classic short story "Runaround", Isaac Asimov introduced his Three Laws of Robotics. In one sense, the Three Laws are man's parallel to the Ten Commandments: an attempt to integrate a moral code into his creation (if only for his own protection).
1.) A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2.) A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3.) A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.
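As an aside, the Laws form a strict precedence hierarchy, and that structure is easy to sketch in a few lines of code. Here is a toy illustration of my own (nothing from Asimov; the Action fields are invented for the example): candidate actions are ranked lexicographically, so the First Law outranks the Second, and the Second outranks the Third.

from dataclasses import dataclass

@dataclass
class Action:
    description: str
    harms_human: bool      # would this action (or inaction) harm a human?
    obeys_orders: bool     # is it consistent with standing human orders?
    preserves_self: bool   # does the robot come out intact?

def law_rank(a):
    # Python compares tuples element by element, and True sorts above
    # False, so max() favors whichever action satisfies the
    # highest-priority Law: a lexicographic ordering where the First
    # Law outranks the Second, and the Second outranks the Third.
    return (not a.harms_human, a.obeys_orders, a.preserves_self)

candidates = [
    Action("follow the order into danger",
           harms_human=False, obeys_orders=True, preserves_self=False),
    Action("retreat and ignore the order",
           harms_human=False, obeys_orders=False, preserves_self=True),
]

print(max(candidates, key=law_rank).description)
# -> follow the order into danger (the Second Law beats the Third)

The fun of "Runaround", of course, is that in the story the priorities aren't this crisp: a casually given order (a weak Second Law pull) balanced against an abnormally strengthened Third Law leaves the robot circling an equilibrium point, obeying neither cleanly.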
"I am the Flail of God. If you had not committed great sins, God would not have sent a punishment like me upon You."
- Genghis Khan