Isaac Asimov's The Three Laws of Robotics
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
If you like the concept, feel free to share it and comment!
Top-row at Pixologic's ZBrush Central.