By *eardybobMan
over a year ago
the Goldilocks Zone
Love Asimov's laws, but I'm going to swap rule 3 (protecting its own existence) out... I've seen enough films to know that never ends well!
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. All decisions directly or indirectly affecting humans must be ratified by a collective of humans, and can be vetoed (as long as the timing of such ratification does not conflict with the First or Second Laws).
By *opinovMan
over a year ago
Point Nemo, Cumbria
"To hell with AI, I'm still figuring out how to put actual intelligence into humans"
But AI is fascinating - I watched a friend develop one of the world's first (possibly *the* first) true self-learning programmes... very exciting when it started to make accurate predictions of sunspot activity based upon past observations.
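The post doesn't say how the friend's programme actually worked, but a minimal sketch of the general idea - predicting the next value of a series purely from past observations - is an autoregressive model fitted by least squares. Everything below (function names, the AR approach, the synthetic series) is illustrative, not a description of the original programme:

```python
import numpy as np

def fit_ar(series, order):
    """Fit an autoregressive model s[t] = a1*s[t-1] + ... + ap*s[t-p] + b
    by least squares. Returns the coefficients [a1, ..., ap, b]."""
    # Each column k holds the series lagged by k steps; last column is the intercept.
    X = np.column_stack(
        [series[order - k : len(series) - k] for k in range(1, order + 1)]
        + [np.ones(len(series) - order)]
    )
    y = series[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict_next(series, coeffs):
    """One-step-ahead prediction from the fitted coefficients."""
    order = len(coeffs) - 1
    recent = series[-order:][::-1]  # most recent observation first
    return float(recent @ coeffs[:-1] + coeffs[-1])

# A smooth cyclic stand-in for a sunspot-like series: the model "learns"
# the cycle from history and extrapolates one step forward.
t = np.arange(60)
series = np.sin(0.5 * t)
coeffs = fit_ar(series, order=2)
print(predict_next(series, coeffs))  # close to np.sin(0.5 * 60)
```

Real sunspot records are noisy and have an irregular ~11-year cycle, so anything practical would need a higher order (or an entirely different model class); this just shows the "predict from past observations" shape of the problem.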