Monday, March 12, 2007

Machine Morality

In his 1942 science fiction story "Runaround," Isaac Asimov introduced his three laws of robotics. These were to be built into every (fictional) robot and were intended to make robots safe and productive for humans to use. In essence, they constituted a kind of machine morality.

Roughly stated, these laws, in order of priority, say a robot must:

1. Protect humans
2. Obey orders, except when that conflicts with the first law
3. Protect itself, except when that conflicts with the first or second law

Sixty-five years later, we still have only robots that vacuum floors and mow lawns. However, we have a plethora of other electronic devices that ring, flash, chirp, vibrate, and talk to get our attention. These intrude ever more aggressively into our lives.

So it's time for additional, more sophisticated rules. I propose that machines must also:

4. Behave as reasonably expected
5. Encourage correct use (and discourage incorrect use)
6. Avoid unnecessary side-effects
7. Avoid unnecessary resource requirements
8. Work under all the circumstances in which they're intended to be used
9. Be fun to use

These are, of course, obvious to anyone who has ever studied human factors, usability, or product design in general. Sadly, however, they are seldom followed in practice. So perhaps by collecting these laws and their corollaries into a list, we can make a kind of checklist for product designers.

Others?

Thursday, March 1, 2007

Why I Don't Post More Often

I'm busy.

But seriously: I have a bunch of ideas, but I like to let them ripen before putting them out here. I'll be back with something soon.