Wanted: ethical robots, the sooner the better

Xavier Symons
30 Nov 2013
Reproduced with Permission
BioEdge

The ethics of creating autonomous and intelligent robots used to be a purely speculative question. But with the development of drones, self-navigating cruise missiles and self-directing armoured vehicles, roboticists realise that they must build ethical behaviour into their machines.

At the recent Humanoids 2013 conference in Atlanta, the world's largest annual gathering of roboticists, Ronald C. Arkin of the Georgia Institute of Technology urged colleagues to be aware of the capabilities of their creations: "these kinds of technologies you are developing may have uses in places you may not have fully envisioned."

Arkin's talk was entitled 'How NOT to Build a Terminator'.

"If you would like to create a Terminator, then I would contend: Keep doing what you are doing, because you are creating component technologies for such a device," he said. "There is a big world out there, and this world is listening to the consequences of what we are creating."

Gill Pratt of DARPA agreed with Arkin, saying, "with dual use being everywhere, it really doesn't matter. If you're designing a robot for health care, for instance, the autonomy it needs is actually in excess of what you would need for a disaster response robot."

Arkin and Pratt are not alone in their concerns. At a UN-sponsored meeting in Geneva earlier this month, parties to the Convention on Certain Conventional Weapons (CCW) agreed to reconvene in May to discuss the future of "autonomous weapons".
