“I, for One, Welcome Our New [Robot] Overlords”
Posted on January 30th, 2008 at 5:22 pm by Steve

Schematic of an Ethical Autonomous (Lethal) Robot

Following U.S. Immigration and Customs Enforcement’s use of Kafka as an instruction manual, we now have a senior robotics researcher taking a page from Verhoeven’s RoboCop:

Nonetheless, the trend is clear: warfare will continue and autonomous robots will ultimately be deployed in its conduct. Given this, questions then arise regarding how these systems can conform as well or better than our soldiers with respect to adherence to the existing Laws of War. This article focuses on this issue directly from a design perspective.

This is no simple task however. In the fog of war it is hard enough for a human to be able to effectively discriminate whether or not a target is legitimate. Fortunately for a variety of reasons, it may be anticipated, despite the current state of the art, that in the future autonomous robots may be able to perform better than humans under these conditions…

Pretty sweet, huh? Autonomous robot killers that will do a better job of protecting civilians and defeating opposing forces than human soldiers. What’s not to like?

Lacking from this overall affective approach is the ability to introduce compassion as an emotion at this time, which may be considered a serious deficit. It is less clear how to introduce such a capability, but by requiring the autonomous system abide strictly to the LOW [Laws Of War] and ROE [Rules Of Engagement], one could contend that it does exhibit compassion: for civilians, the wounded, civilian property, other noncombatants, and the environment. Compassion is already, to a significant degree, legislated into the LOW, and the ethical autonomous agent architecture is required to act in such a manner.

Mmm hmm. This guy actually believes that compassion is already legislated into the laws of war, and he further believes that military leaders will deploy robots that strictly adhere to those laws of war…

Don’t take my word for it: read the whole (117-page!) thing.