While I sometimes write about cool (but hopelessly unrealistic) robots that either re-enact World War II's Pacific Theater in space or protect child stars from homicidal time-traveling robots from the future, Vdawg actually studies and works on military robots. Today, he reports on an interesting conversation he had with some folks at Northrop Grumman.
Vdawg begins by pointing out the PR problems with robot autonomy on the battlefield:
"We find it fine to automatically target tanks, planes, bunkers, or guns, but not people, because we are (a) afraid that machines do not have sufficient discriminatory intelligence necessary to distinguish between friend and foe with acceptable accuracy, and (b) it is just a taboo – the thought of US having robots killing people even on the battlefield is a revolting one to some of the constituents of the American populace. On the other hand, if we target a tank or a fighter jet then it is certain that who is inside is a combatant. And plus this isn’t directly killing humans. For some reason the illusion of indirect killing is apparently enough to restrain the gag reflex of those who tout ‘humane wars’."
So if your business is making robots that make things go boom on their own, you have to change perceptions a bit. How might this happen? Vdawg's interviewees point out several things. First, autonomous robotics will likely be deployed domestically at first, in a law enforcement context. Second, Vdawg and his Northrop Grumman interviewees agree that existing conventional systems will likely become more and more remote over time: a guided munition, for example, might start out "wired" and gradually progress to being operated entirely off-site. Lastly, stronger verification tests for the fitness of autonomous systems might give the public confidence.
It's important to point out, as Vdawg and his interlocutors do, that the degree of autonomy and control will vary by weapons system, political mission, and role. Contrary to popular perception, not everything is "Skynet." Vdawg is also emphatic on the point that dirty systems wielded by strategic spoilers will drive innovation faster than more risk-averse states bound by iron Rules of Engagement can match. Since those states will be loath to grant their adversaries a tactical advantage, Vdawg argues, systems will be pushed into the field regardless of their maturity.