Friday, February 6, 2009

War 'bots, and Their Implications


You can talk all you want about network-centric warfare, P.W. Singer says in his new book, but right now the real revolution in military affairs is not the all-nodes-communicating-with-each-other-in-real-time model that program managers and military contractors can’t get enough of. Instead, it’s happening in the field of robotics, where huge strides are being made in developing drones and unmanned vehicles that—yes—can talk to one another, but that’s only part of the story.

Singer argues that while the networks are important, ‘history will care far more about what these linkages enable. That is, these new digital links are important, but not as much as the platforms they now allow. What will stand out, what is historic for war, and human history in general, are the robotic weapons now playing greater roles on the battlefield.’

Along with the increasing use and sophistication of these ‘bots comes a host of ethical and legal concerns that the military hasn’t yet begun to fully wrap its head around. Chief among them, in Singer’s engaging analysis, is the thorny issue of culpability that arises when these systems malfunction or, as all too often happens in the fog of war, engage the wrong targets, leading to civilian or friendly casualties.

As an example, Singer brings up the infamous incident in Afghanistan where an American Air Force pilot dropped a 500-lb bomb on a group of Canadian soldiers, mistaking them for Taliban or al Qaeda fighters making their way across the desert at night. In the inquest that followed, the American pilot was rightly found guilty of disregarding orders and the rules of engagement. But what would happen if the Canadians had been attacked by an autonomous UAV on a preprogrammed course, capable of identifying and engaging targets with limited human supervision? Who would be responsible for the mishap? The military or civilian contractor who wrote the faulty target-identification software? The squadron’s commanding officer? The commander who sent the drone out on the mission in the first place? These are issues we will have to face in the near future, and ones that have so far been given little thought.

Singer says that we need to start thinking about the consequences of using autonomous robots to do our fighting for us, and we need to start doing it now.

Before establishing a chain of responsibility, however, the military first needs to begin writing doctrine for the use of robotics. The ad hoc way robots are being used in Iraq and Afghanistan is not sustainable in perpetuity, and as systems grow in complexity and are able to act more and more autonomously, rules are increasingly needed. Just as the Brits had no doctrine for how to use tanks in WWI, and thus at times failed to use them to their full potential, no one has yet devised a doctrine for handling increasingly complex robots on the battlefield, which in the future will rely less and less on a human being ‘in the loop.’

One official at DARPA told Singer that with the increasing sophistication of robotic systems and the ever-expanding autonomy with which they operate, ‘the human is becoming the weakest link in defense systems.’ While that might be true in terms of surveillance, or in the simple brute terms of putting steel on target, it ignores the more complex situations that arise on any battlefield and demand a certain level of emotional intelligence—something only human beings possess. While a robot is great at sensing a threat and taking it out without hesitation, it is not as good at weighing the pros and cons of using force to achieve its objectives, something that is crucial in stability and counterinsurgency operations.

Given all this, when things do go wrong with unmanned systems—and they will—it’s critical that a chain of causality be established beforehand to ensure that someone, at some level, is responsible for the actions these systems take, since, as Singer writes, ‘by establishing at the start who is ultimately responsible for getting things right, it might hopefully add a dose of deterrence into the system before things go wrong.’ As J.F.C. Fuller once wrote, ‘the more mechanical become the weapons with which we fight, the less mechanical must be the spirit which controls them.’

"



(Via Ares.)
