As India prepares to induct more unmanned aircraft systems (UAS) with defensive and offensive capabilities to neutralize future threats from Pakistan and China, a new report from Britain’s Ministry of Defence (MoD) has warned that the increasing use of such aircraft may create a ‘Terminator-like’ world and make conflicts more likely.
The internal report, titled “The UK approach to Unmanned Aircraft Systems”, raises moral, legal, and scientific issues concerning the excessive use of unmanned systems in conflicts across the globe.
These aircraft can be launched in combat areas and are flown by “remote control, via satellite links, or by pilots sitting thousands of miles away,” says Chris Bowlby, producer of Robo War.
Currently, unmanned aircraft systems are being developed and deployed by around 50 countries. They are used extensively in Afghanistan, Pakistan, Yemen, some Latin American countries, and now Libya.
The report states that the use of unmanned aircraft systems removes the risk to one’s own forces in warfare, which raises significant ethical issues. “For war to be moral (as opposed to just legal) it must link the killing of enemies with an element of self-sacrifice, or at least risk to oneself,” the report argues.
Another research report, written by Chris Cole, raises a similar issue, stating that “the core concern with regards to the use of armed drones is the ‘Playstation mentality’ whereby the geographical and psychological distance between the drone operator and the target lowers the threshold in regard to launching an attack and makes it more likely that weapons will be launched.”
Cole’s report also highlights the vulnerability of unmanned drones to technical failures and hacking. “Far from resolving conflicts, their indiscriminate nature is fueling further anger, mistrust and division between human communities and perpetuating cycles of violent conflict,” his report adds. Reports of civilian casualties in Afghanistan and Pakistan as a result of drone attacks have already been used by extremist elements to fuel anger against the US forces.
The British MoD report also states that the extensive use of unmanned systems in places like Pakistan and Afghanistan has facilitated the use of force where it was previously almost impossible to operate. As a result, such systems have increased the likelihood of war.
Questions have also been raised about the possibility of fielding fully autonomous armed systems. “To a robotic system, a school bus and a tank are the same – merely algorithms in a programme – and the engagement of a target is a singular action; the robot has no sense of ends, ways and means, no need to know why it is engaging a target. There is no recourse to human judgement in an engagement, no sense of a higher purpose on which to make decisions, and no ability to imagine (and therefore take responsibility for) repercussions of action taken,” the report states.
The report states that the desire to save manpower costs and lives will drive demand for more unmanned systems with greater autonomy.
According to a US Department of Defense presentation, the US has at least 7,000 drones (one for every 50 troops in Afghanistan). Senior US Air Force officials argue that it no longer makes sense to spend $1 million training a pilot (who can fly manned aircraft) to operate a drone, and that future drone operators may not be pilots at all.
And with new software being developed to enable drones to think like pilots and avoid mid-air collisions with other drones and manned aircraft, there are also fears that the role of humans in operating such lethal systems will shrink drastically.
“Such a scenario may lead us towards a Terminator-like reality,” the report argues. It calls for immediate action to determine what would constitute “acceptable machine behaviour”.
(This article first appeared on the websites of the Centre for Land Warfare Studies and the Indian Defence Review on April 25, 2011)