The Ethical Concerns Of Drone And Automated Warfare

By Brendan O'Halloran

In recent years, drone warfare has become a tactic on which the United States is increasingly reliant. This is despite mounting evidence that drone strikes and other forms of automated warfare that separate humans from the consequences of conflict are unreliable at best and pose a growing danger to noncombatants. The insistence on not just maintaining but ramping up the use of automated weaponry reveals a classical realist perspective in the US government's strategy regarding conflict in the Middle East: a belief that the wildly disproportionate damage caused by drone strikes is justifiable because the strikes supposedly increase the power and influence of the United States. This stance, and the attacks that result from it, are ethically bankrupt.

Despite long-standing claims that drones are the precise weapons of the future, the evidence shows that they are wildly unsafe for noncombatants. Between 2002 and 2013, drone strikes caused an estimated 2,200 civilian deaths, which the United States denied for years. [1] And that was before the government began to use drone strikes extensively; as the frequency of attacks has risen, civilian casualties have increased 52% under the new presidential administration. This isn't surprising, considering the incident in which President Trump, while watching footage of a strike, questioned why the CIA waited for a target's family to be clear of the blast. [2] Given the massive number of civilian casualties, not to mention the damage to infrastructure, this is deeply concerning. One of the arguments in favor of drone strikes is that they're supposedly more effective because they don't need to eat or sleep, an argument centered on the idea that they're "better" at warfare because they're not bound by human limitations; the casualty figures above suggest otherwise. It's also important to consider that the deaths of innocents at the hands of US drones outrage, and possibly radicalize, civilians in the Middle East, weakening American influence, destabilizing the region, and creating more enemies. Furthermore, by carrying out drone strikes in nations like Pakistan without the approval of their governments, and possibly killing their citizens in the process, the United States strains its international diplomatic standing. [3] Even within a realist framework, drone warfare may be doing more harm than good.

In addition, the advent of lethal autonomous weapons (LAWs) raises further concerns. Drone warfare is one thing, but it at least keeps a human being on the trigger. Ethically, LAWs are an entirely different matter. They choose their own targets using technology similar to that of self-driving cars, which is notoriously error-prone. [4] They certainly could kill a large number of terrorists, but they could also target noncombatants, and they have no qualms about collateral damage. The robotics community has circulated petitions against their development and use, and several figures have raised the concern that extremist groups could turn them into terror weapons against civilian populations. [5] The very existence of LAWs points toward a world in which the government cedes lethal decisions to a machine. No matter how well it's programmed, a machine can't make moral decisions. LAWs are simply the endpoint of the government's current realist philosophy: a sacrifice of moral and ethical concerns in favor of the most effective soldier, all to consolidate American influence.

So the question remains: If automated weaponry is so unreliable, why has the current presidential administration escalated its use? The truth is that drone strikes are an easy way to sanitize war for the public. The conflict in Afghanistan has been raging for seventeen years, the longest war in American history, and for most American citizens its only constant reminder has been the soldiers sent home in caskets. While drone strikes do keep American soldiers out of harm's way, their main benefit to the government is that they allow it to continue fighting a hopeless war without the public remembering that it's happening. The war of the future is still one in which noncombatants are regularly harmed.

Ultimately, the United States government's current path places effective warfare above ethical concerns. The only solution, one which seems increasingly unlikely, is to scale back the use of automated and autonomous weapons. (Ending the war would be another, but that's an entirely separate matter.) Warfare is already brutal and inhumane when people are involved; deploying tools of war that can't distinguish between threats and innocents will only make it worse.

Brendan O'Halloran is a sophomore at the College of William and Mary.


Works Cited

[1] https://www.lawfareblog.com/civilian-casualties-collateral-damage

[2] https://motherboard.vice.com/en_us/article/7xmadd/trump-escalating-americas-drone-war

[3] https://www.theverge.com/2018/12/5/18127785/one-nation-under-drones-john-jackson-military-war-weapons-interview

[4] http://www.nytimes.com/2018/11/15/magazine/autonomous-robots-weapons.html

[5] http://www.nytimes.com/2018/11/15/magazine/autonomous-robots-weapons.html