Can we trust killer robots and drones to act ethically in future wars?

Heather Zeiger
August 31, 2018
Reproduced with Permission

Technology may change -- but how and why we use it shouldn't. Even when we consider the battlefield of the future, the ethics of a just war are still important guides for how we should use technology.

A recent New York Times op-ed by Robert H. Latiff, a retired Air Force major general and author of Future War: Preparing for the New Global Battlefield, calls for a discussion on ethics in modern warfare. He is concerned with how new technologies such as augmented soldiers, autonomous weapons, and remotely operated drones change not only the nature of warfare, but also the nature of the soldier. Will the modern soldier have the moral capital to make difficult split-second decisions? Will he adapt to the technologies that he is using?

As Latiff points out, "War is a deeply human activity because the battlefield gives rise to such feelings as courage, fear, cruelty, remorse, altruism, guilt, sacrifice, and empathy". These feelings serve to restrain behavior and guide ethical considerations within the context of the situation, something that Latiff believes can be lost when technology allows distance from the battlefield.

Drone Wars UK, a website devoted to monitoring how the UK uses drones in war zones and in combat situations, conducted an interview with a former pilot of the Reaper drone. This pilot flew Reapers over Afghanistan for the RAF from Creech Air Force Base in Nevada. (See here for the full interview.)

The pilot, whose real name was not used for security reasons, dislikes the term "PlayStation mentality" to describe the emotional distance that some people believe a drone pilot has when engaging an enemy. In his experience, once he got into the pilot's seat of a drone, it was very much like piloting a fast jet. Much of the information he received was in the same form as when he flew in a cockpit. In the pilot's words,

"Once you are in that seat, once you are flying the aircraft, your mind is very much 'I am in this aircraft'…There is the potential for you to feel that what you are doing isn't real and there are no direct consequences. But I think that would only occur for someone who had not themselves sat in an aircraft and been shot at."

The pilot says that when it comes to the rules of engagement, it is the same for Reapers as it is for manned aircraft. The ultimate decision to engage comes down to the pilot.

A common critique is that drone pilots receive information from multiple sources: their commanding officer, the person assessing the information the drone is collecting, and legal experts advising on whether it is appropriate to engage. Critics see this kind of group decision-making as absolving any particular person of responsibility when something goes wrong.

The Reaper pilot said that this was not the case for him. While he received instructions, the ultimate decision on whether to engage was his -- the same standard that applies to jet pilots. He said there were a couple of instances when he decided not to engage the target even though he had been given clearance.

While the Reaper pilot and Latiff may differ in some of their views, they agree on two points: first, the decision to kill should always be made by a human being, not a robot or algorithm; and second, the rules of engagement should not change just because the technology has.

Empathy, experience, and screens

The Reaper pilot was able to see himself in the cockpit of an aircraft being shot at because he had been in that situation. He was able to have empathy. His comments, though, bring up a different concern from the "PlayStation mentality."

Perhaps, instead, we should be more worried about the "smartphone effect." Studies have shown that people who spend more time socializing over text or through social media tend to lack empathy. Sherry Turkle's book Reclaiming Conversation discusses the lack of empathy that many teens and college students have as a result of not having face-to-face conversations. Like the Reaper pilot, people who have real-life conversations can put themselves in the other's shoes, or have empathy for the person on the other side of the text message.

The Reaper pilot is able to make a nuanced decision about whether or not to engage because he understands that he is dealing with a real human being. But what about the pilot who has not flown an aircraft or been shot at? Is this pilot more like the young smartphone addict who is emotionally and socially stunted because he has so little practice with real-life interactions?

The principles of a just war assume that we are not to encounter the enemy in a no-holds-barred rage, but that we see him first as a fellow human being, and second as one who threatens other human beings. This requires empathy. It also means that war is a last resort, not a first response; that civilian casualties and collateral damage should be minimised; and that a response should be proportional to the threat. Even in the face of enemies who do not hold a high view of human dignity, the response is not to match them brute for brute, but to maintain the moral high ground.

Automation and information

It is a diminished view of ethics that assumes complex battlefield decisions can be made with information and algorithms alone. This reduces ethics to a mere calculation. Latiff and the Reaper pilot, both of whom have worked with sophisticated weaponry, adamantly oppose automated weapons because robots lack the capacity for moral deliberation.

Human Rights Watch recently published a report on the ethics of "killer robots" ahead of a United Nations meeting on the topic this week. Human Rights Watch opposes autonomous robots on the battlefield because "killer robots" violate the Martens Clause, which says that anything not specifically covered by international humanitarian law (including new technologies) must satisfy the principles of humanity and the dictates of public conscience. The principles of humanity require the "humane treatment of others and respect for human life and human dignity," which the authors of the report say a robot cannot provide, and public opinion polls show that the public does not want killer robots on the battlefield.

A machine, no matter how sophisticated its algorithm, cannot have a moral intuition, which is exactly what is required for a just war. Autonomous weapons devalue the dignity of the individual, even the individual enemy, when they kill a human being because of a calculation by an algorithm.

The case for morality

Latiff recounts the experience of a US Army chaplain who was deployed several times in the Iraq war. The chaplain served in a unit in which several soldiers had died and many more had been wounded. Their commander was not a strong leader, and many of the soldiers were "anguishing from the effects of war and the wounded soul." The chaplain told Latiff that he wished all soldiers understood justice and morality as described in Just War Theory because it "provides soldiers with an emotional structure that serves to soften or mitigate the damage war does to their souls".

By having a firm understanding of when war is morally justified and when it isn't, and of what constitutes proper conduct and what doesn't, soldiers can see their role within a larger context of protecting those who cannot protect themselves, rather than sinking into nihilistic despair.

Interestingly, the chaplain believed that a utilitarian calculation actually causes psychological harm to soldiers:

"…morally neutral modes of being that ignore ethics and couch combat operations in more utilitarian terms exacerbate moral injuries and frustrate [the soldier's] treatment."

The chaplain told Latiff that the soldiers need to know there are rules that govern war, that it is not merely "murder and mayhem."

No matter how sophisticated the technology becomes, people need a moral foundation on which to rest, and that foundation needs to be able to withstand a combat situation and the life-and-death decisions made within it.

Principles, like those outlined in Just War Theory, and a virtue ethic that exhorts soldiers to have a certain kind of character have withstood the test of time. They are based on a morality that says there is such a thing as divine authority and that authority has conferred dignity upon all human beings. It's also a morality that acknowledges the difficulty of living in a fallen world where conflicts will inevitably occur.


