Drones Will Soon Use Artificial Intelligence to Decide Who to Kill

The US Army recently announced that it is developing the first drones that can spot and target vehicles and people using artificial intelligence (AI). This is a big step forward.

Whereas current military drones are still controlled by people, this new technology will decide who to kill with almost no human involvement.

Once complete, these drones will represent the ultimate militarisation of AI and trigger vast legal and ethical implications for wider society.

There is a chance that warfare will move from fighting to extermination, losing any semblance of humanity in the process.

At the same time, it could widen the sphere of warfare so that the companies, engineers and scientists building AI become valid military targets.

Existing lethal military drones like the MQ-9 Reaper are carefully controlled and piloted via satellite. If a pilot drops a bomb or fires a missile, a human sensor operator actively guides it onto the chosen target using a laser.

Ultimately, the crew has the final ethical, legal and operational responsibility for killing designated human targets. As one Reaper operator states:

“I am very much of the mindset that I would allow an insurgent, however important a target, to get away rather than take a risky shot that might kill civilians.”

Even with these drone killings, human emotions, judgments and ethics have always remained at the centre of war. The existence of mental trauma and post-traumatic stress disorder (PTSD) among drone operators shows the psychological impact of remote killing.

An MQ-9 Reaper pilot. (Photo: US Air Force)

This points to one possible military and ethical argument, made by Ronald Arkin, in support of autonomous killing drones: if these drones drop the bombs, psychological problems among crew members might be avoided.

The weakness in this argument is that you don’t have to be responsible for killing to be traumatised by it. Intelligence specialists and other military personnel regularly analyse graphic footage from drone strikes.

Research shows that it is possible to suffer psychological harm by frequently viewing images of extreme violence.

When I interviewed over 100 Reaper crew members for an upcoming book, every person I spoke to who conducted lethal drone strikes believed that, ultimately, it should be a human who pulls the final trigger. Take out the human and you also take out the humanity of the decision to kill.

Grave consequences

The prospect of totally autonomous drones would radically alter the complex processes and decisions behind military killings. But legal and ethical responsibility does not somehow just disappear if you remove human oversight. Instead, responsibility will increasingly fall on other people, including artificial intelligence scientists.

The legal implications of these developments are already becoming evident. Under current international humanitarian law, “dual-use” facilities – those which develop products for both civilian and military application – can be attacked in the right circumstances.

For example, in the 1999 Kosovo War, the Pancevo oil refinery was attacked because it could fuel Yugoslav tanks as well as civilian cars.

An Air Force RPA reconnaissance drone is retrofitted for use in an attack squadron. (Photo: U.S. Air Force)

With an autonomous drone weapon system, certain lines of computer code would almost certainly be classed as dual-use. A company like Google, its employees or its systems could become liable to attack from an enemy state.

For example, if Google’s Project Maven image recognition AI software is incorporated into an American military autonomous drone, Google could find itself implicated in the drone “killing” business, as might every other civilian contributor to such lethal autonomous systems.

Ethically, there are even darker issues still. The whole point of the self-learning algorithms this technology uses – programs that independently learn from whatever data they can collect – is that they become better at whatever task they are given.

If a lethal autonomous drone is to get better at its job through self-learning, someone will need to decide on an acceptable stage of development – how much it still has to learn – at which it can be deployed.

In militarized machine learning, that means political, military and industry leaders will have to specify how many civilian deaths will count as acceptable as the technology is refined.
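
To make that abstraction concrete, the sketch below (in Python, using entirely hypothetical names and numbers) shows what such a "deployment gate" might look like in code: a model is only cleared for use once its rate of misidentifying civilians in simulated engagements falls below a threshold that someone has chosen in advance. The uncomfortable part is not the code – it is that a human has to pick the number.

```python
# Minimal sketch (hypothetical): a "deployment gate" for a self-learning
# targeting model. All names and figures here are illustrative, not drawn
# from any real system. The point is simply that somebody must hard-code
# the acceptable error rate before deployment.

from dataclasses import dataclass


@dataclass
class EvaluationResult:
    total_engagements: int            # simulated engagements in the test set
    civilian_misidentifications: int  # times a civilian was classed as a target


def misidentification_rate(result: EvaluationResult) -> float:
    """Fraction of simulated engagements in which a civilian was targeted."""
    if result.total_engagements == 0:
        return 0.0
    return result.civilian_misidentifications / result.total_engagements


def cleared_for_deployment(result: EvaluationResult,
                           acceptable_rate: float) -> bool:
    """The ethically loaded decision: 'acceptable_rate' is the number that
    political, military and industry leaders would have to choose."""
    return misidentification_rate(result) <= acceptable_rate


# Example: a model that misidentified 3 civilians in 1,000 simulated
# engagements, judged against an arbitrarily chosen 0.5% threshold.
latest_evaluation = EvaluationResult(total_engagements=1000,
                                     civilian_misidentifications=3)
print(cleared_for_deployment(latest_evaluation, acceptable_rate=0.005))  # True
```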

Recent experiences of autonomous AI in society should serve as a warning. Uber and Tesla’s fatal experiments with self-driving cars suggest it is pretty much guaranteed that there will be unintended autonomous drone deaths as computer bugs are ironed out.

If machines are left to decide who dies, especially on a grand scale, then what we are witnessing is extermination.

Any government or military that unleashed such forces would violate whatever values it claimed to be defending. In comparison, a drone pilot wrestling with a “kill or no kill” decision becomes the last vestige of humanity in the often inhuman business of war.

This article was amended to clarify that Uber and Tesla have both undertaken fatal experiments with self-driving cars, rather than Uber experimenting with a Tesla car as originally stated.
