Autonomous drones may have been used to hunt down and attack human targets in combat for the first time, according to a United Nations Security Council report. Armed with AI and an explosive warhead, the killer drones were deployed during a military offensive launched in March 2020 as part of the Libyan Civil War, and were programmed to attack targets on their own, without needing to be remotely operated by a human.
During an offensive launched in late March 2020, ground-based Haftar Affiliated Forces (HAF) units were hunted and attacked by autonomous drones deployed by Government of National Accord Affiliated Forces (GNA-AF). According to the UN report, “logistics convoys and retreating HAF were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the STM Kargu-2 and other loitering munitions.”
Unlike typical remotely-operated drones, these weapon systems are autonomous, “programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability,” the report says. It goes on to say that GNA-AF’s drones, both autonomous and remotely-operated, proved to be “a significant force multiplier for the [GNA-AF] ground units” that forced HAF units into a disorganized retreat. “Once in retreat, they were subject to continual harassment from the unmanned combat aerial vehicles and lethal autonomous weapons system,” the report adds.
The STM Kargu itself is a 60-centimeter (2-foot) quadcopter drone built in Turkey that can be armed with an explosive payload for kamikaze attacks. According to the manufacturer, the device uses a facial-recognition system that can seek out a specific individual, and large swarms of the small, low-cost drones can effectively overwhelm advanced air defense systems.
The report doesn’t specify whether any HAF soldiers were injured or killed in the attacks, although it did say that the “unmanned combat aerial vehicles and lethal autonomous weapons systems” proved to be a “highly effective combination in defeating the United Arab Emirates-delivered Pantsir S-1 surface-to-air missile systems.” Regardless of whether the attacks resulted in casualties, the report states that the use of these weapons was “in non-compliance… with resolution 1970 (2011),” the UN Security Council resolution that imposed an arms embargo on Libya in response to the government’s use of lethal force against protesters at the outset of the 2011 Libyan Civil War.
“There are serious doubts that fully autonomous weapons would be capable of meeting international humanitarian law standards, including the rules of distinction, proportionality, and military necessity, while they would threaten the fundamental right to life and principle of human dignity,” states Human Rights Watch in its opposition to the development and deployment of killer robots. “Human Rights Watch calls for a preemptive ban on the development, production, and use of fully autonomous weapons.”
In a perfect world, such a use of AI would be classified as a war crime.