“Killer robots: the soldiers that never sleep” by Simon Parkin

From the BBC, an article covering a South Korean automated machine gun.

“(…) In Daejeon, a city in central South Korea, a machine-gun turret idly scans the horizon. (…)

A gaggle of engineers standing around the table flinch as, unannounced, a warning barks out from a massive, tripod-mounted speaker. A targeting square blinks onto the computer screen, zeroing in on a vehicle that’s moving in the camera’s viewfinder. (…) The speaker (…) has a range of three kilometres. (…) “Turn back,” it says, in rapid-fire Korean. “Turn back or we will shoot.”

The “we” is important. The Super aEgis II, South Korea’s best-selling automated turret, will not fire without first receiving an OK from a human. (…)

The past 15 years have seen a concerted development of such automated weapons and drones. (…) Robots reduce the need for humans in combat and therefore save the lives of soldiers, sailors and pilots.

(…) The call from Human Rights Watch for an outright ban on “the development, production, and use of fully autonomous weapons” seems preposterously unrealistic. Such machines already exist and are being sold on the market – albeit with, as DoDAAM’s Park put it, “self-imposed restrictions” on their capabilities.

(…) Things become more complicated when the machine is placed in a location where friend and foe could potentially mix (…)

If a human pilot can deliberately crash an airliner, should such planes have autopilots that can’t be over-ruled?

Likewise, a fully autonomous version of the Predator drone may have to decide whether or not to fire on a house whose occupants include both enemy soldiers and civilians. (…) In this context the provision of human overrides makes sense. (…)

Some believe the answer, then, is to mimic the way in which human beings build an ethical framework and learn to reflect on different moral rules, making sense of which ones fit together. (…)  At DoDAAM, Park has what appears to be a sound compromise. “When we reach the point at which we have a turret that can make fully autonomous decisions by itself, we will ensure that the AI adheres to the relevant army’s manual. We will follow that description and incorporate those rules of engagement into our system.”
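As a purely illustrative aside: the safeguard Park describes amounts to layering machine-checkable rules of engagement underneath a mandatory human confirmation. The sketch below is a minimal, hypothetical Python rendering of that idea; the names (Target, roe_permits, request_human_authorisation) and thresholds are invented for illustration and do not describe DoDAAM’s actual software.

```python
# Hypothetical sketch of a human-in-the-loop engagement gate, as described in
# the article: rules of engagement must permit firing AND a human operator
# must explicitly authorise it. All names and values here are assumptions.

from dataclasses import dataclass


@dataclass
class Target:
    range_m: float            # distance to the target in metres
    identified_hostile: bool  # whether the system classifies it as hostile
    warned: bool              # whether an audible warning has been issued


def roe_permits(target: Target) -> bool:
    """Encode simple rules of engagement: warn first, and engage only
    identified hostiles within an assumed effective range."""
    return target.identified_hostile and target.warned and target.range_m <= 3000


def request_human_authorisation(target: Target) -> bool:
    """Stand-in for the operator's confirmation step; a real system would
    present sensor video and require a deliberate, logged input."""
    answer = input(f"Engage target at {target.range_m:.0f} m? [y/N] ")
    return answer.strip().lower() == "y"


def engagement_decision(target: Target) -> bool:
    # The turret never fires on its own judgement alone: the rules of
    # engagement must permit it, and a human must still say yes.
    return roe_permits(target) and request_human_authorisation(target)
```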

(…)

The clock is ticking on these questions. (…) Regardless of what’s possible in the future, automated machine guns capable of finding, tracking, warning and eliminating human targets, absent any human interaction, already exist in our world. Without clear international regulations, the only thing holding arms makers back from selling such machines appears to be the conscience, not of the engineer or the robot, but of the clients. “If someone came to us wanting a turret that did not have the current safeguards we would, of course, advise them otherwise, and highlight the potential issues,” says Park. “But they will ultimately decide what they want. And we develop to customer specification.”

 
