Post by Her3tiK on Sept 20, 2011 10:54:28 GMT -5
No, really. Apparently the military's got automated drones that can find targets without any input at all.
www.washingtonpost.com/national/national-security/a-future-for-drones-automated-killing/2011/09/15/gIQAVy9mgK_story.html?hpid=z1
Essentially, the idea proposed here is that an unmanned drone of some kind (the article mentions UAVs and automated cars) identifies a target based on facial recognition, then relies on internal programming to decide whether or not to kill that target. While these systems are supposed to await human instruction when a situation is too difficult for them to process, the whole idea still seems riddled with problems. For example, I'd hate to have a twin on the terrorist watch list, or whatever list it is they'll be using to search for targets.
Then, of course, there's the whole 'no people involved in killing another person' issue:
The killing of terrorism suspects and insurgents by armed drones, controlled by pilots sitting in bases thousands of miles away in the western United States, has prompted criticism that the technology makes war too antiseptic. Questions also have been raised about the legality of drone strikes when employed in places such as Pakistan, Yemen and Somalia, which are not at war with the United States. This debate will only intensify as technological advances enable what experts call lethal autonomy.
The prospect of machines able to perceive, reason and act in unscripted environments presents a challenge to the current understanding of international humanitarian law. The Geneva Conventions require belligerents to use discrimination and proportionality, standards that would demand that machines distinguish among enemy combatants, surrendering troops and civilians.
The article goes on to describe an incident in South Africa where a "semi-autonomous" cannon malfunctioned and killed nine "friendly" soldiers. You can see how a glitch in the programming could be an issue here. All it takes is one dick replacing the list of targets with, I dunno, a garrison, to make this a really bad idea. But surely that could never happen.
Skynet 2012!