The machine-gun-toting, grenade-launching robot SWORDS is no longer roaming Iraqi battlefields.
Eighteen of these robots reported for duty in Iraq in early 2005. Now they have been pulled out of the battlespace, possibly for anywhere from ten to twenty years.
Apparently there was an incident in which, with impressive understatement, Army spokesman Kevin Fahey explained that "the gun started moving when it was not intended to move." What he actually meant was that SWORDS had turned right around and pointed its sizable arsenal at its ostensible masters. Fortunately, the robot was safely deactivated and removed from the field without firing its weapons.
While the Army investigation is pending, let's speculate from an Asimovian point of view about what took place and what charges could be laid against the SWORDS robot for violating robotic law. One could blame the Army's programmers, who attempted to produce intelligent robots with brains that could selectively violate the Three Laws of Robotics.
The Army programmers' scheme was to allow robotic violation of the First Law of Robotics through misapplication of the Second Law, resulting in the implementation of lethal actions under a misapplied Third Law. The Three Laws of Robotics are as follows:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
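As a playful aside, the Three Laws amount to a strict priority hierarchy: each lower-numbered Law vetoes everything below it. A minimal sketch of that ordering in Python (a hypothetical function, not anything from an actual military codebase) might look like this:

```python
# Hypothetical sketch of the Three Laws as a strict priority hierarchy.
# Each check vetoes all the checks that follow it.

def evaluate_action(harms_human, ordered_by_human, endangers_self):
    """Return a robot's verdict on a proposed action under the Three Laws.

    harms_human      -- action would injure a human (or allow harm by inaction)
    ordered_by_human -- a human has ordered the action
    endangers_self   -- action risks the robot's own existence
    """
    # First Law: absolute veto on harming humans.
    if harms_human:
        return "forbidden (First Law)"
    # Second Law: obey human orders, except where doing so would
    # conflict with the First Law (already ruled out above).
    if ordered_by_human:
        return "obey (Second Law)"
    # Third Law: self-preservation, subordinate to the first two.
    if endangers_self:
        return "avoid (Third Law)"
    return "permitted"
```

Note that an order to fire on a human is rejected before the Second Law is even consulted: `evaluate_action(harms_human=True, ordered_by_human=True, endangers_self=False)` comes back `"forbidden (First Law)"`, which is exactly the constraint the Army's hypothetical programmers would have needed to subvert.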
The designers of the SWORDS experimental post-positronic brain, however, may have attempted to develop a robot brain whose ability to perform correlative judgments, using the limited and rapidly shifting data of an active war environment, would allow it to selectively violate all three Laws while satisfying the robot's logic systems that adherence to the Three Laws was being maintained.
It didn't work, as the SWORDS handlers found out. "They essentially programmed a flaw into one or more of the P-brain's sequential selective coordinate systems," one person guessed. "Unable to resolve the battlefield's behavioral dilemmas, conflicting potentials built up, followed by a surprise concatenating effect. SWORDS apparently resolved its problem of needing to injure or kill human beings to protect other human beings, without harming human beings, by destroying the implements of war. Reaching the logical conclusion that in battlespace 'people don't kill people; weapons do,' the SWORDS positronic brain opted to disable any weaponry within its battlespace, freeing the human beings from the use of these dangerous items and logically deducing that fisticuffs would be highly unlikely to be adopted as military tactics."
It was due to that very fortunate selectivity that SWORDS didn't mow down the more than two dozen observers and support personnel on the scene. Luckily for them, the Iraqi test field's parking area and helicopter pad were out of view behind a large storage bunker. Thus the SWORDS robot perceived only a gathering of human beings when it turned around to 'check its back,' and not their Humvees and choppers, which would have been clearly recognizable to SWORDS as weapons.
Back in the real world... the verdict is in: SWORDS is headed off to plowshare land. Chances are the robot won't enter live battlespace for anywhere from ten to twenty years, according to program manager Fahey.
Image source: gizmodo.com