Ok, this is really getting out of control! What am I talking about? Well, that op-ed piece that it seems EVERYONE has picked up on.
What's got everyone jumping on the bandwagon? Well, it's the article that calls for a code of behavior, much like Asimov's 3 laws, that would keep a robot from running amok and committing who knows what unspeakable horrors against the human race - as in the movie "I, Robot" (I know - it was trash, but people bought into it).
Are we all on the same page here? Good! Because someone out there needs a serious reality check!
Smart bombs, autonomous drones and artificial intelligence are altogether none of the sort. Today's robotic systems and devices are no more complicated than throwing a light switch. Offs and ons, people - period. Today's devices are the sum of their parts and programming. If said light switch, when thrown, gives you a nasty jolt, you do not inquire as to its morals or lack thereof; no, you buy another switch or repair the old one. Today's robots have one thing better than the switch: a way to save and repeat thousands of on and off requests. It's called a program. If these machines fail to carry out their requisite ons and offs, the first thing you do is check the hardware and then debug the software. You do not give them a good talking-to.
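To make the point concrete, here's a toy sketch in Python - the names (Switch, run_program) are made up for illustration and don't come from any real robot's firmware - of what today's "robot brain" boils down to: a saved list of on/off requests that gets replayed, and that you debug when it misbehaves.

from dataclasses import dataclass

@dataclass
class Switch:
    """A single output line: it is either on or off. No opinions, no ethics."""
    name: str
    state: bool = False

    def set(self, state: bool) -> None:
        self.state = state
        print(f"{self.name}: {'ON' if state else 'OFF'}")

def run_program(switch: Switch, program: list[bool]) -> None:
    """Replay a saved list of on/off requests -- that's the whole 'program'."""
    for requested_state in program:
        switch.set(requested_state)

if __name__ == "__main__":
    actuator = Switch("gripper")
    # The "program": a real device stores thousands of these, but they are still just bits.
    run_program(actuator, [True, False, True, True, False])
    # If the gripper does the wrong thing, you check the wiring and fix this
    # list -- you don't lecture the switch about its morals.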
We are many, many years away from the point when the switch, in a moment of moral quandary, wonders at its lot in life or the ramifications of electrocuting some ne'er-do-well hell-bent on finding out what's inside.
The time when a robot can truly question its reason for existence, or why it's not a good idea to rip off someone's arm over a perceived slight, is still in the very far future, if it ever truly happens at all. Humans still cannot tell the difference between hardware and software in themselves, let alone create it in another. I think it's very telling that when one of these drones drops a 500-pounder in the middle of a crowded square, people call for robotic morals for the devices and never call for an accounting from the ones truly responsible and most likely in need of a moral compass - the manufacturers and programmers.
For a more in-depth article on robotic behaviors, see Times Online's article.