It is a fact that Artificial Intelligence scares the crap out of some people. The knowledge and resources are now in place to build ever more powerful AI systems. It doesn't stretch the imagination that such systems could reach human-level intelligence and, in doing so, gain the capability to construct AIs far more intelligent than themselves.
Present AIs are fairly cute and simplistic. However, as the past has demonstrated, the more capable these systems become, the more responsibility will be shifted onto them. The question on a lot of people's minds is: how long before we can no longer control these computer intelligences?
As the author postulates in the Irish Times Technology section:
- Imagine how a medical robot, originally programmed to treat cancer, could conclude that the best way to obliterate cancer is to exterminate humans who are genetically prone to the disease.
From the Irish Times article:
- Elon Musk recently said artificial intelligence is “potentially more dangerous than nukes”. Stephen Hawking, one of the smartest people on earth, wrote that successful AI “would be the biggest event in human history. Unfortunately, it might also be the last.”