
Elon Musk says AI harbors 'vastly more risk than North Korea'


Commentary: In a couple of Friday tweet-thoughts, the Tesla CEO repeats his call for artificial intelligence to be regulated.
He’s worried. Very worried.
The mention of certain place names currently invokes shudders.
Whether it's North Korea, Venezuela or even Charlottesville, Virginia, it's easy to feel that something existentially unpleasant might happen, with North Korea still topping many people's lists.
For Tesla and SpaceX CEO Elon Musk, however, there’s something far bigger that should be worrying us: artificial intelligence.
In a Friday afternoon tweet, he offered, “If you’re not concerned about AI safety, you should be. Vastly more risk than North Korea.”
He accompanied this with a poster of a worried woman and the words, “In the end, the machines will win.”
The machines always win, don’t they? Look how phones have turned us into neck-craning zombies. And, lo, here was Musk also tweeting on Friday about a bot created by OpenAI — the nonprofit he backs — beating real humans at eSports.
Still, Musk thinks humanity can do something to fight the robots.
Indeed, he followed his North Korea message with a renewed call for AI regulation: “Nobody likes being regulated, but everything (cars, planes, food, drugs, etc) that’s a danger to the public is regulated. AI should be too.”
“Biggest impediment to recognizing AI danger are those so convinced of their own intelligence they can’t imagine anyone doing what they can’t,” he tweeted.
You really can’t trust humans to do good, even supposedly intelligent humans.
Especially in these times when few appear to agree what good even looks like.
