Elon Musk tweeted some warnings about artificial intelligence on Friday night.
“If you’re not concerned about AI safety, you should be. Vastly more risk than North Korea,” Musk tweeted after his $1 billion startup, OpenAI, made a surprise appearance at a video game tournament with a $24 million prize pool Friday night, beating the world’s best players at “Dota 2.”
Musk claimed OpenAI’s bot was the first to beat the world’s best players in competitive eSports, but quickly warned that increasingly powerful artificial intelligence like OpenAI’s bot — which learned by playing a “thousand lifetimes” of matches against itself — would eventually need to be reined in for our own safety.
“Nobody likes being regulated, but everything (cars, planes, food, drugs, etc.) that’s a danger to the public is regulated. AI should be too,” Musk said in another tweet on Friday night.
Musk has previously expressed a healthy mistrust of artificial intelligence. The Tesla and SpaceX CEO warned in 2016 that, if artificial intelligence were left unregulated, humans could devolve into the equivalent of “house cats” next to increasingly powerful supercomputers. He made that comparison while hypothesizing about the need for a digital layer of intelligence he called a “neural lace” for the human brain.
“I think one of the solutions that seems maybe the best is to add an AI layer,” Musk said during Vox Media’s 2016 Code Conference in Southern California, describing “a third, digital layer that could work well and symbiotically” with the rest of your body.
Nanotechnologists have already been working on this concept.
Musk said at the time: “If we can create a high-bandwidth neural interface with your digital self, then you’re no longer a house cat.”
Jillian D’Onfro contributed to this report.