Monday, December 14, 2015

Forget Terminator – Elon Musk's new company will ensure that robots are our friends – IDG.se

ai
Elon Musk wants to make robots our friends.

From cars and mobile phones to Facebook’s news feed, more and more of our society is controlled by artificial intelligence – so how do we avoid a Terminator scenario where robots annihilate humanity? Elon Musk has a clear answer.

Together with a group of Silicon Valley profiles, he is launching a nonprofit company called OpenAI with a budget equivalent to 8.5 billion. The goal is to “advance digital intelligence in the way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return”.

In plain language: the company is to ensure that robots work for us, not against us.

“Today’s AI systems have impressive but limited abilities. It seems that we will keep whittling away at their constraints, and in the extreme case they could reach human levels in virtually every intellectual field. It is hard to imagine how much a human-level AI could benefit society, and it is equally difficult to imagine how much it could harm society if it were built improperly,” writes OpenAI.

The list of researchers includes several famous names from American universities, and behind the scenes Elon Musk has distinguished company: Y Combinator chairman Sam Altman, computer scientist Alan Kay and venture capitalist Peter Thiel, who founded Paypal together with Musk, are a few of them. They are also backed by Amazon Web Services and Infosys.

This is not Elon Musk's first call for caution. As recently as last summer, he co-signed, together with Stephen Hawking and Steve Wozniak among others, an open letter warning of a robot apocalypse if we do not properly restrict autonomous weapons systems.

“If any major military power pushes ahead with AI weapons development, a global arms race will be inevitable, and the endpoint of this technology is obvious: AI weapons will become the Kalashnikovs of tomorrow,” wrote the group.

