Prominent transhumanist on Artificial General Intelligence: ‘We must stop everything. We are not ready.’

Warning came during panel titled ‘How to Make AGI Not Kill Everyone’

At last week’s SXSW conference, prominent transhumanist Eliezer Yudkowsky said that if the development of artificial general intelligence is not stopped immediately across the globe, humanity may be destroyed.

“We must stop everything,” Yudkowsky said during a panel titled “How to Make AGI (Artificial General Intelligence) Not Kill Everyone.”

“We are not ready,” he continued. “We do not have the technological capability to design a superintelligent AI that is polite, obedient and aligned with human intentions – and we are nowhere close to achieving that.”

Yudkowsky, founder of the Machine Intelligence Research Institute, has issued similar warnings for years, arguing that humanity must halt all work on AGI or face extinction.

Read Full Article Here… (allisrael.com)

