MOST FRIGHTENING HORROR MOVIE EVER IS NOT SCIENCE FICTION: AI expert warns Elon Musk-signed letter doesn’t go far enough, says ‘literally everyone on Earth will die’


An artificial intelligence expert with more than two decades of experience studying AI safety said an open letter calling for a six-month moratorium on developing powerful AI systems does not go far enough.

Eliezer Yudkowsky, a decision theorist at the Machine Intelligence Research Institute, wrote in a recent op-ed that the six-month "pause" on developing "AI systems more powerful than GPT-4" called for by Tesla CEO Elon Musk and hundreds of other innovators and experts understates the "seriousness of the situation." He would go further, implementing a moratorium on new large AI training runs that is "indefinite and worldwide."

The letter, issued by the Future of Life Institute and signed by more than 1,000 people, including Musk and Apple co-founder Steve Wozniak, argued that safety protocols need to be developed by independent overseers to guide the future of AI systems.

Read Full Article Here…