7 Times Smart People Said AI Would Spin Out of Control


A recent survey of AI researchers around the world showed that more than a third of them were concerned that AI could eventually lead to a "global catastrophe" on the scale of nuclear war. The AI Index report released by the Stanford Institute for Human-Centered Artificial Intelligence shows that researchers are deeply concerned about what could happen with this technology if it is not governed by proper regulations. "These systems demonstrate capabilities in answering questions and generating text, images and code that were unimaginable a decade ago, and they outperform the state of the art on many benchmarks, old and new," the report says. "However, they are prone to hallucinations, are regularly biased, and can be lured into serving nefarious ends, highlighting the complicated ethical challenges associated with their deployment."