A new bill proposed by US lawmakers would prevent artificial intelligence (AI) from being able to singlehandedly launch nuclear weapons without human input, codifying existing Pentagon rules.

While current rules prohibit the autonomous launch of nuclear weapons, there aren't any actual laws that prevent this from happening. With the astronomical rise of AI models in recent years, officials have become concerned that they could slip their way into the very top-level decision-making of the US military.

In anticipation of this possibility, Senator Edward Markey (D-Mass.) and Representatives Ted W. Lieu (CA-36), Don Beyer (VA-08), and Ken Buck (CO-04) have introduced a bipartisan bill called the Block Nuclear Launch by Autonomous Artificial Intelligence Act that will "safeguard the nuclear command and control process from any future change in policy that allows artificial intelligence (AI) to make nuclear launch decisions".

It will ensure humans are "in the loop" following an order by the President to launch a nuclear weapon, for either defense or offense.

"AI technology is developing at an extremely rapid pace," said Representative Ted Lieu in a statement.

"While we all seek to grapple with the pace at which AI is accelerating, the future of AI and its role in society remains unclear. It is our job as Members of Congress to have responsible foresight when it comes to protecting future generations from potentially devastating consequences. That's why I'm pleased to introduce the bipartisan, bicameral Block Nuclear Launch by Autonomous AI Act, which will ensure that no matter what happens in the future, a human being has control over the employment of a nuclear weapon – not a robot. AI can never be a substitute for human judgment when it comes to launching nuclear weapons."

The bill follows through on a recommendation from a 2021 National Security Commission on Artificial Intelligence report that suggested such a law, in the hope that the US would spearhead the idea for other nuclear powers to follow.

AI models have no concept of empathy and would not truly understand the impact of a nuclear weapon, so allowing them uncontrolled access to the launch system could lead to a catastrophe that could otherwise be staved off. For example, Soviet submariner Vasili Arkhipov single-handedly prevented nuclear war when his captain mistakenly believed that war had broken out between the US and the Soviet Union – had AI been at the helm, the world could look very different today.