2,400 Leading Artificial Intelligence (AI) Scientists Sign Pledge Against Killer Robots


More than 2,400 scientists who specialize in artificial intelligence (AI) have declared that they will not participate in the development or manufacture of robots that can identify and attack people without human oversight. The Future of Life Institute is behind the pledge, which calls for norms, laws and regulations that stigmatize, shame and effectively outlaw the development of killer robots.

Demis Hassabis at Google DeepMind and Elon Musk at the US rocket company SpaceX are among more than 2,400 signatories to the pledge, which intends to deter military firms and nations from building lethal autonomous weapon systems, also known as LAWS.

The move is the latest from concerned scientists and organisations to highlight the dangers of handing over life and death decisions to AI-enhanced machines. It follows calls for a preemptive ban on technology that campaigners believe could usher in a new generation of weapons of mass destruction.

Orchestrated by the Boston-based organisation, The Future of Life Institute, the pledge calls on governments to agree norms, laws and regulations that stigmatise and effectively outlaw the development of killer robots. In the absence of such measures today, the signatories pledge to “neither participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons.” More than 150 AI-related firms and organisations added their names to the pledge to be announced today at the International Joint Conference on AI in Stockholm.

Yoshua Bengio, an AI pioneer at the Montreal Institute for Learning Algorithms, told the Guardian that if the pledge was able to shame those companies and military organisations building autonomous weapons, public opinion would swing against them. “This approach actually worked for land mines, thanks to international treaties and public shaming, even though major countries like the US did not sign the treaty banning landmines. American companies have stopped building landmines,” he said. Bengio signed the pledge to voice his “strong concern regarding lethal autonomous weapons.”

Read full article here…

1 Comment
Gil
2 years ago

Laughing my backside off. What do you call a drone but a flying robot that has human oversight when it is used to kill whoever the commanders feel like, plus political assassinations? It's all PR; it means nothing.