"Slaughterbots" is a term used to describe drones that use artificial intelligence and facial recognition to target people with lethal force.
The term was brought into the popular lexicon by an arms-control advocacy video created in 2017 by the Future of Life Institute (FLI) and Stuart Russell, who is a professor of computer science at UC Berkeley.
While defense analyst Paul Scharre criticized the video as fearmongering "science fiction," the team behind the campaign continues to argue that "autonomous weapons are potentially scalable weapons of mass destruction (WMDs); essentially unlimited numbers can be launched by a small number of people."
Max Tegmark, a professor at MIT and president of FLI, also warned that these weapons could be used by cartels and political dissidents to carry out targeted assassinations.
In 2021, a resolution to ban lethal autonomous weapons failed to pass at the United Nations.
| Indicator | Value |
| --- | --- |
| Stars | ★★★☆☆ |
| Platform | Metaculus |
| Number of forecasts | 155 |
"Slaughterbots" is a term used to describe drones that use artificial intelligence and facial recognition to target people with lethal force.
The term was brought into the popular lexicon by an arms-control advocacy video created in 2017 by the...
[Embedded Metaculus forecast](https://metaforecast.org/questions/embed/metaculus-11122)