The discourse around robots, drones, and other autonomous machines may be setting us up to believe that in the future humans will not be responsible for the behavior of such machines. Attention tends to focus on these technologies' increasing independence from human control and intervention. What happens when an autonomous military weapon decides to attack a target and military personnel don't understand why the machine did what it did, let alone have an opportunity to intervene? Will no one—no human—be responsible? Or will there be some scheme in which the machines themselves are held responsible? Experience, writes Deborah Johnson, suggests that existing systems of liability, insurance, and public criticism will ensure that humans take responsibility for their robotic creations.
A former staffer says he sexually assaulted her in 1993.
The Scandinavian country is betting against draconian restrictions and in favor of the free movement of people and goods.
Plus: civic dynamism on display, Justice Department embraces home detainment of federal prisoners, and more...
Plus: COVID-19 in prisons and jails, Trump campaign threatens TV stations, state disparities in new coronavirus cases, and more...
No, British Epidemiologist Neil Ferguson Has Not 'Drastically Downgraded' His Worst-Case Projection of COVID-19 Deaths
But he has raised his estimate of the virus's reproduction number, which implies a lower fatality rate than his research group initially assumed.