The discourse around robots, drones, and other autonomous machines may be setting us up to believe that in the future humans will not be responsible for the behavior of such machines. People's attention focuses on the increasing independence of these technologies from human control and intervention. What happens when an autonomous military weapon decides to attack a target and military personnel don't understand why the machine did what it did, let alone have an opportunity to intervene? Will no one—no human—be responsible? Or will there be some schema in which the machines themselves are responsible? Experience, writes Deborah Johnson, suggests that existing systems of liability, insurance, and public criticism will ensure that humans take responsibility for their robotic creations.
Jonathan Vanderhagen believes a judge doomed his son to an early death. The judge says Vanderhagen's Facebook posts were intimidating.
Navy Confirms Authenticity of UFO Videos Published by Blink-182 Frontman's Extraterrestrial Research Organization
The videos show a U.S. military jet's encounter with what appears to be a fast-moving, unidentified object.
Conservatives deploy state power to go after speech they don't like.
"Controlled choice" is supposed to fix inequality in New York public schools. It might make everything worse.
Many arms of government are unpopular with large swathes of the American population.