The discourse around robots, drones, and other autonomous machines may be setting us up to believe that in the future humans will not be responsible for the behavior of such machines. Public attention focuses on the increasing independence of these technologies from human control and intervention. What happens when an autonomous military weapon decides to attack a target and military personnel don't understand why the machine did what it did, let alone have an opportunity to intervene? Will no one—no human—be responsible? Or will there be some schema in which the machines themselves are held responsible? Experience, writes Deborah Johnson, suggests that existing systems of liability, insurance, and public criticism will ensure that humans take responsibility for their robotic creations.