
Can Computer Algorithms Learn to Fight Wars Ethically?

Washington Post Magazine: “Maybe the autonomous weapons being developed by the Pentagon will be better than humans at making moral decisions. Or maybe they’ll be a nightmare come to life…The scale of the exercises at West Point, in which roughly 100 students have participated so far, is small, but the dilemmas they present are emblematic of how the U.S. military is trying to come to grips with the likely loss of at least some control over the battlefield to smart machines. The future may well be shaped by computer algorithms dictating how weapons move and target enemies. And the cadets’ uncertainty about how much authority to give the robots and how to interact with them in conflict mirrors the broader military’s ambivalence about whether and where to draw a line on letting war machines kill on their own. Such autonomous machines were once so far beyond the technical grasp of scientists that debating their ethics was merely an intellectual exercise. But as the technology has caught up to the idea, that debate has become very real…Already, the U.S. Navy is experimenting with ships that can travel thousands of miles on their own to hunt for enemy submarines or ships that could fire guns from just offshore as the Marines storm beaches. The Army is experimenting with systems that will locate targets and aim tank guns automatically. And the Air Force is developing deadly drones that could accompany planes into battle or forge ahead alone, operating independently from “pilots” sitting thousands of miles away in front of computer screens. But while the march toward artificial intelligence in war continues, it doesn’t progress uncontested. Mary Wareham is one of the leading activists pushing governments to consider the moral ramifications of using AI in weapons. Originally from New Zealand, Wareham, whom I spoke to at her D.C. office in July 2019, has spent most of the past 20 years working for Human Rights Watch, trying to get governments to ban antipersonnel weapons such as cluster bombs and land mines. Now, as the advocacy director for the organization’s arms division, she is working to persuade world leaders to impose sweeping restrictions on autonomous weapons…”

See also At artificial-intelligence conferences, researchers are increasingly alarmed by what they see. Matthew Hutson, The New Yorker, February 15, 2021.
