Bryan James Casey is publishing Amoral Machines, or: How Roboticists Can Learn to Stop Worrying and Love the Law in the Northwestern University Law Review. Here is the abstract.
The media and academic dialogue surrounding high-stakes decision-making by robotics applications has been dominated by a focus on morality. But the tendency to do so while overlooking the role that legal incentives play in shaping the behavior of profit-maximizing firms risks “marginalizing the entire field” of robotics and rendering many of the deepest challenges facing today’s engineers utterly intractable. This Essay attempts both to halt this trend and to offer a course correction. Invoking Oliver Wendell Holmes’s canonical analogy of a “bad man . . . who cares nothing for . . . ethical rules,” it demonstrates why philosophical abstractions like the trolley problem—in their classic framing—provide a poor means of understanding the real-world constraints faced by robotics engineers. Using insights gleaned from the economic analysis of law, it argues that profit-maximizing firms designing autonomous decision-making systems will be less concerned with esoteric questions of right and wrong than with concrete questions of predictive legal liability. And until such time as the conversation surrounding so-called “moral machines” is revised to reflect this fundamental distinction between morality and law, the thinking on this topic by philosophers, engineers, and policymakers alike will remain hopelessly mired. Step aside, roboticists—lawyers have got this one.
Download the article from SSRN at the link.