Evaluated December 2021
This module addresses the question: Is it morally acceptable to deploy lethal autonomous weapons systems (LAWS) in combat? The module is best suited for a computer science course focused on designing autonomous agents, though it might also work in an Artificial Intelligence or Robotics course that includes a unit on autonomous agents. While this material would not typically be covered in such a course, it plays an important supportive role. Instructors may choose to use this module simply to introduce the notion that technologists ought to consider their social and ethical responsibility with respect to the technology they develop. Instructors need to be willing to lead discussion and to manage any conflict that may develop among students.
It covers material in Intelligent Systems/Agents; Intelligent Systems/Robotics; Intelligent Systems/Perception and Computer Vision.
Before adopting this module, the instructor may wish to develop assessment materials. Depending on the students in the course, instructors may want to give some credit for watching the video or completing any assigned readings. The in-class portion lends itself to discussion but not to a graded assignment.
Outside of computer science, this module might be appropriate for philosophy courses covering applied ethics, technology, or war and justice. It might also be appropriate for a psychology course that includes a unit on PTSD. Collaborating with faculty teaching those courses may help computer science faculty better understand ethical arguments surrounding lethal autonomous weapons systems; in turn, computer science faculty could help those colleagues better understand the nature of autonomous systems.
To deliver this module effectively, faculty need some understanding of International Humanitarian Law (IHL) and various ethical theories, especially consequentialism and deontology. Understanding the legal and moral basis for war will help the faculty member better guide student learning and discussion. The more familiar students are with ethical theories, especially arguments surrounding the moral justification for war and the concepts of necessity, proportionality, and distinction, the richer their experience with this module will be. The module does not provide this background material; the instructor will need to develop it ahead of time and make it available to students.
The module itself will take one class period. It incorporates small-group discussion and full-class debriefings, which can be conducted face-to-face or in synchronous online settings. The exception is the 70-minute video linked in the module, which consists of a 37-minute lecture followed by discussion. The lecture begins with a survey of autonomous weapons over roughly the last 75 years, and at about the 30-minute mark the speaker discusses the difference between legal and ethical analyses of autonomous weapons. However, he does not explicitly identify the relevant principles of IHL, nor does he conduct an ethical analysis of autonomous weapons using consequentialism or deontology. The Q&A that follows the lecture might model for students the types of questions to raise and help instructors anticipate questions students may ask.
It appears that the video is supposed to contain a debate, but it does not. Such a debate could be reconstructed from these two articles:
Why You Shouldn't Fear Slaughterbots
Why You Should Fear Slaughterbots: A Response
These essays are by the authors whom the video description identifies as appearing in the video, and they are straightforward. They were published in the wake of the release of the video “Slaughterbots.”
The evaluation of this module was led by Marty J. Wolf and Patrick Anderson as part of the Mozilla Foundation Responsible Computer Science Challenge. Emanuelle Burton, Judy Goldsmith, Colleen Greer, Darakhshan Mir, Jaye Nias and Evan Peck also made contributions. This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.