Evaluated December 2021
The module is designed to be incorporated into a graduate-level course on privacy and big data. However, because the module is non-technical and requires no prior philosophical knowledge or competency, it could also be used in undergraduate courses. It incorporates three case studies in a small-group discussion format. It could be used to take up the issue of privacy in an Artificial Intelligence course, a Database course, or an Information Security course.
This module covers material in Social Issues and Professional Practice.
Instructors adopting this module need to be aware that the publicly available materials provide only a summary of goals and themes, a brief account of the three case studies at its core, and a basic outline of in-class activities. Instructors will need to identify background readings for themselves on the philosophical and ethical concepts; this is an opportunity to collaborate with colleagues in philosophy or sociology. Students will have a richer learning experience if the instructor assigns key readings for them. Instructors will also need to develop full case information in order to use this material. In addition to the case studies suggested here, a fourth example rooted in one of the classic case studies of algorithmic unfairness (e.g., different job ads being shown to different users, digital redlining) would help to underscore that big data rarely impacts all users equally.
As a standalone module, it could occupy one or two class periods, and since it is non-technical, it would need to replace existing material rather than combine with it. Considered as a template, the module could potentially serve as the foundation for an interdisciplinary partnership. The module's design draws on students' own moral intuitions to explore some of the basic problems at the core of how privacy is conceptualized and operationalized in technology. By using Facebook and COVID contact tracing as examples, the module touches on dimensions of experience that students will recognize and may see as being of second-hand importance or first-hand relevance.
Instructors using this module will find that it includes no graded assignment and therefore requires no class-level assessments. Faculty and programs incorporating this module will need to determine whether it merits inclusion in a program-level assessment.
The evaluation of this module was led by Emanuelle Burton and Jaye Nias as part of the Mozilla Foundation Responsible Computer Science Challenge. Patrick Anderson, Judy Goldsmith, Colleen Greer, Darakhshan Mir, Evan Peck, and Marty J. Wolf also made contributions. This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.