Designing a Coordination Mechanism for Managing Privacy as a Common-Pool Resource
Ubiquitous computing technologies such as “smart” door locks, thermostats, fitness trackers and video monitors can help make users’ lives safer and more efficient. These devices automatically collect data about users and their activities within their homes, which are then combined and processed by algorithms on a cloud server owned by the service provider. This enables beneficial system functionality that would not be possible from the devices in isolation. However, aggregating data from different points in time and about many different devices and users can also produce potentially invasive insights and inferences about individuals and households that can be surprising, unsettling or harmful when used for purposes users do not expect. This creates a social dilemma for users: the “derived data” produced by aggregation can have both positive and negative effects.
People adhere to rules and norms for offline privacy-related behaviors. However, because the collection and processing capabilities in ubiquitous computing systems are invisible and embedded in everyday objects, users cannot currently develop a shared understanding of which uses of derived data are acceptable (for example, those that make the system perform better) and which are unacceptable (such as sensitive inferences unrelated to system operation). This project will investigate norms for acceptable uses of derived data, and will develop and evaluate tools to support collective privacy management decisions. The social norm studies include semi-structured interviews to identify norms, and validation experiments involving simulated norm violations and responses to them. Building on frameworks for analyzing social-ecological common-pool resource systems, the project will iteratively design and prototype a privacy coordination mechanism based on home automation systems. The mechanism will be installed and evaluated in a real-world deployment, assessing its effectiveness and usability and including qualitative analysis of unexpected events.
This project addresses a problem with broad social importance: as “smart” devices with embedded sensors become more common, privacy in ubiquitous computing systems will increasingly be an issue that nearly everyone encounters. This represents a shift in conceptualizing information privacy problems, from the self-management model, in which users are individually responsible for making up-front decisions about what information to protect and what to disclose, to a collective governance model that allows users to make decisions for themselves about how derived data are managed. Demonstrating that derived data can be collectively managed like a common-pool resource will point to new kinds of solutions for digital privacy issues, and the outcomes of this project have the potential to shape discussions about policy and regulation. The project also includes substantial training at both the undergraduate and graduate levels in interdisciplinary, multi-method, team-based research.
PI: Emilee Rader
Publications
- Nthala, N. and Rader, E. “Towards a Conceptual Model for Provoking Privacy Speculation.” Poster presented at the 2020 Symposium on Usable Privacy and Security (SOUPS). 2020.
- Emilee Rader, Samantha Hautea and Anjali Munasinghe. “I Have a Narrow Thought Process: Constraints on Explanations Connecting Inferences and Self-Perceptions.” SOUPS 2020. [IAPP SOUPS Privacy Award]
- Jina Huh-Yoo and Emilee Rader. “It’s the Wild, Wild West: Lessons Learned From IRB Members’ Risk Perceptions Toward Digital Research Data.” Proc. ACM Hum.-Comput. Interact., Vol. 4, No. CSCW1, Article 59 (May 2020).
- Hautea, S., Munasinghe, A. and Rader, E. “That’s Not Me: Surprising Algorithmic Inferences.” Poster in Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems. 2020.
- Nthala, N. and Rader, E. “Towards a Conceptual Model for Provoking Privacy Speculation.” Poster in Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems. 2020.
- Jayati Dev, Emilee Rader and Sameer Patil. “Why Johnny Can’t Unsubscribe: Barriers to Stopping Unwanted Email.” Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI). 2020.
- Emilee Rader and Janine Slaker. “The Importance of Visibility for Folk Theories of Sensor Data.” SOUPS 2017. Santa Clara, CA. July 2017.
- Yumi Jung and Emilee Rader. “The Imagined Audience and Privacy Concern on Facebook: Differences Between Producers and Consumers.” Social Media + Society. 2016.
- Emilee Rader. “Effects of Data Aggregation Awareness on Information Privacy Concern.” Proceedings of the Symposium on Usable Privacy and Security (SOUPS). Menlo Park, CA. July 2014.
News
- Jina Huh-Yoo (Drexel University) and Emilee Rader published a paper titled “It’s the Wild, Wild West: Lessons Learned From IRB Members’ Risk Perceptions Toward Digital Research Data” in the Proceedings of the ACM on Human-Computer Interaction in May 2020.
- Samantha Hautea will be presenting a poster coauthored with Anjali Munasinghe and Emilee Rader, titled “That’s Not Me: Surprising Algorithmic Inferences”, at CHI 2020 in Honolulu, HI!
- Norbert Nthala will be presenting a poster coauthored with Emilee Rader, titled “Towards a Conceptual Model for Provoking Privacy Speculation”, at CHI 2020 in Honolulu, HI!
- “Why Johnny Can’t Unsubscribe: Barriers to Stopping Unwanted Email”, a paper by Emilee Rader and coauthors Jayati Dev and Sameer Patil from Indiana University, will be published at CHI 2020 in Honolulu, HI!
- Emilee Rader wrote an article for The Conversation in February 2019: “Most Americans don’t realize what companies can predict from their data”.
- Check out this video of Emilee Rader for Data Privacy Day in January 2019: “Five things you might not know about digital privacy”.
- Emilee will be presenting a paper with Janine Slaker at SOUPS 2017 about their work on the Privacy project, titled “The Importance of Visibility for Folk Theories of Sensor Data”.
- Yumi Jung and Emilee published a new article in Social Media + Society.
- Emilee received a $463,000 grant from the National Science Foundation to study privacy.