Discussion with Helen Nissenbaum: How should we think about obfuscation?
Celebrating the 385th anniversary of the University of Amsterdam
Many of our contemporary privacy worries originate in the large-scale generation, collection, storage, analysis, and dissemination of data. When thinking about privacy protection, our natural reflex is to come up with solutions that curtail such data practices. A discussion with, among others, Helen Nissenbaum.
In her recent work, Professor Helen Nissenbaum directs our attention to an altogether different privacy protection strategy called ‘obfuscation’. Obfuscation is defined as “the deliberate addition of ambiguous, confusing, or misleading information to interfere with surveillance and data collection”. Instead of trying to curtail data flows, obfuscation disrupts such flows by adding even more data to them. A great example of obfuscation is the TrackMeNot browser plugin. The plugin automatically and continuously sends search queries to the search engine of your choice. As a result, your own search queries disappear in a sea of fake ones, thus enhancing your search privacy.
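The decoy idea can be sketched in a few lines of Python. This is a toy illustration of the principle, not TrackMeNot's actual code; the decoy pool and function names are invented for the example:

```python
import random

# Toy sketch of the obfuscation principle behind TrackMeNot (not its real
# implementation): hide one genuine query in a stream of decoy queries so
# an observer cannot tell which query the user actually issued.

DECOY_POOL = [
    "weather tomorrow", "cheap flights", "python tutorial",
    "news today", "chocolate cake recipe", "football scores",
]

def obfuscated_stream(real_query, n_decoys=5, rng=random):
    """Return the real query mixed among n_decoys fake ones, in random order."""
    decoys = rng.sample(DECOY_POOL, n_decoys)
    stream = decoys + [real_query]
    rng.shuffle(stream)
    return stream

queries = obfuscated_stream("sensitive medical question")
print(queries)
```

From the observer's side, each of the six queries is equally plausible, which is exactly the ambiguity that obfuscation aims to create.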
As useful as obfuscation can be for the protection of privacy, it has also been criticized. Some liken obfuscation to digital pollution and to free-riding. Moreover, obfuscation could also be used for less laudable aims, like hiding one’s identity when engaging in morally dubious behavior. Given its ability to do both good and bad, how should we think about obfuscation?
The panel consists of Joris van Hoboken, Sarah Eskens, and Marijn Sax. Beate Roessler will act as moderator.
About the speakers
Helen Nissenbaum is Professor of Information Science at Cornell Tech. Nissenbaum's work spans societal, ethical, and political dimensions of information technologies and digital media. Her books include Obfuscation: A User's Guide for Privacy and Protest, Values at Play in Digital Games, and Privacy in Context: Technology, Policy, and the Integrity of Social Life. She has moreover contributed to privacy-enhancing software, including TrackMeNot (for protecting against profiling based on Web search) and AdNauseam (protecting against profiling based on ad clicks).
Beate Roessler (moderator) is Professor of Ethics at the University of Amsterdam and chair of the capacity group Philosophy and Public Affairs. She has been working in the field of privacy for almost twenty years and has published a number of books and articles on different aspects of the ethics of privacy.
Joris van Hoboken is a Senior Researcher at the Institute for Information Law (IViR), University of Amsterdam. His research addresses law and policy in the field of digital media, electronic communications and the internet, with a focus on the fundamental rights to privacy and freedom of expression and transatlantic relations.
Sarah Eskens is a PhD Candidate at IViR as of March 2016. She studies how news personalization affects the fundamental information and privacy rights of news consumers.
Marijn Sax is a PhD candidate at IViR and in the Department of Philosophy. He studied Political Science and Philosophy; his research focuses on health apps, more specifically on their ethical dimensions and on how ethical considerations can inform legal regulation.
You can sign up for this program for free. If you sign up for the program, we count on your attendance. If you are unable to attend, please let us know via firstname.lastname@example.org | T: +31 (0)20 525 8142.
Spui 25-27 | 1012 WX Amsterdam