David Kotz over at Dartmouth has been doing some interesting work on helping individuals control data sensed about them:
As pervasive environments become more commonplace, the privacy of users is placed at increased risk. The numerous and diverse sensors in these environments can record users’ contextual information, leading to users unwittingly leaving “digital footprints.” Users must thus be allowed to control how their digital footprints are reported to third parties. While a significant amount of prior work has focused on location privacy, location is only one type of footprint, and we expect most users to be incapable of specifying fine-grained policies for a multitude of footprints. In this paper we present a policy language based on the metaphor of physical walls, and posit that users will find this abstraction to be an intuitive way to control access to their digital footprints. For example, users understand the privacy implications of meeting in a room enclosed by physical walls. By allowing users to deploy “virtual walls,” they can control the privacy of their digital footprints much in the same way they control their privacy in the physical world. We present a policy framework and model for virtual walls with three levels of transparency that correspond to intuitive levels of privacy, and the results of a user study that indicates that our model is easy to understand and use.
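To make the metaphor concrete, here is a toy sketch of how a three-level virtual wall might gate what a third party sees. The level semantics below (full detail, presence only, nothing) are my assumptions from the abstract, not the paper's exact definitions, and all names are hypothetical:

```python
from dataclasses import dataclass
from enum import Enum

class Transparency(Enum):
    TRANSPARENT = "transparent"   # assumed: full footprint visible
    TRANSLUCENT = "translucent"   # assumed: only coarse presence visible
    OPAQUE = "opaque"             # assumed: nothing visible

@dataclass
class Footprint:
    user: str
    detail: str     # e.g. "typing at desk 3"
    presence: str   # e.g. "someone is in the room"

def report(footprint: Footprint, wall: Transparency):
    """Return what a third party sees through a virtual wall."""
    if wall is Transparency.TRANSPARENT:
        return footprint.detail
    if wall is Transparency.TRANSLUCENT:
        return footprint.presence
    return None  # opaque: no footprint leaves the wall

fp = Footprint("alice", "typing at desk 3", "someone is in the room")
print(report(fp, Transparency.TRANSLUCENT))  # -> someone is in the room
```

The appeal of the abstraction is that a user picks one of three wall types per space, rather than writing fine-grained policies per sensor.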
Sounds great! One quibble about “Users must thus be allowed to control how their digital footprints are reported to third parties” — who is the second party that users are controlling, the sensor itself or the sensor operator, and how do users control what that party gets? Either way, that is something worth addressing up front.
I was interested and admittedly surprised to see that this research was funded by the Bureau of Justice Assistance at the U.S. Department of Justice. —Chris Peterson