Principled sensing will often involve getting permission from those being sensed. We can get some ideas about how to think about this process from the paper Affective Sensors, Privacy, and Ethical Contracts by two MIT Media Lab researchers, Carson Reynolds (now at U. Tokyo) and Prof. Rosalind Picard. While not a new paper, it seems a good starting point for newcomers interested in appropriate sensing. From the abstract:
Sensing affect raises critical privacy concerns, which are examined here using ethical theory, and with a study that illuminates the connection between ethical theory and privacy. We take the perspective that affect sensing systems encode a designer’s ethical and moral decisions: which emotions will be recognized, who can access recognition results, and what use is made of recognized emotions. Previous work on privacy has argued that users want feedback and control over such ethical choices. In response, we develop ethical contracts from the theory of contractualism, which grounds moral decisions on mutual agreement. Current findings indicate that users report significantly more respect for privacy in systems with an ethical contract when compared to a control.
A later passage: “Our theory asserts that ethical decisions are encoded by interaction technology.” Sounds right to me. See the Affective Computing Group for more recent papers. —Chris Peterson