Tackling Privacy, One Carnegie Mellon Project at a Time
Last Thursday, the Future of Privacy Forum hosted a conversation among five of CMU’s leading privacy researchers. While the panelists discussed a number of their current privacy projects, I wanted to highlight some of the most interesting takeaways from the presentation.
Many of the researchers focused on how subtle nudges can be used to change people’s behaviors. While this is frequently done to encourage users to share more data, the CMU researchers expressed an interest in exploring how nudges can be “used for good.” Discussing efforts by hotels to get patrons to reuse their towels, Jason Hong explained how subtle changes in the wording of reminders, from “please recycle” to “75% of guests in this room,” could have a significant impact on patrons’ behavior.
Lujo Bauer explained how these sorts of nudges could be applied to password composition meters. Increasingly, online services detail password requirements to users and either display colored strength bars or outright classify a user’s proposed password as “weak” or “strong.” According to Bauer, people typically do not try very hard to get to the point where a meter tells them their password is excellent, but “they will avoid it if a meter tells them their password sucks.” His takeaway: when it comes to security measures, avoid giving users too much positive feedback.
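To make the mechanism concrete, here is a minimal sketch of such a composition meter; the scoring rules, thresholds, and labels below are hypothetical and not drawn from the CMU studies.

```python
# A minimal sketch of a password-composition meter of the kind Bauer
# describes. The scoring rules, thresholds, and labels are hypothetical;
# they are not the CMU researchers' actual meter.
import re

def rate_password(password: str) -> str:
    """Return a coarse rating ('weak', 'fair', or 'strong') for a password."""
    score = 0
    if len(password) >= 12:
        score += 1
    if re.search(r"[a-z]", password) and re.search(r"[A-Z]", password):
        score += 1
    if re.search(r"\d", password):
        score += 1
    if re.search(r"[^A-Za-z0-9]", password):
        score += 1
    # Per Bauer's observation, the negative label does most of the nudging:
    # users will work to escape "weak" but rarely chase "strong."
    if score <= 1:
        return "weak"
    return "fair" if score <= 3 else "strong"

print(rate_password("password"))       # weak
print(rate_password("Tr0ub4dor&3x!"))  # strong
```

If Bauer is right, a meter like this does its work almost entirely at the “weak” end of the scale: the label users want to escape matters more than the one they might aspire to.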
Bauer lamented that the online ecosystem is forcing users to engage in insecure behaviors. Of course, while nudges could be used to reinforce positive behaviors, this raises the question of what counts as “positive” behavior. When it comes to security issues like passwords, promoting better security may be a no-brainer, but things are much less binary when it comes to privacy. Privacy-protective nudges can shade into privacy paternalism, which may be no more ethical than the alternative.
Travis Breaux highlighted the continuing challenge of translating privacy policy into engineering objectives. He noted that many mobile app developers still do not understand the privacy implications of connecting their apps to outside services and social networks, which underscores the need to detail the entire data supply chain. Breaux explored the potential of rich data collection/use descriptions that could be more detailed and useful than generic privacy policies. Describing a case study involving applications on Facebook, he explained how these sorts of tools could help developers understand more accurately how they collect, use, and repurpose information.
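To make that idea concrete, here is a toy sketch of what a structured, machine-checkable data collection/use description might look like; the schema and the repurposing check are hypothetical illustrations, not a format from Breaux’s research.

```python
# A hypothetical, deliberately simplified data collection/use description
# for an imaginary app that connects through Facebook. The schema is
# illustrative only; it is not a specification from Breaux's work.
app_data_practices = {
    "app": "ExampleQuizApp",
    "collects": [
        {"data": "email_address", "source": "user", "purpose": "account_creation"},
        {"data": "friend_list", "source": "facebook_api", "purpose": "social_features"},
    ],
    "shares": [
        {"data": "friend_list", "recipient": "analytics_vendor", "purpose": "usage_metrics"},
    ],
}

# With structure like this, a simple check can flag repurposing: data
# collected for one purpose but shared downstream for another.
for shared in app_data_practices["shares"]:
    collected_for = {c["purpose"] for c in app_data_practices["collects"]
                     if c["data"] == shared["data"]}
    if shared["purpose"] not in collected_for:
        print(f"Repurposing flagged: {shared['data']} was collected for "
              f"{collected_for} but goes to {shared['recipient']} "
              f"for {shared['purpose']}.")
```

The point of the exercise: once the data supply chain is written down in a structured form, mismatches between why data was collected and how it is later used become mechanically detectable rather than buried in policy prose.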
Lorrie Cranor discussed the difficulty of communicating data use in the Internet of Things, whether through visual, auditory, or haptic channels, or by making information “machine readable (if you remember P3P and DNT).” She also highlighted one study that looked at the timing dimension of providing users with notice. A student developed a simple history quiz app that displayed a privacy notice in different places: (1) in the app store, (2) as soon as the app was opened, (3) in the middle of the history quiz, (4) at the quiz’s end, or (5) never at all. “We invited people to take our quiz, but didn’t tell them it was about privacy,” she explained.
When users were then asked about the contents of that privacy notice, the study found that people who “saw” the policy in the app store could not recall it any better than people who did not see it at all. According to Cranor, at the time a user is downloading an app, they are not paying attention to other information in the app store. This “doesn’t suggest you don’t put that info in the app store . . . but suggests that sort of timing may not be sufficient. Also suggests it’s really important to test these things.”
Norman Sadeh further criticized the state of our overly complicated privacy policies. “It’s not the case that every single sentence in a privacy policy matters,” he stated, discussing his effort to extract the key points of interest to users from privacy policies.
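As a toy illustration of that kind of extraction, the sketch below filters a policy down to sentences that mention practices users tend to care about; the keyword approach is purely illustrative and is not Sadeh’s actual method.

```python
# A toy illustration of pulling the key points out of a privacy policy.
# The keyword filter below is purely illustrative; it is not the method
# Sadeh's group actually uses.
import re

KEY_TOPICS = ("share", "sell", "third part", "location", "delete", "retain")

def key_sentences(policy_text: str) -> list[str]:
    """Return only the sentences that mention a key data practice."""
    sentences = re.split(r"(?<=[.!?])\s+", policy_text)
    return [s for s in sentences
            if any(topic in s.lower() for topic in KEY_TOPICS)]

policy = ("We collect your email to create an account. "
          "We may share your location with third parties. "
          "You can contact support at any time.")
print(key_sentences(policy))
# ['We may share your location with third parties.']
```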
Last but not least, the group described its Bank Privacy Project. The researchers noted that larger banks tend to collect more information and use it for more purposes, while smaller banks do the exact opposite. “If you don’t want your bank sharing,” Cranor explained, “you need to find a bank you’ve never heard of.” Because this is nigh-impossible for an average consumer to do, enter the Bank Privacy Project.
-Joseph Jerome, Policy Counsel