Controlling the Future of Privacy
Last week, I was fortunate enough to see several cool new applications of location technology and social data at two conferences that bookended my week. Privacy issues were addressed at the end of each conference, which I understand: a lecture about privacy is the last thing entrepreneurs and researchers want to hear. Unfortunately, privacy can be something of a bummer. Just today, Forrester Research made headlines with a report predicting a privacy “crackdown” in 2016.
Privacy has an image problem. It doesn’t help that while some specific privacy laws and regulations are quite clear, “privacy” as a concept remains amorphous and ill-defined. Privacy is not so much an ideology or a value in itself as a tool in pursuit of other values. For many years, privacy has been defined along the lines of offering individuals control over their information. The 2012 White House Consumer Privacy Bill of Rights places a principle of individual control front and center, before any other consumer right, declaring that “[c]onsumers have a right to exercise control over what personal data companies collect from them and how they use it.”
Yet, if privacy is about control, what does it mean when people feel completely out of control when it comes to their digital footprint? Last fall, a Pew survey found that 91% of Americans believe they “have lost control over how personal information is collected and used by companies.” Our current conception of privacy as control doesn’t really work anymore. Provocateurs at these conferences noted that modern society has made it impossible to survive, let alone thrive, without a smartphone. Going off the grid can, by itself, be suspicious. Having a LinkedIn profile can be a professional requisite.
At the Future of Privacy Forum, we have long called for efforts to “featurize” privacy. After spending a week seeing the many ways innovators can deploy data, I am convinced we need to embrace creative, out-of-the-box ways to get consumers thinking about how they can use and take advantage of their data online. Advances in web design and, more recently, app development have made everything from tracking personal finances to reading the text-heavy Harvard Law Review more enjoyable. There’s no reason design and functionality can’t also be used to make privacy more engaging.
Even small tweaks can go far. Facebook, for example, recently featured a blue privacy dinosaur to help its users with a “privacy check-up.” More than 86% of Facebook users who saw the tool completed the entire privacy check-up, and Facebook suggested that the dinosaur “helped make the experience a little more approachable and a little more engaging.” Presenting users with a privacy check-up is easier than asking them to wade through a myriad of privacy settings of their own volition. Putting these simple tools right in front of users’ eyeballs not only makes privacy more approachable but perhaps also more salient.
Getting privacy right will only become more important. A trust deficit already exists when it comes to innovative uses of information. According to a presentation by Susan Etlinger from Altimeter Research, 45% of American consumers report having little or no trust in how organizations use their data. Getting users engaged with their data is only half the privacy battle, however.
Beyond featurization or “appifying” privacy in ways that give users more control and ownership over their information, much more work needs to be done to provide transparency and to build accountability measures around data use. As Steve Hegenderfer at Bluetooth described it, companies need to offer both good product and good policy. This Friday, I’ll be discussing my thoughts on what this looks like, and what the future of privacy holds in general, at the Privacy & Access 20/20 Conference in Vancouver.