Solove and Schwartz Reconcile US-EU Privacy Law

In their new essay, Reconciling Personal Information in the United States and European Union, Professors Dan Solove and Paul Schwartz explore the divergence between European and US privacy law. The pair point to trans-Atlantic differences in the definition of personally identifiable information (PII) as one of the biggest challenges to harmonizing the two legal systems’ privacy regimes.

“The differences in the definitions of PII in the two systems are caused by their disparate treatment of this situation: frequently, data is merely identifiable, but the people to whom the data pertains are not currently identified,” they write. In the United States, PII has a plethora of different and inconsistent definitions, while the EU approach is broad and vague. This fundamental difference imposes serious compliance costs on companies doing business in both America and Europe.

Solove and Schwartz build on their conception of PII 2.0 to address this challenge. PII 2.0 establishes two categories of PII — “identified” and “identifiable” data — and tailors legal protections to the level of risk to individuals. “PII 2.0 would enlarge the scope of some U.S. privacy laws, but it would not impede data flows within the United States or internationally. In the European Union, it would provide for more tailored and nuanced protections,” they conclude.

Chuck Schumer Joins Lineup of Local Marketing Experts to Discuss Location Privacy at Leading Industry Conference

Top innovators in hyperlocal business are confirmed to speak at Street Fight Summit 2013, with more industry-leading speakers joining the lineup weekly. Headliners already confirmed to speak at the Summit include Jules Polonetsky, Director of the Future of Privacy Forum.

Terrell McSweeny Touches on Privacy Issues During Confirmation Hearing

During her confirmation hearing today, Terrell McSweeny briefly touched upon privacy issues. With her children in the background, she noted that she is particularly cognizant of the need to protect children’s privacy. Noting that her small children were already both “iPad and app proficient,” she recognized that mobile technology, and the ways that children are using it, is evolving rapidly in both good and bad ways. In response to a question by Senator Ed Markey (D-Mass.), McSweeny stated that she was interested in working with the Senator on protections for teenagers who fall outside of current COPPA rules.
On the topic of data brokers, McSweeny suggested that the FTC had an important mission to educate the public. “I’m often struck by how little most of us know about how information is collected and used online,” she said, adding that she thought there were lessons that could be learned from the various multistakeholder processes that have been pursued in the past few years.

Framing Big Data and Privacy


The first session from the Big Data & Privacy: Making Ends Meet conference, held on September 10, 2013. The event was co-hosted by the Future of Privacy Forum and Stanford Law School’s Center for Internet and Society. Panelists were Martin Abrams, Deirdre Mulligan, Neil Richards, Omer Tene, and Erik Jones, with Jules Polonetsky moderating.

The full report is available here.

Sharing Thoughts on Big Data and Privacy

We wanted to draw your attention to several excellent pieces discussing and summarizing last Tuesday’s “Big Data and Privacy: Making Ends Meet” workshop:

The full report on the workshop is available here.

Apple Introduces New Privacy Features with iOS 7

With all the excitement around the launch of the new iPhone and iOS 7, we thought it would be interesting to highlight some new privacy-specific features. Some of these have been introduced as part of the iOS 7 software; others, like the Touch ID fingerprint scanner, are specific to the new iPhone. Here are some new privacy features worth noting:

Limitations on Tracking

No MAC address or UDID tracking

iOS 7 restricts developers’ ability to track users via MAC addresses and UDIDs. This is enforced at the technical level: apps that request a MAC address now receive the same fixed value on every device, rendering it useless as an identifier. While the App Store has not accepted new apps that access UDIDs since May, iOS 7 now returns a Vendor Identifier (an identifier unique to each app vendor on a given device) to existing applications that request the UDID.
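To make the change concrete, here is a minimal sketch of reading the per-vendor identifier that replaces the UDID. It is written in modern Swift for readability (iOS 7-era apps would have used the equivalent Objective-C call); UIDevice and identifierForVendor are the real iOS APIs, while the wrapper function is our own illustration.

```swift
import UIKit

// identifierForVendor is scoped to the app's vendor on this device:
// apps from the same developer see the same UUID, apps from different
// developers see different ones, and the value resets if the user
// removes all of that vendor's apps. Unlike a MAC address or UDID,
// it cannot be used to follow a device across unrelated apps.
func vendorIdentifier() -> String? {
    UIDevice.current.identifierForVendor?.uuidString
}
```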

Safari privacy

In iOS 7, Safari’s “Private” mode can now be enabled from within the browser itself, rather than only through the Settings app. Safari’s Do Not Track setting lives under Settings > Safari, and enabling private browsing also turns on Do Not Track for that session.

Limit Ad Tracking 

When Apple advised developers to stop using device IDs for tracking, it offered an alternative: the Advertising Identifier, which users can reset at any time. In iOS 7, this control is more visible, residing under Settings > Privacy > Advertising rather than in the About section. The same Advertising section also houses the “Limit Ad Tracking” switch.
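As a rough sketch of how an ad SDK is expected to honor that switch, consider the following modern-Swift snippet. ASIdentifierManager and its properties are the real AdSupport framework APIs; the wrapper function is illustrative only.

```swift
import AdSupport

// Returns the user-resettable Advertising Identifier, or nil when the
// user has enabled "Limit Ad Tracking" under Settings > Privacy > Advertising.
func advertisingID() -> String? {
    let manager = ASIdentifierManager.shared()
    guard manager.isAdvertisingTrackingEnabled else {
        // The user has asked not to be tracked for advertising purposes.
        return nil
    }
    return manager.advertisingIdentifier.uuidString
}
```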

Other Privacy Features

Touch ID

The new iPhone allows users to unlock the phone with a biometric fingerprint scanner. The phone does not store an image of the fingerprint; instead, it stores a mathematical representation of it. This information is kept only on the device, within a dedicated security architecture called the Secure Enclave. Fingerprint data is never accessed by iOS or by third-party apps, never stored on Apple servers, and never backed up to iCloud or anywhere else.

Find My iPhone Activation Lock

Find My iPhone now has an ‘Activation Lock,’ requiring a correct Apple ID and password to turn off Find My iPhone or erase the device. Even if the device is erased, reactivating it also requires a correct Apple ID and password.

Frequent Locations

Frequent Locations is a visualization of information that the iPhone collects about the places you’ve been and how often you’ve visited them. This information is kept solely on the device unless a user opts in to improve Apple Maps, in which case some locations may be submitted in an anonymous form.

Security Features

iOS 7 also debuts data isolation for the microphone and motion detector, requiring apps to obtain the user’s permission before accessing them. Additional security features are discussed in greater detail in this article.

We’re excited to see these features implemented and look forward to the launch.

Campaigns, the Internet, and Big Data

In this morning’s Politico, Emily Schultheis explores how German politicians are trying to emulate the Obama campaign’s data and digital operation in advance of the country’s September 22 federal election. “The Internet has come into the center of the campaign,” Robert Heinrich, the Green Party’s campaign manager, is quoted as saying. He cautions, however, that you “will not win a campaign with the Internet, but without it, you will lose.”

This echoes comments made by Rayid Ghani at the FPF and Stanford Center for Internet & Society “Big Data and Privacy” workshop on Tuesday. Ghani, the former Chief Scientist for the Obama 2012 campaign, has since turned his attention to using data for good. Speaking about his time on the campaign, he suggested that the campaign’s data operation may have had nothing to do with the President’s ultimate margin of victory. Instead, he emphasized that good data was about improving the President’s probability of victory.

“People are assuming that you can predict everything about everyone 10 years in advance with big data, but algorithms are not as deterministic as some think. It is more about probability,” Ghani said. In his address, he admitted that he was not nearly as concerned about privacy threats as he was about the public’s lack of education and knowledge of how data works.

Informing people about how data are used, and how inferences are drawn from them, was suggested again and again as one way to address public concerns about Big Data. Perhaps education and digital literacy will be a primary tool for helping privacy concerns and Big Data benefits meet?

Future of Privacy Forum and Stanford’s Center for Internet and Society Team Up to Talk Big Data Risk and Rewards

For Immediate Release

Contact: Melissa Merz, 773.505.6037

[email protected]

Tuesday, September 10, 2013

Future of Privacy Forum and Stanford’s Center for Internet and Society Team Up to Talk Big Data Risk and Rewards


Washington, DC – The Future of Privacy Forum (FPF) and Stanford’s Center for Internet and Society (CIS) today hosted a forum addressing critical issues involved with the collection and use of Big Data.

The day-long forum was sold out and had to be moved twice to accommodate space demands.

On September 3, the Stanford Law Review Online published 11 new papers by privacy leaders in academia, government, and business in a special symposium issue dedicated to the tension between big data innovation and privacy risks. Many of the papers were presented at today’s forum, Big Data and Privacy: Making Ends Meet, hosted by Microsoft in Washington, DC.

The forum is one of the largest gatherings of privacy experts since the recent revelations about the massive scope of NSA data collection and analysis. While Snowden’s leaks focused on data-driven government surveillance, big data tools are also used to spark innovation and growth in the private sector, in areas ranging from public health and scientific research to energy conservation and sustainable development.

The Stanford publication and today’s forum raise issues at the heart of the current privacy discussion, such as how to balance big data’s benefits against its privacy costs. In their presentation, FPF Executive Director Jules Polonetsky and FPF Senior Fellow and CIS Affiliate Scholar Omer Tene proposed a first-of-its-kind framework for assessing big data benefits against attendant privacy costs. The full paper, published last week, can be read here. The calculation takes into account factors such as who benefits from big data, who bears the costs, and the probability of the risks and rewards.

“Finding the right balance between privacy risks and big data rewards is one of the biggest public policy challenges of our time,” Polonetsky and Tene said. “Unfortunately, the discussion lurches from crisis to crisis, focusing on legalistic formalities while the bigger policy choices remain blurred. This is a debate that has to happen now, not later.”

Rayid Ghani, co-founder of Edgeflip, a startup building social media analytics products for non-profits, and former Chief Scientist for the Obama for America 2012 campaign, is scheduled to deliver a keynote address. Jennifer Stoddart, the Privacy Commissioner of Canada, will deliver closing remarks.

Today’s forum is scheduled from 9 a.m. ET to 5 p.m. ET in Washington, DC, at Microsoft’s Innovation and Policy Center, 901 K St., NW.

Media should contact Melissa Merz at 773.505.6037 or at [email protected]. Also, follow the conversation on Twitter at #BigPrivacy.


Full report and all papers available here.

New Survey on App Stores and Account Info Sharing – What This Means for COPPA

FPF is committed to helping the app marketplace comply with the FTC’s revised Children’s Online Privacy Protection Act (COPPA) rule.  As explained in our public comments filed with the FTC, we think that one way to help companies and parents alike is to encourage collaboration.  For example, by leveraging a common platform, “operators” under the rule could provide streamlined notice and obtain verifiable consent from parents using an easy-to-understand and manageable approach.  This way, parents aren’t overwhelmed by countless notices and consent processes, and “operators” under the rule can fulfill their COPPA obligations in a practical manner.

However, in its COPPA FAQs, the FTC said that a parent’s app store account number and password alone aren’t sufficient to establish consent under the rule:  “The mere entry of an app store account number or password, without other indicia of reliability (e.g., knowledge-based authentication questions or verification of government identification), does not provide sufficient assurance that the person entering the account or password information is the parent, and not the child.”  The assumption appears to be that parents share their account names and passwords with their children.

As part of our mission to give policy makers more data to inform their decision making, we recently commissioned Harris Interactive to conduct an online survey of U.S. adults to better understand whether parents share account information with their kids. We identified parents of children ages 3-12 who own smartphones, tablets, and/or e-readers and have an app store account, and asked whether those parents have ever shared their account name and password with their children so that their kids could download free apps or purchase apps. The results show that 72% have never shared this information with their kids. And among the fewer than 100 parents surveyed who have, most require their kids to ask permission before using the account information: only 4% of those parents said their kids did not have to ask permission first before downloading free apps or purchasing apps.

So what does this mean? This is a recent poll, and we certainly plan to study it in greater depth. But it suggests that most parents keep their app store account information private; they don’t just hand it over to their kids. And for those who do share this information, there are rules in place so that kids have to get permission first.

This is encouraging for those of us who want to answer the FTC’s call for better notices and simpler yet effective choices. We want to avoid overly long, overly detailed privacy policies by giving parents a general up-front notice and providing more specific notice at the time most relevant to the parent. We want parents to be able to easily provide verifiable consent in a way that complies with COPPA but doesn’t subject them to duplicative procedures.

As the app market continues to grow, the Commission is right to be concerned about apps targeting children and COPPA compliance.  It just makes sense to leverage existing platforms and relationships to provide parents and operators with smart, innovative, and common sense notice and consent mechanisms.  Hopefully, these findings will encourage policy makers to help do just that.

For complete survey methodology, including weighting variables, please contact [email protected] or call (202) 642-9142.

Is "Compromise" a Dirty Word?

Policy making used to be about consensus and compromise. Once upon a time, if you convened a diverse group of participants, each with different interests and sensitivities, and reached an agreed-upon understanding, you could declare victory. Imagine that the group that came to this agreement consisted of entities ranging from civil liberties groups to industry trade associations. In 2013, that’s practically a unicorn.

And yet, it happened. In 2012, for the first time ever, the U.S. National Telecommunications and Information Administration (NTIA) convened a multi-stakeholder process on mobile privacy. After more than a year of work, privacy groups such as Consumer Action, the World Privacy Forum, FPF, and the ACLU, along with industry players including the App Developers Alliance, Intuit, AT&T, Verizon, CTIA, SIIA, and the Internet Commerce Coalition, came together in July to support a code of conduct for mobile app developer transparency.

Now, the NTIA multi-stakeholder process is taking a lot of criticism in the press.  It didn’t go far enough.  Or it went too far.  It wasn’t focused.  It was messy.

To that, we say, “Of course it was – this is how consensus is built.” The NTIA group may not have fundamentally altered the state of privacy in America. It wasn’t supposed to. Instead, it is one part of a broader strategy of incremental progress toward flexible but enforceable privacy codes of conduct in the U.S. And yes, it was messy. Debate and compromise among stakeholders with divergent viewpoints will always be messy. The NTIA’s success is all the greater when one considers that it reached a modest consensus in the absence of any immediate legislative or technical threat. This stands in stark contrast to the ongoing Do Not Track debate in the W3C. Even in the face of technical browser implementations and proposed legislation, the W3C group has yet to agree on anything, leaving the call for Do Not Track unanswered.

The NTIA multi-stakeholder process should be applauded because it demonstrates that we can compromise – something that is becoming far less common in today’s polarized policy making environment.  It should be supported because instead of a stalemate, we moved forward – standardized mobile privacy notices are now being deployed and tested in the field.  It should be encouraged and yes, improved upon, because although it was not perfect, it was and is a step in the right direction for American consumers.