Comments to the NTIA on Drones

On April 20, 2015, the Future of Privacy Forum submitted comments to the NTIA on unmanned aircraft systems or “drones.” FPF’s comments emphasize the need for further accountability and transparency measures in deploying commercial drones.

You can read our full comments here.

Quick Security Tips for Vendors

As part of our ongoing support to vendors, especially start-ups and small business providers in the ed tech market, we have recently published our “Quick Security Tips for Vendors” on FERPA|Sherpa.  This tool, a companion to our “Quick Privacy Tips for Vendors,” is designed to provide a simple baseline of security principles and practices as an ed tech business grows its products and services.  Of course, this list of tips does not constitute a complete security policy, but it flags many of the common key concerns, and following it will ensure that vendors have taken sound first steps toward responsible protection of student data. A company that implements student data privacy and security policies and procedures in compliance with these two checklists will have a strong foundation moving forward.

Peter Swire on Encryption and Mandated Access

Senate Committee on the Judiciary

Questions for the Record from Senator Grassley

To: Peter Swire

Huang Professor of Law and Ethics

Scheller College of Business

Georgia Institute of Technology

  1. Global Competitiveness

In my opening statement, one of the concerns I expressed was that, in considering solutions to the “Going Dark” problem, we carefully consider the impact on the global competitiveness of American technology companies. You testified that if U.S. companies were required to give U.S. law enforcement access to encrypted communications and devices, U.S. companies would find themselves at a disadvantage in the global marketplace. Yet it appears that countries like the United Kingdom, France, and China are considering laws that would move in this direction.

  1. Do you agree that these foreign governments may be moving in this direction? If so, how would the global competitiveness of U.S. companies be damaged if foreign governments mandate the same sort of access?


Swire: I agree that other countries have been considering laws concerning mandated access. My view is that the position of the United States government is highly relevant to the likelihood of other countries adopting such laws, especially for close allies such as the United Kingdom and France. If the United States were to mandate access legally, which I hope it will not, my view is that the U.S. decision would substantially increase the likelihood of such laws being enacted by our allies. By contrast, if the United States maintains the status quo of no such mandates, then that fact becomes an important and relevant precedent against enactment of such measures by our allies.

I believe the U.S. position would also have a large impact on other countries around the world, especially for authoritarian or dictatorial regimes that would like to use mandated access to clamp down on political dissent, religious activity, and other activities. If the U.S. resists mandates, then U.S.-based technology companies have a much greater ability to resist demands for mandated access in such countries. Being able to resist such demands will protect devices and sensitive data of Americans and American businesses in those countries. By contrast, if the U.S. requires access, then it will be much more difficult for U.S.-based technology companies to push back against requests from China or other foreign governments.

My initial point, therefore, is that U.S. actions in this area have a very important impact on whether other countries adopt mandated access. As I stated during the hearing, I also believe that mandates in the U.S. would harm U.S.-based technology companies because of the suspicion around the world that their products and services are not secure and that information is shared with U.S. government agencies.

In terms of mandates in another country, such as a U.S. ally, there would be multiple effects, and the overall outcome depends on the circumstances. For instance, if a small-market country mandates access, then that might aid local companies that comply with the local law, while U.S. companies may decide not to take the reputational risk of doing business in that jurisdiction. In that event, U.S. companies might lose access to a small market but face less competition from companies based there in other markets. If the country is seen globally as having a weak human rights record, mandated access may push U.S. companies, consistent with the Global Network Initiative principles, not to continue doing business there, thus losing market access. Such company decisions to eschew a market, however, may send a strong signal globally about the importance of customer security to U.S.-based companies, with offsetting gains in other markets.

In addition, there is a crucial dynamic aspect to such mandates. The small country, or country with weak human rights, might find the consequences negative if they lose access to cutting edge technology from U.S. based companies. They thus might reconsider their decision to mandate access, in order to bring U.S. based companies back into the jurisdiction. In such an event, a clear U.S. policy of not requiring access is crucial – the good long-term outcome of U.S. company participation and no mandates occurs only if the U.S. retains its policy where no mandates are imposed.

Congrats to National Student Clearinghouse!

Exciting news to share from today’s press release from iKeepSafe.  We extend our congratulations to FPF Advisory Board Member National Student Clearinghouse for this recognition.

National Student Clearinghouse’s program: StudentTracker for High Schools

Earns iKeepSafe FERPA Badge

July 22, 2015 – iKeepSafe.org, a leading digital safety and privacy nonprofit, announced today that it has awarded its first privacy protection badge to StudentTracker℠ for High Schools from the National Student Clearinghouse, the largest provider of electronic student record exchanges in the U.S. Its selection as the first recipient of the new badge reflects the ongoing efforts of the Clearinghouse, which performs more than one billion secure electronic student data transactions each year, to protect student data privacy.

A nonprofit organization founded by the higher education community in 1993, the Clearinghouse provides educational reporting, verification, and research services to more than 3,600 colleges and universities and more than 9,000 high schools. Its services are also used by school districts and state education offices nationwide.

Earlier this year, iKeepSafe launched the first independent assessment program for the Family Educational Rights and Privacy Act (FERPA) to help educators and parents identify edtech services and tools that protect student data privacy.

“The National Student Clearinghouse is as committed to K12 learners as we are to those pursuing postsecondary education, and that also means we’re committed to protecting their data and educational records,” said Ricardo Torres, President and CEO of the Clearinghouse. “So many aspects of education are moving into the digital realm, and we’re focused on providing students with the privacy and protection they deserve in a rapidly changing digital environment.”

The Clearinghouse became the first organization to receive the iKeepSafe FERPA badge by completing a rigorous assessment of its StudentTracker℠ for High Schools product, privacy policy and practices. “As the first company to earn the iKeepSafe FERPA badge, the National Student Clearinghouse has demonstrated its dedication to K12 students and their families, and to the privacy and security of their data,” said iKeepSafe CEO Marsali Hancock.

Products participating in the iKeepSafe FERPA assessment must undergo annual re-evaluation to continue displaying the iKeepSafe FERPA badge. For the evaluation, an independent privacy expert reviewed the StudentTracker℠ for High Schools product, its privacy policy and practices, and its data security practices.

For more information, please visit http://ikeepsafe.org/educational-issues/clearinghouses/

Tackling Privacy, One Carnegie Mellon Project at a Time


CMU Privacy Researchers Norman Sadeh, Lorrie Cranor, Lujo Bauer, Travis Breaux, and Jason Hong (l-r). Photo by JC Cannon.

Last Thursday, the Future of Privacy Forum hosted a conversation among five of CMU’s leading privacy researchers. While the panelists discussed a number of their leading privacy projects, I wanted to highlight some of the more interesting takeaways from the presentation.

Many of the researchers focused on how subtle nudges can be used to change people’s behaviors. While nudges are frequently used to encourage users to share more data, the CMU researchers expressed an interest in exploring how they can be “used for good.” Discussing hotels’ efforts to get patrons to reuse towels, Jason Hong explained how a subtle change in the wording of a reminder, from “please recycle” to “75% of guests in this room,” could have a significant impact on patrons’ behavior.

Lujo Bauer explained how these sorts of nudges could be applied to password composition meters. Increasingly, online services spell out password requirements to users and either show colored strength bars or outright classify a user’s proposed password as “weak” or “strong.” According to Bauer, people typically do not try very hard to get to the point where a meter tells them their password is excellent, but “they will avoid it if a meter tells them their password sucks.” His takeaway: when it comes to security measures, avoid giving users too much positive feedback.
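Meters of this kind are, at bottom, simple scoring heuristics. As a rough illustration only, a minimal sketch in Python (a toy heuristic, not the meters evaluated in the CMU research) might look like this:

```python
import re

def password_strength(password: str) -> str:
    """Toy password meter: score simple composition rules,
    then map the score to a coarse label."""
    score = 0
    score += len(password) >= 8                          # minimum length
    score += len(password) >= 12                         # extra length
    score += bool(re.search(r"[a-z]", password))         # lowercase letter
    score += bool(re.search(r"[A-Z]", password))         # uppercase letter
    score += bool(re.search(r"\d", password))            # digit
    score += bool(re.search(r"[^A-Za-z0-9]", password))  # symbol

    if score <= 2:
        return "weak"    # the blunt negative label users work to avoid
    if score <= 4:
        return "fair"
    return "strong"      # positive feedback, given sparingly

if __name__ == "__main__":
    for pw in ["password", "Tr0ub4dor", "correct horse battery staple!"]:
        print(pw, "->", password_strength(pw))
```

Note how even this toy design echoes Bauer’s point: the labels are stingy with praise, so most ordinary passwords land on “weak” or “fair” rather than an encouraging “strong.”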

Bauer lamented that the online ecosystem is forcing users to engage in insecure behaviors. Of course, while nudges could be used to reinforce positive behaviors, that raises the question of what counts as “positive” behavior. When it comes to security issues like passwords, promoting better security may be a no-brainer, but things are much less binary when it comes to privacy. Privacy-protective nudges can push towards privacy paternalism, which may be no more ethical than the alternative.

Travis Breaux highlighted the continuing challenge of translating privacy policy into engineering objectives. He noted that many mobile app developers still do not understand the privacy implications of connecting their apps through outside services and social networks, which underscores the need to detail the entire data supply chain. Breaux explored the potential of rich data collection and use descriptions that could be more detailed and useful than generic privacy policies. Describing a case study involving applications on Facebook, he explained how such tools could help developers understand more accurately how they collect, use, and repurpose information.
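To make the idea concrete, a data collection and use description of this kind could be expressed as structured data that tooling can query. The sketch below is purely illustrative: the schema, field names, and app are invented for this example and are not Breaux’s actual notation.

```python
# Hypothetical machine-readable data collection/use description for a
# mobile app. Every name here is illustrative, not an established standard.
data_use_description = {
    "app": "ExampleQuizApp",
    "collects": [
        {
            "data_type": "location",
            "source": "device GPS",
            "purposes": ["nearby-content recommendations"],
            "shared_with": ["analytics-provider.example"],
            "retention_days": 30,
        },
        {
            "data_type": "social_profile",
            "source": "Facebook Login",
            "purposes": ["account creation"],
            "shared_with": [],
            "retention_days": None,  # kept while the account exists
        },
    ],
}

# With structure in place, supply-chain questions become queries rather
# than guesswork, e.g. "which third parties ever receive collected data?"
third_parties = {
    recipient
    for item in data_use_description["collects"]
    for recipient in item["shared_with"]
}
print(sorted(third_parties))  # ['analytics-provider.example']
```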

Lorrie Cranor discussed the difficulties of communicating data use in the Internet of Things, whether through visual, auditory, or haptic channels, or by making information “machine readable (if you remember P3P and DNT).” She also highlighted one study that looked at the timing dimension of providing users with notice.  A student developed a simple history quiz app that displayed a privacy notice in different places: (1) in the app store, (2) as soon as the app was opened, (3) in the middle of the history quiz, (4) at the quiz’s end, or (5) never at all. “We invited people to take our quiz, but didn’t tell them it was about privacy,” she explained.

When users were then asked about the contents of that privacy notice, the study found that people who “saw” the policy in the app store could not recall it any better than people who did not see it at all. According to Cranor, at the time a user is downloading an app, they are not paying attention to other information in the app store. This “doesn’t suggest you don’t put that info in the app store . . . but suggests that sort of timing may not be sufficient. Also suggests it’s really important to test these things.”

Norman Sadeh further criticized the state of our overly complicated privacy policies. “It’s not the case that every single sentence in a privacy policy matters,” he stated, discussing his effort to extract the key points of interest to users from privacy policies.

Last but not least, the group described its Bank Privacy Project. The researchers described how larger banks tend to collect more information and use it for more purposes, while smaller banks do the exact opposite. “If you don’t want your bank sharing,” Cranor explained, “you need to find a bank you’ve never heard of.” Because this is nigh-impossible for an average consumer to do, enter the Bank Privacy Project.

-Joseph Jerome, Policy Counsel

Practical De-Identification Workshop

“Practical De-Identification” was held on July 9, 2015. The event was attended by industry and policy leaders from a range of sectors, who joined in a lively and in-depth discussion about what it means for data to be de-identified. Expert panelists discussed the current regulatory framework, how to understand identifiers and pseudonymous data, the role of controls, and sector-specific applications of de-identification.

You can download the speakers’ presentations by clicking HERE. For more information and a summary of the proceedings, please reach out to [email protected].

Building on the success of this event, EY and FPF will be kicking off an effort to provide guidance on the steps needed to establish effective controls, as well as to support the policy arguments for the relevance of controls to the de-identification process.


Peter Swire Testifies on Encryption and "Going Dark"

This morning, FPF Senior Fellow Peter Swire presented testimony before the Senate Judiciary Committee on encryption and the balance between public safety and privacy. Swire highlights the concerns raised by a diverse coalition of cybersecurity and privacy experts, tech companies, and human rights activists about law enforcement’s “going dark” argument.

“We can respect the heartfelt concerns of law enforcement officials facing new challenges while respectfully disagreeing with proposed policies,” he concludes. You can read his full testimony here.

Altimeter Offers Up Privacy Lessons for IoT

A new report today from Altimeter explores what brands can learn about consumer privacy perceptions in the booming Internet of Things. The group warns of the “massive gulf between consumer awareness and industry practices when it comes to practice,” and suggests that companies could respond to consumer anxiety by pursuing more trusted customer relationships. At present, Altimeter found that trust in and understanding of new connected technologies trail far behind consumer interest in these products.

Some of the numbers the report cites about consumer understanding are problematic: 40% of consumers still have little understanding of how, when, where, or with whom tracking involving HTTP cookies occurs. With regard to the “Internet of Things,” 87% of surveyed consumers had never even heard of the term. While the IoT is still in its early days, the juxtaposition of these two numbers suggests that exposure to these technologies alone will not resolve the public’s privacy anxieties.

Altimeter found that significant percentages of the public are concerned both about companies’ use of their data and about their data being sold or shared.

Beyond sharing, consumers are also worried about how companies themselves use their information, including where and how long their data is stored and even how personally identifiable it may be. Most important, sensitivity to data use is not exclusive to older populations: even for the survey’s youngest segment, aged 18-24, well over 40% of those surveyed indicated high levels of concern about typical data uses.

The report cautions that industry is facing a trust deficit and has “an existential imperative to foster trust with consumers, for risk of failure, security compromise, customer safety, and ethical responsibility.” It argues that this dynamic calls for a transformation not just in privacy, security, and compliance, but in the design of consumer experiences. In other words, industry needs to provide a clearer value exchange in the Internet of Things:

Whether in the form of money, time, or energy, consumers are most incentivized to share their data by gains in efficiency. Indeed, not all ‘value exchange’ is created equal; this study finds that consumers with higher trust place higher value on information to aid with decision-making, whereas those with lower trust are more compelled to share their data for customer support needs. These particular findings address a deeper question: are coupons really enough?

Coupons may indeed not be enough. For the Internet of Things, successful brands will do better at communication, education, and consumer engagement.

-Joseph Jerome, Policy Counsel