Beyond IRBs: Designing Ethical Review Processes for Big Data

CALL FOR PAPERS

Beyond IRBs: Designing Ethical Review Processes for Big Data Research

In the age of Big Data, innovative uses of information are continuously emerging in a wide variety of contexts. Increasingly, researchers at companies, not-for-profit organizations, and academic institutions use individuals’ personal data as raw material for analysis and research. For research on data subject to the Common Rule, institutional review boards (IRBs) provide an essential ethical check on experimentation. Still, even academic researchers lack standards around the collection and use of online data sources, and data held by companies or by organizations without federal funding is not subject to such procedures. Research standards for data can vary widely as a result. Companies and non-profits have become subject to public criticism and may elect to keep research results confidential to avoid public scrutiny or potential legal liability.

To prevent unethical data research or experimentation, experts have proposed a range of solutions, including the creation of “consumer subject review boards,”[1] formal privacy review boards,[2] private IRBs,[3] and other ethical processes implemented by individual companies.[4] Organizations and researchers are increasingly encouraged to pursue internal or external review mechanisms to vet, approve and monitor data experimentation and research. However, many questions remain concerning the desirable structure of such review bodies as well as the content of ethical frameworks governing data use. In addition, considerable debate lingers around the proper role of consent in data research and analysis, particularly in an online context; and it is unclear how to apply basic principles of fairness to selective populations that are subject to research.

To address these challenges, the Future of Privacy Forum (FPF) is hosting an academic workshop supported by the National Science Foundation, which will discuss ethical, legal, and technical guidance for organizations conducting research on personal information. Authors are invited to submit papers for presentation at a full-day program to take place on December 10, 2015. Successful submissions may address the following issues:

Papers for presentation will be selected by an academic advisory board and published in the online edition of the Washington and Lee Law Review. Four papers will be selected to serve as “firestarters” for the December workshop; each of these authors will receive a $1,000 stipend.

Submissions must be 2,500 to 3,500 words, with minimal footnotes and in a readable style accessible to a wide audience.

Submissions must be made no later than October 25, 2015, at 11:59 PM ET, to [email protected]. Publication decisions and workshop invitations will be sent in November.


 

[1] Ryan Calo, Consumer Subject Review Boards: A Thought Experiment, 66 Stan. L. Rev. Online 97 (2013).

[2] White House Consumer Privacy Bill of Rights Discussion Draft, Section 103(c) (2015).

[3] Jules Polonetsky, Omer Tene, & Joseph Jerome, Beyond the Common Rule: Ethical Structures for Data Research in Non-Academic Settings, 13 Colo. Tech. L. J. 333 (2015).

[4] Mike Schroepfer, CTO, Research at Facebook (Oct. 2, 2014), http://newsroom.fb.com/news/2014/10/research-at-facebook/.

Beyond the Common Rule

As part of a symposium on corporate consumer research, Jules Polonetsky, Omer Tene, and Joseph Jerome published “Beyond the Common Rule: Ethical Structures for Data Research in Non-Academic Settings” in volume 13, issue 2, of the Colorado Technology Law Journal. At corporations, not-for-profits, and academic institutions, researchers are analyzing data and testing theories that often rely on data about individuals. Many of these new uses of personal information are natural extensions of current practices, well within the expectations of individuals and the boundaries of traditional Fair Information Practice Principles. In other cases, data use may exceed expectations, but organizations can provide individuals with additional notice and choice. However, in some cases enhanced notice and choice is not feasible, despite the considerable benefit to consumers if personal information were to be used in an innovative way. This article addresses the processes required to authorize noncontextual data uses at corporations or not-for-profit organizations in the absence of additional notice and choice. Although many of these challenges are also relevant to academic researchers, their work will often be guided by the oversight of Institutional Review Boards (which are required for many — but not all — new research uses of personal information).

The full article, “Beyond the Common Rule: Ethical Structures for Data Research in Non-Academic Settings,” is available to read here. 

Student Data and De-Identification

Today, FPF has released its newest paper, Student Data and De-Identification: Understanding De-Identification of Education Records and Related Requirements of FERPA.  Prepared in partnership with Reg Leichty of Foresight Law + Policy, this paper provides an overview of the different tools used to de-identify data to various degrees, based on the type of information involved, and the determined risk of unintended disclosure of individual identity. Proper data de-identification requires technical knowledge and expertise as well as knowledge of, and adherence to, industry best practice.
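The techniques surveyed in the paper range from suppression and generalization to pseudonymization. As one minimal illustration (a sketch of ours, not drawn from the paper itself; the field names and secret key are hypothetical), a direct identifier can be replaced with a keyed pseudonym so records remain linkable for research without exposing the raw ID:

```python
# Sketch of pseudonymization, one common de-identification technique.
# Real deployments pair this with key management, access controls, and
# contractual safeguards; the key and field names below are hypothetical.
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-managed-secret"  # hypothetical key

def pseudonymize(student_id: str) -> str:
    """Return a stable, keyed pseudonym for a direct identifier."""
    digest = hmac.new(SECRET_KEY, student_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

record = {"student_id": "S12345", "grade": 11, "test_score": 88}
# Replace the direct identifier; indirect identifiers would still need
# separate treatment (generalization, suppression, etc.).
deidentified = {**record, "student_id": pseudonymize(record["student_id"])}
```

Because the pseudonym is keyed rather than a bare hash, an outsider without the key cannot confirm a guessed ID by re-hashing it, while the data holder can still link records across releases.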

“Data de-identification represents one privacy protection strategy that should be in every student data holder’s playbook. Integrated with other robust privacy and security protections, appropriate de-identification – choosing the best de-identification technique based on a given data disclosure purpose and risk level – provides a pathway for protecting student privacy without compromising data’s value. This paper provides a high-level introduction to (1) education records de-identification techniques and (2) the Family Educational Rights and Privacy Act’s (FERPA) application to de-identified education records. The paper also explores how advances in mathematical and statistical techniques, computational power, and Internet connectivity may be making de-identification of student data more challenging and thus raising potential questions about FERPA’s long-standing permissive structure for sharing non-personally identifiable information.”

 

De-Identification and other issues relating to student privacy will be discussed at our upcoming Student Privacy Symposium on Sept 21.  Learn more.

Student Privacy Pledge – Hits 150!

Ed Tech Student Privacy Pledge reaches 150 signatories

The Future of Privacy Forum (FPF) and the Software & Information Industry Association (SIIA) are pleased to recognize that the Student Privacy Pledge has reached the milestone of 150 education technology company signatories. The Pledge represents industry’s public commitment to the responsible handling of student data and provides accountability for signatory school service providers. The result is a bolstering of the public trust necessary for continued technology access for school operations and student learning – technology that is critical to the nation’s continued educational and economic competitiveness.

The K-12 Student Privacy Pledge was introduced by FPF and SIIA in October 2014 with 14 original signatories and took effect in January 2015 as a legally enforceable agreement for companies that provide services to schools. The twelve specific commitments in the Pledge detail ongoing industry practices that both meet the demands of families and schools and track key federal and state laws. By signing the Pledge, school service providers clearly articulate their adherence to these practices to schools and parents regarding the collection, use, maintenance, and retention of student data.

Signatories of the Student Privacy Pledge promise to:

The Pledge adds to an existing framework of student data protections, which also includes laws, contracts, and company privacy policies. A company’s security and other commitments made under the Student Privacy Pledge are legally enforceable under Section 5 of the Federal Trade Commission Act.

“Although many states have passed student privacy laws, the Pledge creates an instantly enforceable and nationally applicable commitment to student data privacy. The Pledge provides a baseline for communication between companies and end users that builds trust,” said Kobie Pruitt, education policy manager, FPF.

“The Pledge demonstrates the industry’s commitment to these strict legally enforceable best practices for data security and the safeguarding of student privacy,” said Mark Schneiderman, senior director of education policy, SIIA.

FPF and SIIA are proud to facilitate the efforts of education technology companies to take leadership in protecting student privacy by signing the Student Privacy Pledge. We look forward to a continual increase in the number of companies joining this effort and agreeing to be held publicly accountable to the student information safeguards embodied in the Pledge.

NOTE: This blog was cross-posted at both http://fpf.org/category/blog/ and http://blog.siia.net/.

Comments to the NTIA on Drones

On April 20, 2015, the Future of Privacy Forum submitted comments to the NTIA on unmanned aircraft systems or “drones.” FPF’s comments emphasize the need for further accountability and transparency measures in deploying commercial drones.

You can read our full comments here.

Quick Security Tips for Vendors

As part of our ongoing support to vendors, especially start-ups and small business providers in the ed tech market, we have recently published our “Quick Security Tips for Vendors” on FERPA|Sherpa.  This tool, a companion to our “Quick Privacy Tips for Vendors,” is designed to provide a simple baseline of security principles and practices as an ed tech business grows its products and services.  Of course, this list of tips does not constitute a complete security policy, but if followed, it will ensure that vendors have taken the best first steps toward responsible protection of student data, as these tips flag many of the common key concerns. A company that implements student data privacy and security policies and procedures in compliance with these two checklists will have a strong foundation moving forward.

Peter Swire on Encryption and Mandated Access

Senate Committee on the Judiciary

Questions for the Record from Senator Grassley

To: Peter Swire

Huang Professor of Law and Ethics

Scheller College of Business

Georgia Institute of Technology

  1. Global Competitiveness

In my opening statement, one of the concerns I expressed was that, in considering solutions to the “Going Dark” problem, we carefully consider the impact on the global competitiveness of American technology companies. You testified that if U.S. companies were required to give U.S. law enforcement access to encrypted communications and devices, U.S. companies would find themselves at a disadvantage in the global marketplace. Yet it appears that countries like the United Kingdom, France, and China are considering laws that would move in this direction.

  1. Do you agree that these foreign governments may be moving in this direction? If so, how would the global competitiveness of U.S. companies be damaged if foreign governments mandate the same sort of access?

 

Swire: I agree that other countries have been considering laws concerning mandated access. My view is that the position of the United States government is highly relevant to the likelihood of other countries adopting such laws, especially for close allies such as the United Kingdom and France. If the United States were to mandate access legally, which I hope it will not, my view is that the U.S. decision would substantially increase the likelihood of such laws being enacted by our allies. By contrast, if the United States maintains the status quo of no such mandates, then that fact becomes an important and relevant precedent against enactment of such measures by our allies.

I believe the U.S. position would also have a large impact on other countries around the world, especially for authoritarian or dictatorial regimes that would like to use mandated access to clamp down on political dissent, religious activity, and other activities. If the U.S. resists mandates, then U.S. based technology companies have a much greater ability to resist demands for mandated access in such countries. Being able to resist such demands will protect devices and sensitive data of Americans and American businesses in those countries. By contrast, if the U.S. requires access, then it will be much more difficult for U.S. based technology companies to push back against requests from China or other foreign governments.

My initial point, therefore, is that the U.S. actions in this area have a very important impact on whether other countries adopt mandated access. As I stated during the hearing, I also believe that mandates in the U.S. would harm U.S. based technology companies because of the suspicion around the world that their products and services are not secure and information is shared with U.S. government agencies.

In terms of mandates in another country, such as a U.S. ally, there would be multiple effects and the overall outcome depends on the circumstances. For instance, if a small market country mandates access, then that might aid local companies that comply with the local law while U.S. companies may decide not to take the reputational risk of doing business in that jurisdiction. In that event, U.S. companies might lose access to a small market but face less competition in other markets from companies based there. If the country is seen globally as having a weak human rights record, mandated access may push the U.S. companies, consistent with the Global Network Initiative principles, not to continue doing business there, thus losing market access. Such company decisions to eschew a market, however, may send a strong signal globally about the importance of customer security to the U.S. based companies, with offsetting gains in other markets.

In addition, there is a crucial dynamic aspect to such mandates. The small country, or country with weak human rights, might find the consequences negative if they lose access to cutting edge technology from U.S. based companies. They thus might reconsider their decision to mandate access, in order to bring U.S. based companies back into the jurisdiction. In such an event, a clear U.S. policy of not requiring access is crucial – the good long-term outcome of U.S. company participation and no mandates occurs only if the U.S. retains its policy where no mandates are imposed.

Congrats to National Student Clearinghouse!

Exciting news to share from today’s press release from iKeepSafe.  We extend our congratulations to FPF Advisory Board Member National Student Clearinghouse for this recognition.

National Student Clearinghouse’s program: StudentTracker for High Schools

Earns iKeepSafe FERPA Badge

July 22, 2015 – iKeepSafe.org, a leading digital safety and privacy nonprofit, announced today that it has awarded its first privacy protection badge to StudentTrackerSM for High Schools from the National Student Clearinghouse, the largest provider of electronic student record exchanges in the U.S. Its selection as the first recipient of the new badge reflects the ongoing efforts of the Clearinghouse, which performs more than one billion secure electronic student data transactions each year, to protect student data privacy.

A nonprofit organization founded by the higher education community in 1993, the Clearinghouse provides educational reporting, verification, and research services to more than 3,600 colleges and universities and more than 9,000 high schools. Its services are also used by school districts and state education offices nationwide.

Earlier this year, iKeepSafe launched the first independent assessment program for the Family Educational Rights and Privacy Act (FERPA) to help educators and parents identify edtech services and tools that protect student data privacy.

“The National Student Clearinghouse is as committed to K12 learners as we are to those pursuing postsecondary education, and that also means we’re committed to protecting their data and educational records,” said Ricardo Torres, President and CEO of the Clearinghouse. “So many aspects of education are moving into the digital realm, and we’re focused on providing students with the privacy and protection they deserve in a rapidly changing digital environment.”

The Clearinghouse became the first organization to receive the iKeepSafe FERPA badge by completing a rigorous assessment of its StudentTrackerSM for High Schools product, privacy policy and practices. “As the first company to earn the iKeepSafe FERPA badge, the National Student Clearinghouse has demonstrated its dedication to K12 students and their families, and to the privacy and security of their data,” said iKeepSafe CEO Marsali Hancock.

Products participating in the iKeepSafe FERPA assessment must undergo annual re-evaluation to continue displaying the iKeepSafe FERPA badge. For the evaluation, an independent privacy expert reviewed the StudentTrackerSM for High Schools product, its privacy policy and practices, as well as its data security practices.

For more information, please visit http://ikeepsafe.org/educational-issues/clearinghouses/

Tackling Privacy, One Carnegie Mellon Project at a Time

CMU Event

CMU Privacy Researchers Norman Sadeh, Lorrie Cranor, Lujo Bauer, Travis Breaux, and Jason Hong (l-r). Photo by JC Cannon.

Last Thursday, the Future of Privacy Forum hosted a conversation among five of CMU’s leading privacy researchers. While the panelists discussed a number of their leading privacy projects, I wanted to highlight some of the interesting takeaways from the presentation.

Many of the researchers focused on how subtle nudges can be used to change people’s behaviors. While this is frequently done to encourage users to share more data, the CMU researchers expressed an interest in exploring how nudges can be “used for good.” Discussing efforts by hotels to get patrons to reuse wash towels, Jason Hong explained how subtle changes in the wording of reminders — from “please recycle” to “75% of guests in this room” — could have significant impacts on patron recycling behaviors.

Lujo Bauer explained how these sorts of nudges could be applied to password composition meters. Increasingly, online services detail password requirements to users and either show colored bars or outright classify a user’s proposed password as “weak” or “strong.” According to Bauer, people typically do not try very hard to get to the point where a meter tells them the password is excellent, but “they will avoid it if a meter tells them their password sucks.” His takeaway: when it comes to security measures, avoid giving users too much positive feedback.
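To make the idea concrete, a composition meter of the kind described above can be sketched in a few lines. This is a toy illustration of ours, not the CMU researchers’ tool: real meters (such as the zxcvbn estimator) model guessability far more carefully, and the thresholds here are arbitrary.

```python
# Toy password composition meter: scores a password on length and
# character variety, then maps the score to "weak"/"medium"/"strong".
# Thresholds and rules are hypothetical, for illustration only.
import re

def password_strength(password: str) -> str:
    """Classify a proposed password as 'weak', 'medium', or 'strong'."""
    score = 0
    if len(password) >= 8:
        score += 1
    if len(password) >= 12:
        score += 1  # extra credit for longer passwords
    if re.search(r"[a-z]", password) and re.search(r"[A-Z]", password):
        score += 1  # mixed case
    if re.search(r"\d", password):
        score += 1  # contains a digit
    if re.search(r"[^a-zA-Z0-9]", password):
        score += 1  # contains a symbol
    if score <= 2:
        return "weak"
    if score <= 4:
        return "medium"
    return "strong"
```

Bauer’s finding maps directly onto a design choice here: the negative “weak” label does the motivational work, so a meter need not lavish praise at the top of the scale.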

Bauer lamented that the online ecosystem is forcing users to engage in insecure behaviors. Of course, while nudges could be used to reinforce positive behaviors, this raises the question of what counts as “positive” behavior. When it comes to security issues like passwords, promoting better security may be a no-brainer, but things are much less binary when it comes to privacy. Privacy-protective nudges can shade into privacy paternalism, which may be no more ethical than the alternative.

Travis Breaux highlighted the continuing challenge of translating privacy policy into engineering objectives. He noted that many mobile app developers still do not understand the privacy implications of connecting their apps to outside services and social networks, which points to the need to further detail the entire data supply chain. Breaux explored the potential of rich data collection and use descriptions that could be more detailed and useful than generic privacy policies and, describing a case study involving applications on Facebook, explained how these sorts of tools could help developers understand more accurately how they are collecting, using, and repurposing information.

Lorrie Cranor discussed the difficulties of communicating data use in the Internet of Things, whether through visual, auditory, or haptic channels, or by making information “machine readable (if you remember P3P and DNT).” She also highlighted one study that looked at the timing dimension of providing users with notice.  A student developed a simple history quiz app that displayed a privacy notice in different places: (1) in the app store, (2) as soon as the app was opened, (3) in the middle of the history quiz, (4) at the quiz’s end, or (5) never at all. “We invited people to take our quiz, but didn’t tell them it was about privacy,” she explained.

When users were then asked about the contents of that privacy notice, the study found that people who “saw” the policy in the app store could not recall it any better than people who did not see it at all. According to Cranor, at the time users are downloading an app, they are not paying attention to other information in the app store. This “doesn’t suggest you don’t put that info in the app store . . . but suggests that sort of timing may not be sufficient. Also suggests it’s really important to test these things.”

Norman Sadeh further criticized the state of today’s overly complicated privacy policies. “It’s not the case that every single sentence in a privacy policy matters,” he stated, discussing his effort to extract the key points of interest to users from privacy policies.

Last but not least, the group described its Bank Privacy Project. The researchers described how larger banks tend to collect more information and use it for more purposes, while smaller banks do the exact opposite. “If you don’t want your bank sharing,” Cranor explained, “you need to find a bank you’ve never heard of.” Because this is nigh-impossible for an average consumer to do, enter the Bank Privacy Project.

-Joseph Jerome, Policy Counsel

Practical De-Identification Workshop

“Practical De-Identification” was held on July 9, 2015. The event was attended by industry and policy leaders from a range of sectors, who joined in a lively and in-depth discussion about what it means for data to be de-identified. Expert panelists discussed the current regulatory framework, how to understand identifiers and pseudonymous data, the role of controls, and sector-specific applications of de-identification.

You can download the speakers’ presentations by clicking HERE. For more information and a summary of the proceedings, please reach out to [email protected].

Building on the success of this event, EY and FPF will be kicking off an effort to provide guidance on the steps needed to implement effective controls, as well as to support the policy arguments for the relevance of controls to the de-identification process.
