Comments to NTIA on Big Data and Privacy
Today, FPF submitted comments to the NTIA as it begins its exploration of how big data impacts the Consumer Privacy Bill of Rights. While the NTIA sought comment on over a dozen key questions, our filing focuses largely on four issues: (1) the need for additional clarity surrounding the flexible application of the Consumer Privacy Bill of Rights’ privacy principles, (2) challenges to the “notice and choice” model and using context to inform a use-based approach to data, (3) practical de-identification, and (4) what internal review boards might look like and consider in the age of big data.
Much of our filing builds upon FPF’s thinking on how to develop a benefit-risk analysis for data projects, with big data concerns of particular importance. Industry increasingly faces ethical considerations over how to minimize data risks while maximizing benefits to all parties. As the White House’s earlier Big Data Report acknowledged, there is a potential tension between socially beneficial and privacy-invasive uses of information in everything from educational technology to consumer-generated health data. The advent of big data requires active engagement by both internal and external stakeholders to increase transparency, accountability, and trust.
FPF believes that a documented review process could serve as an important tool to infuse ethical considerations into data analysis without requiring radical changes to the business practices of innovators or industry in general. Institutional review boards (IRBs), which remain the chief regulatory response to decades of questionable ethical decisions in the field of human subject testing, provide a useful precedent for focusing on good process controls as a way to address potential privacy concerns. While IRBs have become a rigid compliance device and would be inappropriate for wholesale use in big data decision-making, they could provide a useful template for how projects can be evaluated based on prevailing community standards and subjective determinations of risks and benefits, particularly in cases involving greater privacy risks. Using the IRB model as inspiration, organizations could create new advisory processes to more fully consider the ethical questions posed by big data.
Moving forward, broader big data ethics panels could provide a commonsense response to public concerns about data misuse. While these institutions could further expand the role of privacy professionals within organizations, they might also provide a forum for a diversity of viewpoints both inside and outside of organizations. Ethics reviews could include members with different backgrounds, training, and experience, and could seek input from outside actors, including consumer groups and regulators. While these panels will vary across the public and private sectors, and between businesses and researchers, they could provide an important check against data misuse.
Organizations and privacy professionals have become experienced at evaluating risk, but they should also engage in a rigorous data benefit analysis in conjunction with traditional privacy risk assessments. FPF suggests that organizations could develop procedures to assess the “raw value” of a data project, which would require organizations to identify the nature of the project, its potential beneficiaries, and the degree to which those beneficiaries would benefit from it. Our guidance for this process is included for the first time in our filing.