New Privacy Tech Alliance Promotes Innovative Privacy Technologies

TEL AVIV, ISRAEL – June 25, 2019 – The Future of Privacy Forum and the Israel Tech Policy Institute are launching the Privacy Tech Alliance during CyberWeek 2019, to promote the market for privacy-protective technologies internationally, facilitate the development of new technologies, and maximize value for innovators and investors.

“As the data ecosystem and regulatory requirements grow more complex, companies need technical solutions from innovators in this emerging sector,” said Jules Polonetsky, CEO of the Future of Privacy Forum and a co-founder of the Israel Tech Policy Institute. “Our goal is to encourage the social benefits of technology that allows for data-driven insights while minimizing privacy risks.”

The Privacy Tech Alliance brings together innovative startups and academics in the privacy space with companies and government agencies that need solutions, as well as with investors who see the potential upside. Startups and academic researchers are joining leading Chief Privacy Officers (CPOs) and venture capitalists in the effort.

“The global nature of privacy regulation means there is a growing market for privacy-protecting technologies,” said Limor Shmerling Magazanik, Managing Director of the Israel Tech Policy Institute. “Companies around the world are eager for tech-based solutions to help them comply with the EU’s General Data Protection Regulation, the California Consumer Privacy Act, and state and national rules modeled upon them.”

A recent wave of investments indicates funders see promise in privacy tech. Companies involved with the Privacy Tech Alliance in the U.S., EU, and Israel provide privacy-enhancing technology tools and privacy program management solutions, including de-identification, secure communications, homomorphic encryption, active monitoring, and data mapping and discovery.
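To make one of these categories concrete, the sketch below illustrates the core idea behind pseudonymization-based de-identification: replacing a direct identifier with a keyed hash so records remain linkable within a dataset without exposing the raw value. This is a minimal illustration only, not any Alliance member's implementation; the field names and key handling are hypothetical.

```python
import hashlib
import hmac
import os

# Hypothetical secret key; a production system would store this in a
# key-management service rather than generating it per run.
SECRET_KEY = os.urandom(32)

def pseudonymize(value: str) -> str:
    """Map a direct identifier to a stable, keyed token (HMAC-SHA256).

    The same input always yields the same token, so records stay
    linkable within the dataset while the raw identifier is hidden
    from anyone without the key.
    """
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "zip": "94105", "purchase": 42.50}

# Replace the direct identifier; quasi-identifiers such as ZIP code
# would need further treatment (generalization, suppression, or noise)
# before the record could be considered de-identified.
safe_record = {**record, "email": pseudonymize(record["email"])}
print(safe_record)
```

Commercial de-identification tools go far beyond this sketch, of course; the point is simply that these products operate on data directly, which is why compliance teams increasingly look to engineering solutions alongside policy ones.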

The Privacy Tech Alliance launch will be held at 5:00 Tel Aviv time on June 25 at Camilo – The Green House, George Waze 24, Tel Aviv Jaffa, Israel 6997714.

Companies that have joined the Privacy Tech Alliance Advisory Board include Anonos, BigID, Duality, D-ID, Immuta, Nymity, OneTrust, Privacy Analytics, Truata, TrustArc, and WireWheel.

To learn more about privacy technologies, visit the Resources page.

Media Contact:

Nancy Levesque
Future of Privacy Forum
[email protected]

About the Future of Privacy Forum

Future of Privacy Forum is a global non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.

About the Israel Tech Policy Institute 

Israel Tech Policy Institute is an incubator for tech policy leadership and scholarship, advancing ethical practices in support of emerging technologies. Learn more about ITPI by visiting www.techpolicy.org.il.

FPF Letter to NY State Legislature

On Friday, June 14, FPF submitted a letter to the New York State Assembly and Senate supporting a well-crafted moratorium on facial recognition systems for security uses in public schools. FPF also cautioned against overly broad bans or language that might have unintended consequences on other security programs, including some that may include biometric technology.

The New York State Assembly is currently considering revisions to New York’s state education laws regarding biometric identifying technology in response to the adoption of a facial recognition system by the Lockport school district.

Specifically, FPF recommended:

FPF supports a moratorium to allow time for comprehensive study of the impact of facial recognition systems on school campuses. Our analysis of the risks and benefits of facial recognition systems suggests that an evidence-based review of widespread use of these systems in schools will likely find that the systems do not offer sufficient benefits when used for security purposes at public schools (as FPF Senior Counsel Brenda Leong discusses in this video). Although the desire to provide the highest levels of security and protection for students and school personnel is well-intentioned, it is unclear that facial recognition systems will actually make schools safer. Particularly in light of the costs of purchase, implementation, training, and maintenance, we believe the study is unlikely to find sufficient value or benefit in these systems to justify their risks and privacy impacts.

Schools may also face backlash from parents and staff who don’t want to be involved in such a system. For example, some parents who volunteer at school may wish to opt out of having their biometric information collected and stored. Although privacy best practices would require provision of an alternate method, any barrier to entry may decrease people’s willingness to volunteer or come to the school at all. For similar reasons, employees may also resist. Schools would thus incur additional costs to create alternatives for individuals who do not want to take part in a facial recognition system.

While FPF supports a moratorium on this technology, some provisions of the draft New York law contain broad language that may lead to unintended consequences. Facial recognition systems for campus security have triggered the immediate concerns, and those systems should be the moratorium’s target. Schools may implement facial categorization technologies in other ways that, if banned outright, would prevent or compromise current services to students. For example, schools may currently use biometric software that does not identify individuals but measures facial expressions, voice, or gait in order to help students in special education, occupational therapy, and physical therapy programs. If the ban applies broadly to all biometrics in all cases, it could unintentionally eliminate these services and programs.

Likewise, some school systems in New York have already purchased and implemented biometric systems based on fingerprints and palm prints for lunch-line efficiency, attendance reporting, and other administrative functions. These systems are widespread throughout the country and have not typically presented high-risk factors for student privacy. Allowing these school districts to continue using such systems would avoid the unnecessary costs of reverting to less reliable technology unless or until specific risks are identified. Excessively broad language concerning biometric collection or use might even compromise the current practice of collecting the fingerprints of staff and other employees at public schools in order to run background checks, an outcome that would actually decrease student safety.

Instituting a moratorium on facial recognition technology in schools, while permitting continued operation of other existing biometric programs, would mitigate privacy risks while creating time for the state to review the risks and benefits of biometric programs for students, teachers, parents, and others. The study should, of course, consider all aspects of biometrics use and make appropriate recommendations. By allowing existing programs to continue in the interim, schools could gradually make necessary changes without negatively impacting students or services.

Finally, if the study does find appropriate uses or justifications for facial recognition systems, we recommend that the current requirement to provide appropriate notice to those affected be expanded to require appropriate consent from school employees, students, visitors, and others who might be impacted. Establishing an express consent requirement and/or opt-out options is important for protecting individual privacy.

The full FPF letter to members of the State Senate can be read here, and the letter to members of the State Assembly here.


Ethical and Privacy-Protective Academic Research and Corporate Data

Is edtech helping or hindering student education?  What effect does social media have on elections? What types of user interfaces help users manage privacy settings?  Can the data collected by wearables inform health care?  In almost every area of science, academic researchers are seeking access to personal data held by companies to advance their work.

Data held by companies has the potential to unlock new scientific insights that benefit society and expand human knowledge. When responsibly shared with academic researchers, this data can support progress in medicine and public health, education, social science, and many other fields.

But the data researchers need is often unavailable due to a range of barriers, including the need to protect privacy, address commercial concerns, maintain ethical standards, and comply with legal obligations.

To help companies tackle these challenges, the Future of Privacy Forum has launched the Corporate-Academic Data Stewardship Research Alliance, a peer-to-peer network of private companies that share the goal of facilitating privacy-protective data sharing between businesses and academic researchers.

The work of the Alliance builds upon the 2017 FPF report, Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers. Grants from the Alfred P. Sloan Foundation allowed the FPF Education and Innovation Foundation to undertake both projects.

The Alliance will support data sharing efforts already under way, help address and mitigate challenges that create barriers to sharing, and promote practices that encourage more data sharing between industry and academic researchers. So far, more than 25 prominent companies are participating in the Alliance’s activities.

In its initial work, the Alliance has identified a number of existing barriers to data sharing and has begun to explore potential solutions that address legal, policy, and ethical concerns.

Alliance participants agree on the need for a common understanding of the legal landscape with regard to sharing personal information with researchers. In response, the Alliance is producing an overview of how the use and sharing of personal information for research purposes is treated in key privacy laws, as well as a paper that analyzes the legal landscape and argues that lawmakers should continue to make allowances for scientific research when drafting future privacy laws.

The Alliance has also begun work on establishing a set of best practices for sharing data for research purposes.  Those best practices include data security, de-identification, vendor management, due diligence, training and education, and more. This work will likely result in a guidance document or an industry Code of Conduct.

A major barrier to data sharing identified by the participants is the lack of contractual uniformity. Research institutions, some of which are subject to state procurement rules, may require their own contractual terms. This creates scaling issues, in which the company must negotiate with each institution separately. Additionally, companies want to ensure that the contracts include provisions that address and reduce the risks (privacy, security, etc.) inherent in data sharing. Going forward, the Alliance will gather or develop model contractual terms or template agreements that all parties can agree to, with the goal of easing the negotiation process and ensuring that appropriate protections for all parties, including data subjects, are included in the written agreements.

The lack of access to an Institutional Review Board (IRB) or ethics review board is another roadblock for companies. Some companies have expressed a preference for an independent third party that could review a range of privacy and ethical issues beyond what a traditional IRB might address. The Alliance will support efforts to develop effective options for independent review of data sharing and the related research purposes.

To encourage privacy-protective data sharing for scientific research, the Alliance will create a new Award for Leadership in Data Stewardship and Achievement in Academic Research.

The Alliance welcomes industry participants to join our monthly calls and contribute to our work.  If you are interested in learning more about the Alliance, please contact FPF Senior Fellow Mike Hintze at [email protected].

NAI’s 2020 Code of Conduct Expands Self-Regulation for Ad Tech Providers

By Christy Harris, Stacey Gray, and Meredith Richards

As debates over the shape of federal privacy legislation in the United States continue, online advertising remains a key focus of scrutiny in the US Congress, which recently held a hearing on digital advertising and data privacy. Amidst these debates, the Network Advertising Initiative (NAI), the leading self-regulatory body for online advertising technology (ad tech) providers, announced a major update to its Code of Conduct on May 14, 2019. The revised 2020 Code of Conduct officially expands the scope of the NAI Code to a broader range of products and technologies in the online advertising industry and strengthens existing requirements, a crucial step given the recent attention to privacy in ad tech. The NAI is a non-profit, self-regulatory association responsible for creating and enforcing third-party advertising standards for data collection and use in online and mobile advertising.

According to the NAI, the 2020 Code of Conduct is the largest overhaul of its self-regulatory requirements since the Code was originally released in 2000. The 2020 Code now covers digital advertising practices such as the use of offline data for tailored advertising, and it incorporates sensor technology and real-time uses of location data. The update also aims to “future-proof” Tailored Advertising by covering any use of previously collected user data to target advertising across websites and apps.

Key takeaways from the 2020 Code of Conduct:

Viewed Content Advertising refers to data collected from viewed video content, for example from Smart TVs. This inclusion reflects a broader debate over regulating data privacy issues involving Smart TVs, which increasingly collect data on consumer activity and viewing preferences for digital marketing purposes. For more, see FPF’s 2018 description of Smart TV data collection practices, Seeing the Big Picture on Smart TVs and Smart Home Tech.

The revised NAI Code of Conduct will come into effect in January 2020. With advertising technologies continuing to advance, federal regulators as well as state and local legislators are paying greater attention than ever before to digital advertising practices. In these debates, self-regulatory efforts play a unique role, and organizations such as the NAI have the ability not only to go beyond existing laws in addressing consumer privacy concerns but also to help shape evolving legislative efforts. We are glad to see the NAI taking major steps in the right direction to remain on the front lines of protecting consumer privacy and promoting responsible business practices.

CCPA Amendment Update June 2019 – Twelve Bills Survive Assembly and Move to the Senate

By Michelle Bae and Jeremy Greenberg

Privacy professionals seeking clarity on compliance with the California Consumer Privacy Act (CCPA) are monitoring numerous amendment bills introduced in the California State Assembly and the California State Senate. Twelve bills garnered the votes needed to pass the Assembly and moved to the Senate for further revision and voting. The Assembly’s calendar prohibits consideration of new bills this session, which means that any amendments enacted prior to the CCPA’s 2020 effective date will be based on these bills. California’s complex legislative process makes tracking amendments yeoman’s work.

This post provides clarity to an otherwise murky process by: 1) presenting an overview of the California state legislative process; 2) identifying a CCPA timeline and key deadlines; 3) analyzing the CCPA amendments that recently passed the Assembly along with noteworthy bills that failed in the Senate; and 4) outlining likely next steps for amendment efforts prior to the law’s effective date.

Legislative Process

Four moving parts of the California State Legislature impact the content and status of CCPA amendment bills: the California State Assembly, California State Senate, California Governor, and California Attorney General’s Office.

The Legislature consists of both the California State Assembly (lower house) and the California State Senate (upper house). While bills may be introduced in either house, amendments introduced in the Assembly must first pass the Assembly floor with a majority vote before moving to the Senate for further amendment and votes.

Bills that pass the Senate are sent to the Governor to be either signed into law or vetoed. In addition to any legislative amendments, the California Attorney General’s Office can clarify or modify CCPA requirements by promulgating regulations under its CCPA rulemaking authority. As part of the preliminary rulemaking process, the California AG has held seven public forums and concluded the comment period. As mentioned during the January 14th public forum, the Attorney General is likely to issue draft regulations this fall, followed by a formal notice and comment period.

Timeline and Key Deadlines

The bills that passed the Assembly have moved to the Senate for the committee process, which will continue over the coming months.

CCPA Amendment Bills

Below is a summary of the key provisions of each CCPA amendment bill that passed the Assembly and moved to the Senate for further consideration:

AB 25 (Exemption of “employee” from definition of “consumer”)
– Exempts employees and contractors from the definition of “consumer.”

AB 846 (Loyalty programs)
– Permits loyalty programs with the consumer’s affirmative consent and voluntary participation;
– Prohibits loyalty programs that are unjust, coercive, or unreasonable.

AB 873 (Definition of personal information)
– Revises the definition of de-identification to cover data that does not identify, and is not “reasonably linkable” to, a consumer;
– Revises the definition of personal information to mean information that is “reasonably linkable” to a consumer.

AB 874 (Carve-outs from personal information)
– Redefines “publicly available information” and excludes it from the definition of personal information;
– Clarifies that personal information does not include de-identified or aggregated data.

AB 981 (Insurance transactions)
– Removes the consumer’s right to delete, or to opt out of the sale of, personal information if it is necessary to perform an insurance transaction.

AB 1138 (Social media: Parent’s Accountability and Child Protection Act)
– Requires parental/guardian consent for children under 13 to create an account with a social media website or app.

AB 1146 (Exemption for vehicle information)
– Exempts vehicle information shared between a new auto dealer and a vehicle manufacturer when the information is shared or retained pursuant to, or in anticipation of, a vehicle repair relating to warranty work or a recall.

AB 1202 (Data broker requirements)
– Defines “data broker” and requires data brokers to register with, and provide certain information to, the Attorney General; failure to register may lead to liability (civil penalties, fees, and costs).

AB 1281 (Facial recognition technology: disclosure)
– Requires businesses that use facial recognition technology to disclose that use with a clear and conspicuous physical sign at the entrance of every location where the technology is used.

AB 1355 (Addressing differential treatment and disclosures)
– Under the non-discrimination provision, allows differential treatment of a consumer who has exercised CCPA rights if the differential treatment is reasonably related to the value the consumer’s information provides to the business;
– Requires businesses to disclose in their privacy policies the consumer’s right to request specific pieces of information and the categories of information collected.

AB 1416 (Exceptions for businesses)
– Allows businesses to provide personal information to a government agency solely for the purposes of carrying out a government program;
– Allows businesses to sell the personal information of consumers who have opted out of sale solely for the purpose of detecting security incidents or preventing fraud or illegal activity.

AB 1564 (Consumer requests)
– Requires businesses to make available a toll-free number or a physical address and an email address for submitting requests;
– A business that operates exclusively online is only required to provide an email address for requests.

Below is an analysis of two noteworthy bills that failed in the Senate and that address top-of-mind concerns for privacy professionals. Although these particular bills will not move forward this year, they remain noteworthy because the senators who authored them will now vote on, and potentially revise, the bills passed by the Assembly.

SB 561 (Private right of action)
– Authored by Senator Hannah-Beth Jackson, the bill would have expanded the private right of action to privacy violations, including statutory damage class actions, and eliminated the 30-day right to cure.

SB 753 (Exemption of targeted advertising; definition of sale)
– Authored by Senator Stern, the bill would have carved certain targeted advertisements out of the selling requirements and narrowed the definition of “sale.”

Next Steps and What to Expect

Attention is shifting to the bills that have passed the Assembly and are under consideration by the Senate. While it is impossible to predict with accuracy which amendments will pass the Senate, the Senate has traditionally exhibited a different sensibility than the Assembly, and Senators such as Sen. Hannah-Beth Jackson (author of SB 561) and Sen. Henry Stern (author of SB 753) might attempt to rewrite amendments, or vote against those that do not align with the ideals championed by their failed bills.

Bills approved by the Senate will then go to the Governor’s desk in mid-October, to be either signed into law or vetoed. We expect the California AG’s office to clarify ambiguities via its CCPA rulemaking authority in the fall. With all of these elements in play, it is safe to say that a dynamic amendment process will continue over the next several months.

Privacy professionals would be prudent to expect that the core provisions of the CCPA will remain intact. Although some of the amendments under consideration could have significant ramifications if signed into law, companies should stay focused on the CCPA’s core obligations in order to be ready for the 2020 effective date.