The Elise Berkower Memorial Fellowship

FPF launched a new fellowship in memory of Elise Berkower. Elise was a senior privacy executive at global measurement and data analytics company Nielsen for nearly a decade and was a valued, longtime member of the FPF Advisory Board. FPF gratefully acknowledges the Berkower Family, the Nielsen Foundation, and the IAPP as founding sponsors of the Elise Berkower Memorial Fellowship.

Her Career

Elise served as Chief Privacy Counsel at Nielsen, where she played a lead role in efforts to ensure the company was best in class in how it handled consumer data and worked actively across industry groups to raise the bar of consumer protection at a range of organizations.

The Nielsen Foundation is a private foundation funded by Nielsen. It seeks to enhance the use of data by the social sector to reduce discrimination, ease global hunger, promote effective education, and build strong leadership.

While at Nielsen, Elise was heavily involved in recruiting, training and mentoring interns, creating a pathway for a generation of students to enter the privacy field. Elise provided nearly 100 students with an opportunity to gain the experience and build the relationships needed to launch their careers in this field. The Elise Berkower Memorial Fellowship is designed for recent law school graduates and will honor Elise’s legacy by identifying and nurturing young lawyers interested in contemporary privacy issues with a focus on consumer protection and business ethics.

The Fellowship

The Elise Berkower Memorial Fellowship is a one-year, public interest position for recent law school graduates committed to the advancement of responsible data practices. Candidates will be selected based on both academic qualifications and a commitment to the personal qualities exemplified by Elise: collaboration with co-workers, peers, and the broader privacy community, and a dedication to ethical conduct.

The Elise Berkower Memorial Fellow will focus on consumer and commercial privacy issues, from technology-specific areas such as advertising practices, drones, wearables, connected cars, and student privacy, to general data management and privacy issues related to ethics, de-identification, algorithms, and the Internet of Things.


Apply



Donate


FPF is inviting supporters to help us fully fund the Elise Berkower Memorial Fellowship. Donations of all sizes are welcome. Thank you for your support!



Additional Information


Future of Privacy Forum Launches Fellowship in Memory of Privacy Hero Elise Berkower

FOR IMMEDIATE RELEASE             

March 26, 2018

Contact: Melanie Bates, Director of Communications, [email protected]

Future of Privacy Forum Launches Fellowship in Memory of Privacy Hero Elise Berkower

Washington, DC – Today, the Future of Privacy Forum announced the launch of a new fellowship in memory of Elise Berkower. Elise was a senior privacy executive at global measurement and data analytics company Nielsen for nearly a decade and was a valued, longtime member of the FPF Advisory Board. FPF gratefully acknowledges the Berkower Family and the Nielsen Foundation as founding sponsors of the Elise Berkower Memorial Fellowship.

“Elise was passionate about mentoring and teaching recent law school graduates because in addition to spreading her vast knowledge and experience to others, Elise understood that when teaching the teacher also learns and fine tunes their understanding,” said Howard Berkower, Elise’s brother. “Our family can think of no better way to honor Elise’s personal and professional life of generosity, commitment and accomplishment than to establish this Privacy Fellowship.”

Elise served as Chief Privacy Counsel at Nielsen, where she played a lead role in efforts to ensure the company was best in class in how it handled consumer data and worked actively across industry groups to raise the bar of consumer protection at a range of organizations.

“Elise’s leadership was critical in laying the foundation for what today is Nielsen’s industry-leading privacy efforts,” said Eric J. Dale, Nielsen’s Chief Legal Officer and Secretary and Director of the Nielsen Foundation.  “We are so excited about the FPF fellowship in her honor, which allows Elise’s legacy of excellence and development in privacy to continue for years to come.  We are thrilled that the Nielsen Foundation is a founding sponsor of the Elise Berkower Memorial Fellowship.”

The Nielsen Foundation is a private foundation funded by Nielsen. It seeks to enhance the use of data by the social sector to reduce discrimination, ease global hunger, promote effective education, and build strong leadership.

While at Nielsen, Elise was heavily involved in recruiting, training and mentoring interns, creating a pathway for a generation of students to enter the privacy field. Elise provided nearly 100 students with an opportunity to gain the experience and build the relationships needed to launch their careers in this field. The Elise Berkower Memorial Fellowship is designed for recent law school graduates and will honor Elise’s legacy by identifying and nurturing young lawyers interested in contemporary privacy issues with a focus on consumer protection and business ethics.

“Elise had a passion for privacy that was infectious,” said Farah Zaman, Senior Global Data Privacy Counsel, Colgate-Palmolive Company. “She could produce a practical, insightful, and thorough legal and technical solution off the top of her head, and educate interns, junior and senior legal colleagues, and even counsel across the table while doing so.  Elise was a remarkable person to work for not only because of her brilliance, but also because of her profound example of how to simultaneously be an effective in-house counsel, privacy advocate, and infinitely kind and thoughtful person. The legal profession and privacy community will be advanced by any attorney or intern who follows her example.”

The fellowship will be a one-year, public interest position for recent law school graduates committed to the advancement of responsible data practices. Candidates will be selected based on both academic qualifications and a commitment to the personal qualities exemplified by Elise: collaboration with co-workers, peers, and the broader privacy community, and a dedication to ethical conduct.

“She was one of the brightest stars in our privacy community,” said Zoe Strickland, Managing Director, Global Chief Privacy Officer, JPMorgan Chase. “She had such a deep understanding of privacy in that complicated intersection of data use, technology, tracking, and aggregation. And a quick and sharp wit as a bonus. This fellowship is a testament to her skill and legacy, and provides a wonderful avenue for students to follow in her big footsteps.”

“Elise was a quiet, but powerful, force in the privacy community,” said Trevor Hughes, President & CEO of the International Association of Privacy Professionals.  “For almost two decades, she worked for better outcomes for all involved. Her influence can be seen in many of the privacy standards that we work with today. She is terribly missed, and fondly remembered.”

“Elise was one of the quiet heroines of the earliest days of privacy compliance,” said Nuala O’Connor, President & CEO of Center for Democracy & Technology. “She was so much more than just a respected and valued colleague; she was a friend and a teacher and a partner to many. She was the best example of how to imbue privacy and data ethics into an organization.”

“Elise was one of the founding Board Members of the NAI, providing groundbreaking and effective leadership in developing standards and best practices for digital technology companies facing emerging privacy and public policy issues raised by complicated business models,” said Leigh Freund, President & CEO of the Network Advertising Initiative (NAI). “Elise’s incredible knack for diplomatic negotiation and collaboration with industry leaders at a critical point in the development of the internet economy resulted in a lasting legacy of effective and responsible self-regulation that we can all be proud of.”

“When I became Chief Privacy Officer at DoubleClick, I needed someone with a deep commitment to consumer protection to work with and educate thousands of companies and clients, teaching many of them for the first time the basics of internet privacy,” said Jules Polonetsky, FPF’s CEO. “Elise joined DoubleClick from her role as Chief Administrative Law Judge at the New York City Department of Consumer Affairs and played a critical role in shaping DoubleClick’s best practices as well as setting a standard for the industry.”

The Elise Berkower Memorial Fellow will focus on consumer and commercial privacy issues, from technology-specific areas such as advertising practices, drones, wearables, connected cars, and student privacy, to general data management and privacy issues related to ethics, de-identification, algorithms, and the Internet of Things.

To apply for the inaugural Elise Berkower Memorial Fellowship or to support our efforts, please visit https://fpf.org/2018/03/26/the-elise-berkower-memorial-fellowship.

 

### 

The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. 

Deciphering “Legitimate Interests”: Report based on more than 40 cases from practice

FPF and Nymity collaborated to compile a Report on actual cases from practice and relevant guidance from the Article 29 Working Party and individual Data Protection Authorities (DPAs) concerning the use of “legitimate interests” as a lawful ground for processing under EU data protection law. Our aim is to help organizations better understand how to use and apply legitimate interests as a lawful basis for processing, while at the same time contributing to enhanced personal data protection for individuals.

We have identified specific cases decided at the national level by DPAs and Courts from the European Economic Area (EEA), as well as the most relevant cases in which the Court of Justice of the European Union interpreted and applied the “legitimate interests” ground. We looked at cases across industries and compiled them into two lists: one for uses of this ground that were found lawful and one for uses that were found unlawful.

The Report discusses over 40 cases, representing a wide variety of data processing activities from over 15 countries.

The case summaries contain useful examples of how the “balancing exercise” is conducted in practice and, in many instances, of the safeguards that were needed to tilt the balance and make the processing lawful. Two examples are provided below.

READ REPORT

Facebook and Cambridge Analytica: Statement by Jules Polonetsky, FPF CEO

Yesterday, Mark Zuckerberg addressed concerns about the misuse of Facebook users’ data by Cambridge Analytica.

I think Mark addresses the key issues well. The biggest change was made back in 2014, when Facebook altered the platform to reduce data access by apps. But did any other apps collect a suspicious amount of data back then? Facebook will conduct a full audit of any app with suspicious activity and ban apps that misused personal information. We hope those will also be reported to relevant authorities when appropriate.

Facebook will also tell people who are affected by apps that have misused their data and will build a way for people to know if their data might have been accessed via the “thisisyourdigitallife” app that provided data to Cambridge Analytica. Moving forward, if Facebook removes an app for misusing data, they will tell everyone who used it.

Facebook will also turn off an app’s access to someone’s information if that person hasn’t used the app for three months, which makes sense.

Do you ever use Login with Facebook on other apps or sites? In the next version, apps will only be able to request name, profile photo and email address, unless they get special approval.

I just checked and my Facebook profile is linked to dozens of apps. I turned a number of them off. A few are super useful – when I use TripAdvisor, I love seeing reviews by my FB friends listed first, so I can assess whether to take the review seriously! But as a Facebook power user, even I had to poke around to find the setting that displayed that information. Going forward, Facebook promises to make these choices more prominent and easier to manage.

And finally, Facebook will expand its bug bounty program to reward people who report misuses of data by app developers.

These are clearly all useful steps forward and should help shut the door on shady apps misusing Facebook data.

Thinking more broadly, it seems clear that many of the issues raised by the Cambridge Analytica controversy are not exclusive to a particular platform, data practice, or policy.

Do we need baseline, comprehensive privacy legislation in the US, with common sense data protections for users and greater certainty for companies?

Should the FTC have authority over political orgs and other non-profits to police unfair and deceptive practices? The Commission currently lacks this authority. And Congress is typically reluctant to pass laws that impact campaigns and political parties – the organizations that help members earn re-election.

How can new targeting capabilities – across the online ecosystem – be made more transparent for elections and issue campaigns? Do we need standards for election ads in the 21st century? Can we have data standards that are clear about what behavior is appropriate and what is not when communicating through TV, radio, in print, and online? This is an issue in the US and abroad, as the Irish Data Protection Commissioner (who, a few years ago, handled and helped resolve the issues of Facebook apps grabbing too much data) explains: “the micro-targeting of social media users with political ads remains an ongoing issue today.” Although GDPR does capture the activity of political actors in Europe, guidance in the form of a Code of Conduct for political advertisers would be welcome, according to a new opinion from the European Data Protection Supervisor.

How can we support more legitimate research – for transparency about platforms and their impact, and for the broader needs of society and science? Can we have research programs managed by companies that provide more controls, a good vetting process, better-informed consent for research, and corporate ethics review processes?

Can we have a sophisticated conversation about the risks and mitigation strategies regarding data portability? Worries about Cambridge Analytica’s data exfiltration implicate many of the same issues raised by data portability tools and GDPR Article 20.

We are pleased to see Facebook’s response, but are looking forward to understanding how to best address the broader issues for all stakeholders.  These issues are important to discuss – they are not going away.

Understanding Session Replay Scripts – a Guide for Privacy Professionals

Over the last few months, privacy researchers at Princeton University’s Center for Information Technology Policy (CITP) have published the results of ongoing research demonstrating that many website operators are using third-party tools called “session replay scripts” to track visitors’ individual browsing sessions, including their keystrokes and mouse movements. These “session replay scripts,” typically used as analytics tools for publishers to better understand how visitors are navigating their websites, were found on 482 of the 50,000 most trafficked websites, including government (.gov) and educational (.edu) websites, and websites of major retailers.

As the research demonstrated, session replay scripts can raise serious privacy concerns if implemented incorrectly, causing security vulnerabilities and the potential for inadvertent collection of personal data (e.g. credit card numbers, health information, or other sensitive data). Therefore, privacy professionals should be involved in decisions related to whether and how to use these kinds of tools, and should carefully consider their usefulness and potential risks. With the right privacy and security safeguards in place, however, limited implementation of session replay scripts can be part of a range of ordinary, useful third-party web analytics tools.

FPF has developed a three-page guide for privacy professionals, who can in turn assist website marketing and design teams with decisions about whether and how to implement these types of analytics scripts. In this guide, we define and describe the term “session replay scripts,” and provide a checklist of privacy tips to use when deciding how best to implement them. In deciding whether and how to implement third-party scripts, privacy professionals should evaluate script providers’ terms and privacy policies, carefully select which pages within a site may or may not be appropriate for their use, and continue to assess the strength of technical safeguards — such as automated and manual redaction tools — over time.
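For context, one of the safeguards the guide points to, redacting sensitive fields in the browser before any data reaches a replay provider, can be illustrated with a short sketch. The snippet below is a hypothetical TypeScript illustration, not code from the FPF guide or from any actual session replay product; the event shape, the field-selection rules, and the send callback are assumptions made for illustration only.

// Hypothetical sketch (TypeScript): masking sensitive form values in the browser
// before any session-replay event leaves the page. The ReplayEvent shape, the
// data-no-replay attribute, and send() are illustrative assumptions, not a real replay API.

type ReplayEvent = { type: string; field: string; value: string; timestamp: number };

// Treat password/contact-style inputs and explicitly opted-out fields as sensitive.
function isSensitive(el: HTMLInputElement | HTMLTextAreaElement): boolean {
  if (el instanceof HTMLInputElement && ["password", "email", "tel"].includes(el.type)) {
    return true;
  }
  return el.hasAttribute("data-no-replay"); // hypothetical per-field opt-out attribute
}

// Capture input events for analytics, but mask sensitive values so credit card
// numbers, passwords, and similar data never reach the replay provider.
function captureInput(e: Event, send: (ev: ReplayEvent) => void): void {
  const target = e.target;
  if (!(target instanceof HTMLInputElement) && !(target instanceof HTMLTextAreaElement)) {
    return; // ignore non-form targets in this sketch
  }
  send({
    type: "input",
    field: target.name || target.id || "unnamed-field",
    value: isSensitive(target) ? "***redacted***" : target.value,
    timestamp: Date.now(),
  });
}

// Wire up the listener; in practice send() would forward events to the analytics endpoint.
document.addEventListener("input", (e) =>
  captureInput(e, (ev) => console.debug("replay event", ev))
);

A real deployment would pair client-side masking like this with the guide’s other recommendations, such as limiting which pages load the script in the first place and reviewing the provider’s own redaction defaults over time.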

Download the 3-page Guide here (link to PDF).

More Resources:

Taming The Golem: Challenges of Ethical Algorithmic Decision-Making

Omer Tene and Jules Polonetsky recently published “Taming The Golem: Challenges of Ethical Algorithmic Decision-Making” in Volume 19, Issue 1, of the North Carolina Journal of Law and Technology.

The prospect of digital manipulation on major online platforms reached fever pitch in the last election cycle in the United States. Jonathan Zittrain’s concern about “digital gerrymandering” found resonance in reports, which were resoundingly denied by Facebook, of the company’s alleged editing of content to tone down conservative voices. At the start of the last election cycle, critics blasted Facebook for allegedly injecting editorial bias into an apparently neutral content generator: its “Trending Topics” feature. Immediately after the election, when the extent of dissemination of “fake news” through social media became known, commentators chastised Facebook for not proactively policing user-generated content to block and remove untrustworthy information. Which one is it then? Should Facebook have employed policy-directed technologies, or should its content algorithm have remained policy-neutral?

This article examines the potential for bias and discrimination in automated algorithmic decision-making. As a group of commentators recently asserted, “[t]he accountability mechanisms and legal standards that govern such decision processes have not kept pace with technology.” Yet this article rejects an approach that depicts every algorithmic process as a “black box” that is inevitably plagued by bias and potential injustice. While recognizing that algorithms are man-made artifacts, written and edited by humans in order to code decision-making processes, the article argues that a distinction should be drawn between “policy-neutral algorithms,” which lack an active editorial hand, and “policy-directed algorithms,” which are intentionally framed to further a designer’s policy agenda.

Policy-neutral algorithms could, in some cases, reflect existing societal biases and historical inequities. Companies, in turn, can choose to fix their results through active social engineering. For example, after facing controversy in light of an algorithmic determination to not offer same-day delivery in low-income neighborhoods, Amazon nevertheless recently decided to provide those services in order to pursue an agenda of equal opportunity. Recognizing that its decision-making process, which was based on logistical factors and expected demand, had the effect of facilitating prevailing social inequality, Amazon chose to level the playing field.

Policy-directed algorithms are purposely engineered to correct for apparent bias and discrimination or to advance a predefined policy agenda. In this case, it is essential that companies provide transparency about their active pursuits of editorial policies. For example, if a search engine decides to scrub results clean of opposing viewpoints, it should let users know they are seeing a manicured version of the world. If a service optimizes results for financial motives without alerting users, it risks violating FTC standards for disclosure. So too should service providers consider themselves obligated to prominently disclose important criteria that reflect an unexpected policy agenda. The transparency called for is not one based on revealing source code but rather public accountability about the editorial nature of the algorithm.

The article addresses questions surrounding the boundaries of responsibility for algorithmic fairness and analyzes a series of case studies under the proposed framework.

READ ARTICLE