FPF Comments on Minnesota Student Privacy Bill HF 1507
Yesterday, the Future of Privacy Forum submitted written comments to members of the Minnesota House of Representatives in response to the pending student privacy bill, the Student Data Privacy Act (HF 1507). FPF expressed concerns about the proposed language of the bill, which would create conflicting requirements for schools and education technology companies, and likely cause unintended consequences for Minnesota schools and students. The primary concerns were:
Inconsistent definitions in the Student Data Privacy Act versus the federal student privacy law, FERPA;
Potentially increasing the likelihood of data breaches by requiring companies to publicize their security practices and procedures; and
Students likely missing out on important educational programs and services due to opt-out provisions.
FPF Welcomes New Members to the Education Privacy Project
We are thrilled to announce four new members of FPF’s Education Privacy Project. Led by Amelia Vance, Director of Education Privacy, the Project works to equip and connect parents, educators, state and local education agencies, ed tech companies, and other stakeholders with substantive practices, policies, and other solutions to address education privacy challenges.
Our new Senior Fellow, Dr. Monica Bulger, is leading FPF’s initiatives to curate and create state and district-level teaching resources to support those responsible for student data privacy, security, and confidentiality. Our three new staff members – Sara Collins, Tyler Park, and Erika Ross – will help expand FPF’s state and federal legislative tracking, resource creation, and technical assistance.
You can read more about Monica, Sara, Tyler, and Erika below. Please join us in welcoming them to the team!
Monica Bulger
Dr. Monica Bulger joins FPF as a Senior Fellow. Her work will include studying privacy trade-offs regarding student data use; curating and developing student privacy training resources for K-12 schools; weighing in on student privacy news and developments; and surveying the priorities and needs of parents, teachers, FPF Advisory Board members, and other stakeholders. Monica’s recent publications include The Promises, Challenges, and Futures of Media Literacy (2018), The Legacy of inBloom (2017), Where Policy and Practice Collide: Comparing United States, South Africa, and European Union Approaches to Protecting Children Online (2017), and Personalized Learning: The Conversations We’re Not Having (2016). Dr. Bulger recently developed a twelve-part media literacy series aimed at teens for Crash Course on YouTube. She co-authors the Student Data Privacy, Equity and Digital Literacy newsletter with the Youth and Media team at the Berkman Klein Center for Internet & Society at Harvard University. She regularly convenes discussions among policymakers, technologists, researchers, journalists, and educators about issues affecting youth rights, such as privacy, online safety, and media literacy. A 2014-2015 Fellow at the Berkman Klein Center, she is a Research Associate at the Oxford Internet Institute, an Affiliate of the Data & Society Research Institute, and a Fellow of Fundación Ceibal. Dr. Bulger has contributed policy research to UNICEF, the U.S. Department of Justice, and the European Commission. She is a Teaching Fellow of the National Writing Project and holds a PhD in Education from the University of California, Santa Barbara, with emphases in cognitive science and the social dimensions of technology.
Sara Collins
Sara Collins joins FPF as a Policy Counsel for the Education Privacy Project. Sara previously worked as an investigations attorney in the Enforcement Unit at Federal Student Aid. She has also worked as the Director of Legal Services for Veterans Education Success, a non-profit focused on helping veterans who have been harmed by predatory education institutions.
Sara graduated from the Georgetown University Law Center in 2014, where she was the symposium editor of the Journal of Gender and the Law. After graduating law school, she completed a Policy & Law Fellowship at the Amara Legal Center, an organization dedicated to fighting domestic sex trafficking within the DMV area. Originally from Chicago, Sara attended the University of Illinois, where she received a B.A. in both Political Science and English.
Tyler Park
Tyler Park joins FPF as an Education Privacy Fellow. Prior to joining FPF, he served as a Legal Fellow in the Wireless Telecommunications Bureau at the Federal Communications Commission, where he worked on a diverse set of spectrum management and licensing issues, most prominently implementing the 2015 3.5 GHz Order and the ongoing Signal Boosters proceeding. Before joining the FCC, Tyler interned at Hogan Lovells BSTL in Mexico City, where he worked in the firm’s Telecommunications, Media and Technology group. He also worked as a Research Assistant for the University of Colorado Law School. Tyler graduated from the University of Colorado Law School, where he focused on communications law. While in law school, he was a member of the Colorado Technology Law Journal. Tyler earned his Bachelor’s Degree from the University of California, Davis, majoring in Political Science and Spanish.
Erika Ross
Erika joins FPF as a Communications Associate for the Education Privacy Project. She supports communications by maintaining and promoting FERPA|Sherpa, expanding FPF’s social media following regarding education privacy issues, and drafting content for newsletter development. Erika also supports FPF’s education privacy communications plan, manages media relations, and promotes events and publications.
Erika earned her Bachelor’s degree from the University of Michigan, majoring in Political Science. There she worked as a research assistant for the Civil Rights Litigation Clearinghouse and an assistant facilitator of the Youth Arts Alliance Program, working with detained youth. After graduation, Erika became a Teach for America corps member in Charlotte, North Carolina, and earned certification in Middle School and Secondary Social Studies. Before joining FPF, Erika worked as a Policy Advocacy Fellow for the School Justice Project and a Policy and Communications Fellow at the National Council on Teacher Quality. She is currently pursuing her Master of Public Policy degree at the Trachtenberg School at George Washington University, with a focus on Program Evaluation.
Policymakers, regulators, and privacy executives interact with the latest connected tech at FPF’s Third Annual Tech Lab
FPF held the Third Annual Tech Lab Open House Monday, March 26, 2018, at our offices in Washington, D.C. The Tech Lab Open House provided an opportunity for us to host Privacy Commissioners and FPF members who were in town for the International Association of Privacy Professionals’ Global Privacy Summit. The Tech Lab contained several smart toys and smart home gadgets. The event provided a rare occasion for policymakers, regulators, and thought leaders to interact with the latest in privacy-impacting gadgets and new technologies.
Attendees had the opportunity to come face to face with facial recognition, learn how Wi-Fi and proximity sensors can be used to track smartphones in our space, compare ancestry analyses from leading direct-to-consumer genetics services, share fun moments with Snap Spectacles, play with Anki’s Cozmo, the CogniToys Dino, the CHiP robot, the Amazon Echo Show, and much more!
The Tech Lab was well attended by chief privacy officers, regulators, advocates, academics, and other privacy professionals who encounter the policies and regulations governing the type of technology that was on display. We were delighted to be joined by several distinguished guests: Lahoussine Aniss, General Secretary of the Moroccan Data Protection Authority; Alon Bachar, Commissioner, Israel Privacy Protection Authority; Giovanni Buttarelli, European Data Protection Supervisor; Christian D’Cunha, Head of Private Office of the Supervisor, European Data Protection Supervisor; Bruno Gencarelli, Head of Unit – International Data Flows and Protection, European Commission; John O’Dwyer, Deputy Commissioner, Irish Data Protection Commission; Oz Shenhav, Director of Innovation and Policy Development, Israel Privacy Protection Authority; Zee Kin Yeong, Personal Data Protection Commission of Singapore. We were honored to have special remarks by Supervisor Buttarelli, Secretary Aniss, and Commissioner Bachar. You can watch them below.
FPF launched a new fellowship in memory of Elise Berkower. Elise was a senior privacy executive at global measurement and data analytics company Nielsen for nearly a decade and was a valued, longtime member of the FPF Advisory Board. FPF gratefully acknowledges the Berkower Family, Nielsen Foundation, and IAPP as founding sponsors of the Elise Berkower Memorial Fellowship.
Her Career
Elise served as Chief Privacy Counsel at Nielsen, playing a lead role in efforts to ensure the company was best in class in the way it handles consumer data, in addition to playing an active role in working across industry groups to help raise the bar of consumer protection across a range of organizations.
The Nielsen Foundation is a private foundation funded by Nielsen. The Nielsen Foundation seeks to enhance the use of data by the social sector to reduce discrimination, ease global hunger, promote effective education, and build strong leadership.
While at Nielsen, Elise was heavily involved in recruiting, training and mentoring interns, creating a pathway for a generation of students to enter the privacy field. Elise provided nearly 100 students with an opportunity to gain the experience and build the relationships needed to launch their careers in this field. The Elise Berkower Memorial Fellowship is designed for recent law school graduates and will honor Elise’s legacy by identifying and nurturing young lawyers interested in contemporary privacy issues with a focus on consumer protection and business ethics.
The Fellowship
The Elise Berkower Memorial Fellowship is a one-year, public interest position for recent law school graduates committed to the advancement of responsible data practices. Candidates will be selected based on both academic qualifications and a commitment to the personal qualities exemplified by Elise – collaboration with co-workers, peers and the broader privacy community and a commitment to ethical conduct.
The Elise Berkower Memorial Fellow will focus on consumer and commercial privacy issues, from technology-specific areas such as advertising practices, drones, wearables, connected cars, and student privacy, to general data management and privacy issues related to ethics, de-identification, algorithms, and the Internet of Things.
FPF is inviting supporters to help us fully fund the Elise Berkower Memorial Fellowship. Donations of all sizes are welcome. Thank you for your support!
Future of Privacy Forum Launches Fellowship in Memory of Privacy Hero Elise Berkower
FOR IMMEDIATE RELEASE
March 26, 2018
Contact: Melanie Bates, Director of Communications, [email protected]
Future of Privacy Forum Launches Fellowship in Memory of Privacy Hero Elise Berkower
Washington, DC – Today, the Future of Privacy Forum announced the launch of a new fellowship in memory of Elise Berkower. Elise was a senior privacy executive at global measurement and data analytics company Nielsen for nearly a decade and was a valued, longtime member of the FPF Advisory Board. FPF gratefully acknowledges the Berkower Family and the Nielsen Foundation as founding sponsors of the Elise Berkower Memorial Fellowship.
“Elise was passionate about mentoring and teaching recent law school graduates because in addition to spreading her vast knowledge and experience to others, Elise understood that when teaching the teacher also learns and fine tunes their understanding,” said Howard Berkower, Elise’s brother. “Our family can think of no better way to honor Elise’s personal and professional life of generosity, commitment and accomplishment than to establish this Privacy Fellowship.”
Elise served as Chief Privacy Counsel at Nielsen, playing a lead role in efforts to ensure the company was best in class in the way it handles consumer data, in addition to playing an active role in working across industry groups to help raise the bar of consumer protection across a range of organizations.
“Elise’s leadership was critical in laying the foundation for what today is Nielsen’s industry-leading privacy efforts,” said Eric J. Dale, Nielsen’s Chief Legal Officer and Secretary and Director of the Nielsen Foundation. “We are so excited about the FPF fellowship in her honor, which allows Elise’s legacy of excellence and development in privacy to continue for years to come. We are thrilled that the Nielsen Foundation is a founding sponsor of the Elise Berkower Memorial Fellowship.”
The Nielsen Foundation is a private foundation funded by Nielsen. The Nielsen Foundation seeks to enhance the use of data by the social sector to reduce discrimination, ease global hunger, promote effective education, and build strong leadership.
While at Nielsen, Elise was heavily involved in recruiting, training and mentoring interns, creating a pathway for a generation of students to enter the privacy field. Elise provided nearly 100 students with an opportunity to gain the experience and build the relationships needed to launch their careers in this field. The Elise Berkower Memorial Fellowship is designed for recent law school graduates and will honor Elise’s legacy by identifying and nurturing young lawyers interested in contemporary privacy issues with a focus on consumer protection and business ethics.
“Elise had a passion for privacy that was infectious,” said Farah Zaman, Senior Global Data Privacy Counsel, Colgate-Palmolive Company. “She could produce a practical, insightful, and thorough legal and technical solution off the top of her head, and educate interns, junior and senior legal colleagues, and even counsel across the table while doing so. Elise was a remarkable person to work for not only because of her brilliance, but also because of her profound example of how to simultaneously be an effective in-house counsel, privacy advocate, and infinitely kind and thoughtful person. The legal profession and privacy community will be advanced by any attorney or intern who follows her example.”
The fellowship will be a one-year, public interest position for recent law school graduates committed to the advancement of responsible data practices. Candidates will be selected based on both academic qualifications and a commitment to the personal qualities exemplified by Elise – collaboration with co-workers, peers and the broader privacy community and a commitment to ethical conduct.
“She was one of the brightest stars in our privacy community,” said Zoe Strickland, Managing Director, Global Chief Privacy Officer, JPMorgan Chase. “She had such a deep understanding of privacy in that complicated intersection of data use, technology, tracking, and aggregation. And a quick and sharp wit as a bonus. This fellowship is a testament to her skill and legacy, and provides a wonderful avenue for students to follow in her big footsteps.”
“Elise was a quiet, but powerful, force in the privacy community,” said Trevor Hughes, President & CEO of the International Association of Privacy Professionals. “For almost two decades, she worked for better outcomes for all involved. Her influence can be seen in many of the privacy standards that we work with today. She is terribly missed, and fondly remembered.”
“Elise was one of the quiet heroines of the earliest days of privacy compliance,” said Nuala O’Connor, President & CEO of Center for Democracy & Technology. “She was so much more than just a respected and valued colleague; she was a friend and a teacher and a partner to many. She was the best example of how to imbue privacy and data ethics into an organization.”
“Elise was one of the founding Board Members of the NAI, providing groundbreaking and effective leadership in developing standards and best practices for digital technology companies facing emerging privacy and public policy issues raised by complicated business models,” said Leigh Freund, President & CEO of the Network Advertising Initiative (NAI). “Elise’s incredible knack for diplomatic negotiation and collaboration with industry leaders at a critical point in the development of the internet economy resulted in a lasting legacy of effective and responsible self-regulation that we can all be proud of.”
“When I became Chief Privacy Officer at DoubleClick, I needed someone with a deep commitment to consumer protection to work with and educate thousands of companies and clients, teaching many of them for the first time the basics of internet privacy,” said Jules Polonetsky, FPF’s CEO. “Elise joined DoubleClick from her role as Chief Administrative Law Judge at the New York City Department of Consumer Affairs and played a critical role in shaping DoubleClick’s best practices as well as setting a standard for the industry.”
The Elise Berkower Memorial Fellow will focus on consumer and commercial privacy issues, from technology-specific areas such as advertising practices, drones, wearables, connected cars, and student privacy, to general data management and privacy issues related to ethics, de-identification, algorithms, and the Internet of Things.
The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies.
Deciphering “Legitimate Interests”: Report based on more than 40 cases from practice
FPF and Nymity collaborated to compile a Report on actual cases from practice and relevant guidance from the Article 29 Working Party and individual Data Protection Authorities (DPAs) concerning the use of “legitimate interests” as a lawful ground for processing under EU data protection law. Our aim is to help organizations better understand how to use and apply legitimate interests as a lawful basis for processing, while at the same time contributing to enhanced personal data protection for individuals.
We have identified specific cases that have been decided at national level by DPAs and Courts from the European Economic Area (EEA), as well as the most relevant cases where the Court of Justice of the European Union interpreted and applied the “legitimate interests” ground. We looked at cases across industries and we compiled them in two lists: one for uses of this ground that were found lawful and one for uses that were found unlawful.
The Report discusses more than 40 cases, representing a wide variety of data processing activities from over 15 countries, such as:
Using key-logger software for employee monitoring
Using GPS tracking data for private investigations
Disclosing health data for litigation purposes
Disclosing personal data for debt collection purposes
Sending emails without consent for electoral purposes
Publishing the sale price of homes that are no longer on the market
Video surveillance of a swimming pool area
Recording data for historical research purposes
Recording employee misconduct
The case summaries contain useful examples of how the “balancing exercise” is conducted in practice and, in many instances, the safeguards that were needed to tilt the balance and make the processing lawful. Two examples are provided below.
Facebook and Cambridge Analytica: Statement by Jules Polonetsky, FPF CEO
Yesterday, Mark Zuckerberg addressed concerns about misuse of Facebook users’ data by Cambridge Analytica.
I think Mark addressed the key issues well. The biggest change was made back in 2014, when Facebook altered the platform to reduce data access by apps. But did any other apps collect a suspicious amount of data back then? Facebook will conduct a full audit of any app with suspicious activity and ban apps that misused personal information. We hope those will also be reported to relevant authorities when appropriate.
Facebook will also tell people who are affected by apps that have misused their data and will build a way for people to know if their data might have been accessed via the “thisisyourdigitallife” app that provided data to Cambridge Analytica. Moving forward, if Facebook removes an app for misusing data, they will tell everyone who used it.
Facebook will also turn off an app’s access to information if someone hasn’t used the app for three months, which makes sense.
Do you ever use Login with Facebook on other apps or sites? In the next version, apps will only be able to request name, profile photo and email address, unless they get special approval.
I just checked and my Facebook profile is linked to dozens of apps. I turned a number of them off. A few are super useful – when I use TripAdvisor, I love seeing reviews by my FB friends listed first, so I can assess whether to take the review seriously! But as a Facebook power user, even I had to poke around to find the setting that displayed that information. Going forward, Facebook promises to make these choices more prominent and easier to manage.
And finally, Facebook will expand its bug bounty program to reward people who report misuses of data by app developers.
These are clearly all useful steps forward and should help shut the door on shady apps misusing Facebook data.
Thinking more broadly, it seems clear that many of the issues raised by the Cambridge Analytica controversy are not exclusive to a particular platform, data practice, or policy.
Do we need baseline, comprehensive privacy legislation in the US, with common sense data protections for users and greater certainty for companies?
Should the FTC have authority over political organizations and other non-profits to police unfair and deceptive practices? The Commission currently lacks this authority. And Congress is typically reluctant to pass laws that impact campaigns and political parties – the organizations that help members earn re-election.
How can new targeting capabilities – across the online ecosystem – be made more transparent for elections and issue campaigns? Do we need standards for election ads in the 21st century? Can we have data standards that are clear about what behavior is appropriate and what is not when communicating through TV, radio, in print, and online? This is an issue in the US and abroad, as the Irish Data Protection Commissioner (who, a few years ago, handled and helped resolve the issues of Facebook apps grabbing too much data) explains: “the micro-targeting of social media users with political ads remains an ongoing issue today.” Although GDPR does capture the activity of political actors in Europe, guidance in the form of a Code of Conduct for political advertisers would be welcome, according to a new opinion from the European Data Protection Supervisor.
How can we support more legitimate research – for transparency about platforms and their impact, and for broader needs of society and science? Can we have research programs managed by companies that provide more controls, a good vetting process, better informed consent for research, corporate ethics review processes?
Can we have a sophisticated conversation about the risks and mitigation strategies regarding data portability? Worries about Cambridge Analytica’s data exfiltration implicate many of the same issues raised by data portability tools and GDPR Article 20.
We are pleased to see Facebook’s response, but are looking forward to understanding how to best address the broader issues for all stakeholders. These issues are important to discuss – they are not going away.
Understanding Session Replay Scripts – a Guide for Privacy Professionals
Over the last few months, privacy researchers at Princeton University’s Center for Information Technology Policy (CITP) have published the results of ongoing research demonstrating that many website operators are using third-party tools called “session replay scripts” to track visitors’ individual browsing sessions, including their keystrokes and mouse movements. These “session replay scripts,” typically used as analytics tools for publishers to better understand how visitors are navigating their websites, were found on 482 of the 50,000 most trafficked websites, including government (.gov) and educational (.edu) websites, and websites of major retailers.
As the research demonstrated, session replay scripts can raise serious privacy concerns if implemented incorrectly, causing security vulnerabilities and the potential for inadvertent collection of personal data (e.g. credit card numbers, health information, or other sensitive data). Therefore, privacy professionals should be involved in decisions related to whether and how to use these kinds of tools, and should carefully consider their usefulness and potential risks. With the right privacy and security safeguards in place, however, limited implementation of session replay scripts can be part of a range of ordinary, useful third-party web analytics tools.
FPF has developed a three-page guide for privacy professionals, who can in turn assist website marketing and design teams with decisions about whether and how to implement these types of analytics scripts. In this guide, we define and describe the term “session replay scripts,” and provide a checklist of privacy tips to use when deciding how best to implement them. In deciding whether and how to implement third-party scripts, privacy professionals should evaluate script providers’ terms and privacy policies, carefully select which pages within a site may or may not be appropriate for their use, and continue to assess the strength of technical safeguards — such as automated and manual redaction tools — over time.
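To illustrate the kind of automated redaction the guide's checklist refers to, here is a minimal, hypothetical sketch (not FPF's guide or any vendor's actual tool): a post-capture pass over recorded keystroke text that masks card-like digit sequences, using a Luhn checksum to avoid masking ordinary numbers. Real session replay products apply redaction at capture time and cover far more data types; this only shows the basic idea.

```python
import re

# Card-like sequences: 13-16 digits, optionally separated by spaces or dashes,
# starting and ending on a digit so trailing separators are not consumed.
CARD_RE = re.compile(r"\b\d(?:[ -]?\d){12,15}\b")

def luhn_valid(number: str) -> bool:
    """Return True if the digits in `number` pass the Luhn checksum."""
    digits = [int(ch) for ch in number if ch.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def redact(text: str) -> str:
    """Replace Luhn-valid card-like numbers with a fixed placeholder."""
    def _mask(match: re.Match) -> str:
        candidate = match.group()
        return "[REDACTED]" if luhn_valid(candidate) else candidate
    return CARD_RE.sub(_mask, text)
```

For example, `redact("card 4111 1111 1111 1111 exp 12/24")` masks the test card number but leaves the expiry date untouched. Pattern-based redaction like this is inherently best-effort, which is why the guide also stresses page selection and manual review.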
The prospect of digital manipulation on major online platforms reached fever pitch in the last election cycle in the United States. Jonathan Zittrain’s concern about “digital gerrymandering” found resonance in reports, which were resoundingly denied by Facebook, of the company’s alleged editing of content to tone down conservative voices. At the start of the last election cycle, critics blasted Facebook for allegedly injecting editorial bias into an apparently neutral content generator: its “Trending Topics” feature. Immediately after the election, when the extent of dissemination of “fake news” through social media became known, commentators chastised Facebook for not proactively policing user-generated content to block and remove untrustworthy information. Which one is it then? Should Facebook have employed policy-directed technologies or should its content algorithm have remained policy-neutral?
This article examines the potential for bias and discrimination in automated algorithmic decision-making. As a group of commentators recently asserted, “[t]he accountability mechanisms and legal standards that govern such decision processes have not kept pace with technology.” Yet this article rejects an approach that depicts every algorithmic process as a “black box” that is inevitably plagued by bias and potential injustice. While recognizing that algorithms are man-made artifacts, written and edited by humans in order to code decision-making processes, the article argues that a distinction should be drawn between “policy-neutral algorithms,” which lack an active editorial hand, and “policy-directed algorithms,” which are intentionally framed to further a designer’s policy agenda.
Policy-neutral algorithms could, in some cases, reflect existing societal biases and historical inequities. Companies, in turn, can choose to fix their results through active social engineering. For example, after facing controversy in light of an algorithmic determination to not offer same-day delivery in low-income neighborhoods, Amazon nevertheless recently decided to provide those services in order to pursue an agenda of equal opportunity. Recognizing that its decision-making process, which was based on logistical factors and expected demand, had the effect of facilitating prevailing social inequality, Amazon chose to level the playing field.
Policy-directed algorithms are purposely engineered to correct for apparent bias and discrimination or to advance a predefined policy agenda. In this case, it is essential that companies provide transparency about their active pursuits of editorial policies. For example, if a search engine decides to scrub results clean of opposing viewpoints, it should let users know they are seeing a manicured version of the world. If a service optimizes results for financial motives without alerting users, it risks violating FTC standards for disclosure. So too should service providers consider themselves obligated to prominently disclose important criteria that reflect an unexpected policy agenda. The transparency called for is not one based on revealing source code but rather public accountability about the editorial nature of the algorithm.
The article addresses questions surrounding the boundaries of responsibility for algorithmic fairness and analyzes a series of case studies under the proposed framework.
The Top 10 (& Federal Actions): Student Privacy News (November 2017-February 2018)
The Future of Privacy Forum tracks student privacy news very closely, and shares relevant news stories with our newsletter subscribers. Approximately every month, we post “The Top 10,” a blog with our top student privacy stories. This blog is cross-posted at studentprivacycompass.org.
The Top 10: Federal Edition
We had so many major student privacy news stories over the past two months that we decided to split our usual Top 10 into federal and non-federal editions. Scroll down to see the non-federal Top 10 stories.
President Trump’s budget proposes $3 million in funding for USED’s Privacy Technical Assistance Center (PTAC), “which serves as a valuable resource center to State and local educational agencies, the postsecondary community, and other parties engaged in building and using education data systems on issues related to privacy, security, and confidentiality of student records” (page 49). The budget also proposes to eliminate federal funding for Statewide Longitudinal Data Systems and Regional Educational Laboratories (page 51). The USED budget request documentation also notes that “One of the significant challenges that ED will work to address is that the large number of ED applications & IT systems currently externally hosted at contractor managed sites across the country are fully compliant w/ Federal & ED cybersecurity and privacy policy and guidelines” (h/t Doug Levin).
In January, the Department of Education (USED) released the first FERPA complaint findings letter to mention an ed tech company. FPF released a blog post analyzing the decision, and Jim Siegl posted some very interesting thoughts about the decision on his blog. The two key takeaways:
Requiring a parent/eligible student to accept a third party’s Terms of Use when they sign up for a school constitutes a forced waiver of rights under FERPA if those Terms of Use are not compliant with FERPA’s school official requirements.
When a school requires that an ed tech service be used as a condition of enrollment, that service must either comply with FERPA’s school official exception requirements or parents must be given the right to opt out of its use.
In addition to the Agora letter, USED also released a findings letter that clarifies how video recordings of multiple students should be treated under FERPA (see a write-up of the findings letter from Pullman & Comley).
The USED Inspector General is currently reviewing “whether Office of the Chief Privacy Officer effectively oversees and enforces compliance with selected provisions of [FERPA] and [PPRA]” (page 12).
The “Student Right to Know Before You Go Act” was introduced in the Senate. The bill “makes data available to prospective college students about schools’ graduation rates, debt levels, how much graduates can expect to earn and other critical education and workforce-related measures of success,” while protecting privacy by “requiring the use of privacy-enhancing technologies that encrypt and protect the data that are used to produce this consumer information for students and families” – namely secure multi-party computation. The ACLU seems to support the bill.
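Secure multi-party computation, the technique the bill names, lets several parties compute an aggregate (such as median earnings) without any one party seeing another's raw records. As an illustration of the underlying idea only (not the bill's actual mechanism or any specific MPC protocol), here is a minimal additive secret-sharing sketch; the function names and the three-party setup are hypothetical:

```python
import random

PRIME = 2**31 - 1  # field modulus for the arithmetic shares

def share(secret: int, n: int = 3) -> list[int]:
    """Split a value into n random shares that sum to the secret mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recombine shares into the original value."""
    return sum(shares) % PRIME

def sum_of_secrets(share_lists: list[list[int]]) -> int:
    """Each party sums the shares it holds locally; only the combined
    totals are revealed, never any individual secret."""
    n = len(share_lists[0])
    party_totals = [sum(s[i] for s in share_lists) % PRIME for i in range(n)]
    return reconstruct(party_totals)
```

Any single share is a uniformly random number, so no individual party learns a student's value; only the final aggregate is revealed. Production MPC systems add authentication, malicious-security protections, and support for richer statistics, but this is the core privacy property the bill's sponsors invoke.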
The legislation to reauthorize the Higher Education Act, known as the PROSPER Act, has been introduced. It maintains the ban on a federal student unit record, but does allow for a feasibility study to investigate whether to expand the National Student Clearinghouse to set up a third-party data system to analyze student outcomes (via Inside Higher Ed). PoliticoPro reports: “Push for better student data runs into privacy worries in higher education bill.”
Awesome new resource: the Utah State Board of Education has released a video that can be used to train administrators and educators on FERPA exceptions.