FPF Comments on Minnesota Student Privacy Bill HF 1507

Yesterday, the Future of Privacy Forum submitted written comments to members of the Minnesota House of Representatives in response to the pending student privacy bill, the Student Data Privacy Act (HF 1507). FPF expressed concerns that the proposed language of the bill would create conflicting requirements for schools and education technology companies and would likely cause unintended consequences for Minnesota schools and students; the letter details FPF’s primary concerns.

Read the full letter here.

FPF Welcomes New Members to the Education Privacy Project

We are thrilled to announce four new members of FPF’s Education Privacy Project. Led by Amelia Vance, Director of Education Privacy, the Project works to equip and connect parents, educators, state and local education agencies, ed tech companies, and other stakeholders with substantive practices, policies, and other solutions to address education privacy challenges.

Our new Senior Fellow, Dr. Monica Bulger, is leading FPF’s initiatives to curate and create state and district-level teaching resources to support those responsible for student data privacy, security, and confidentiality.  Our three new staff members – Sara Collins, Tyler Park, and Erika Ross – will help expand FPF’s state and federal legislative tracking, resource creation, and technical assistance.

You can read more about Monica, Sara, Tyler, and Erika below. Please join us in welcoming them to the team!

Monica Bulger

Dr. Monica Bulger joins FPF as a Senior Fellow. Her work will include studying privacy trade-offs regarding student data use; curating and developing student privacy training resources for K-12 schools; weighing in on student privacy news and developments; and surveying the priorities and needs of parents, teachers, FPF Advisory Board members, and other stakeholders. Monica’s recent publications include The Promises, Challenges, and Futures of Media Literacy (2018), The Legacy of inBloom (2017), Where Policy and Practice Collide: Comparing United States, South Africa, and European Union Approaches to Protecting Children Online (2017), and Personalized Learning: The Conversations We’re Not Having (2016). Dr. Bulger recently developed a twelve-part media literacy series aimed at teens for Crash Course on YouTube. She co-authors the Student Data Privacy, Equity and Digital Literacy newsletter with the Youth and Media team at the Berkman Klein Center for Internet & Society at Harvard University. She regularly convenes discussions among policymakers, technologists, researchers, journalists, and educators about issues affecting youth rights, such as privacy, online safety, and media literacy. A 2014-2015 Fellow at the Berkman Klein Center, she is a Research Associate at the Oxford Internet Institute, an Affiliate of the Data & Society Research Institute, and a Fellow of Fundación Ceibal. Dr. Bulger has contributed policy research to UNICEF, the U.S. Department of Justice, and the European Commission. She is a Teaching Fellow of the National Writing Project and holds a PhD in Education from the University of California, Santa Barbara, with emphases in cognitive science and the social dimensions of technology.


Sara Collins

Sara Collins joins FPF as a Policy Counsel for the Education Privacy Project.   Sara previously worked as an investigations attorney in the Enforcement Unit at Federal Student Aid.  She has also worked as the Director of Legal Services for Veterans Education Success, a non-profit focused on helping veterans who have been harmed by predatory education institutions.

Sara graduated from the Georgetown University Law Center in 2014, where she was the symposium editor of the Journal of Gender and the Law.  After graduating law school, she completed a Policy & Law Fellowship at the Amara Legal Center, an organization dedicated to fighting domestic sex trafficking within the DMV area.  Originally from Chicago, Sara attended the University of Illinois, where she received a B.A. in both Political Science and English.


Tyler Park

Tyler Park joins FPF as an Education Privacy Fellow. Prior to joining FPF, he served as a Legal Fellow in the Wireless Telecommunications Bureau at the Federal Communications Commission, where he worked on a diverse set of spectrum management and licensing issues, most prominently implementing the 2015 3.5 GHz Order and the ongoing Signal Boosters proceeding. Before joining the FCC, Tyler interned at Hogan Lovells BSTL in Mexico City, where he worked in the firm’s Telecommunications, Media and Technology group. He also worked as a Research Assistant for the University of Colorado Law School. Tyler graduated from the University of Colorado Law School, where he focused on communications law. While in law school, he was a member of the Colorado Technology Law Journal. Tyler earned his Bachelor’s Degree from the University of California, Davis, majoring in Political Science and Spanish.


Erika Ross

Erika joins FPF as a Communications Associate for the Education Privacy Project. She supports communications by maintaining and promoting FERPA|Sherpa, expanding FPF’s social media following regarding education privacy issues, and drafting content for newsletter development. Erika also supports FPF’s education privacy communications plan, manages media relations, and promotes events and publications.

Erika earned her Bachelor’s degree from the University of Michigan, majoring in political science. There she worked as a research assistant for the Civil Rights Litigation Clearinghouse and as an assistant facilitator of the Youth Arts Alliance Program, working with detained youth. After graduation, Erika became a Teach for America corps member in Charlotte, North Carolina, and earned certification in Middle School and Secondary Social Studies. Before joining FPF, Erika worked as a Policy Advocacy Fellow for the School Justice Project and as a Policy and Communications Fellow at the National Council on Teacher Quality. She is currently pursuing her Master of Public Policy degree at the Trachtenberg School at George Washington University, with a focus on Program Evaluation.

Policymakers, regulators, and privacy executives interact with the latest connected tech at FPF’s Third Annual Tech Lab

FPF held the Third Annual Tech Lab Open House Monday, March 26, 2018, at our offices in Washington, D.C. The Tech Lab Open House provided an opportunity for us to host Privacy Commissioners and FPF members who were in town for the International Association of Privacy Professionals’ Global Privacy Summit. The Tech Lab contained several smart toys and smart home gadgets. The event provided a rare occasion for policymakers, regulators, and thought leaders to interact with the latest in privacy-impacting gadgets and new technologies.

Attendees had the opportunity to come face to face with facial recognition; learn how Wi-Fi and proximity sensors can be used to track smartphones in a physical space; compare ancestry analyses from leading direct-to-consumer genetics services; share fun moments with Snap Spectacles; and play with Anki’s Cozmo, the CogniToys Dino, the CHiP robot, the Amazon Echo Show, and much more!

The Tech Lab was well attended by chief privacy officers, regulators, advocates, academics, and other privacy professionals who work with the policies and regulations governing the types of technology on display. We were delighted to be joined by several distinguished guests: Lahoussine Aniss, General Secretary of the Moroccan Data Protection Authority; Alon Bachar, Commissioner, Israel Privacy Protection Authority; Giovanni Buttarelli, European Data Protection Supervisor; Christian D’Cunha, Head of Private Office of the Supervisor, European Data Protection Supervisor; Bruno Gencarelli, Head of Unit – International Data Flows and Protection, European Commission; John O’Dwyer, Deputy Commissioner, Irish Data Protection Commission; Oz Shenhav, Director of Innovation and Policy Development, Israel Privacy Protection Authority; and Yeong Zee Kin, Personal Data Protection Commission of Singapore. We were honored to have special remarks by Supervisor Buttarelli, Secretary Aniss, and Commissioner Bachar. You can watch them below.

 

Video


The Elise Berkower Memorial Fellowship

FPF launched a new fellowship in memory of Elise Berkower. Elise was a senior privacy executive at global measurement and data analytics company Nielsen for nearly a decade and was a valued, longtime member of the FPF Advisory Board. FPF gratefully acknowledges the Berkower Family, the Nielsen Foundation, and the IAPP as founding sponsors of the Elise Berkower Memorial Fellowship.

Her Career

Elise served as Chief Privacy Counsel at Nielsen, where she played a lead role in ensuring the company’s handling of consumer data was best in class and worked across industry groups to raise the bar for consumer protection across a range of organizations.

The Nielsen Foundation is a private foundation funded by Nielsen. It seeks to enhance the use of data by the social sector to reduce discrimination, ease global hunger, promote effective education, and build strong leadership.

While at Nielsen, Elise was heavily involved in recruiting, training and mentoring interns, creating a pathway for a generation of students to enter the privacy field. Elise provided nearly 100 students with an opportunity to gain the experience and build the relationships needed to launch their careers in this field. The Elise Berkower Memorial Fellowship is designed for recent law school graduates and will honor Elise’s legacy by identifying and nurturing young lawyers interested in contemporary privacy issues with a focus on consumer protection and business ethics.

The Fellowship

The Elise Berkower Memorial Fellowship is a one-year, public interest position for recent law school graduates committed to the advancement of responsible data practices. Candidates will be selected based on both academic qualifications and a commitment to the personal qualities exemplified by Elise – collaboration with co-workers, peers and the broader privacy community and a commitment to ethical conduct.

The Elise Berkower Memorial Fellow will focus on consumer and commercial privacy issues, from technology-specific areas such as advertising practices, drones, wearables, connected cars, and student privacy, to general data management and privacy issues related to ethics, de-identification, algorithms, and the Internet of Things.


Apply



Donate


FPF is inviting supporters to help us fully fund the Elise Berkower Memorial Fellowship. Donations of all sizes are welcome. Thank you for your support!

To donate, click “Tickets” on this page.


Additional Information


Future of Privacy Forum Launches Fellowship in Memory of Privacy Hero Elise Berkower

FOR IMMEDIATE RELEASE             

March 26, 2018

Contact: Melanie Bates, Director of Communications, [email protected]

Future of Privacy Forum Launches Fellowship in Memory of Privacy Hero Elise Berkower

Washington, DC – Today, the Future of Privacy Forum announced the launch of a new fellowship in memory of Elise Berkower. Elise was a senior privacy executive at global measurement and data analytics company Nielsen for nearly a decade and was a valued, longtime member of the FPF Advisory Board. FPF gratefully acknowledges the Berkower Family and the Nielsen Foundation as founding sponsors of the Elise Berkower Memorial Fellowship.

“Elise was passionate about mentoring and teaching recent law school graduates because in addition to spreading her vast knowledge and experience to others, Elise understood that when teaching the teacher also learns and fine tunes their understanding,” said Howard Berkower, Elise’s brother. “Our family can think of no better way to honor Elise’s personal and professional life of generosity, commitment and accomplishment than to establish this Privacy Fellowship.”

Elise served as Chief Privacy Counsel at Nielsen, where she played a lead role in ensuring the company’s handling of consumer data was best in class and worked across industry groups to raise the bar for consumer protection across a range of organizations.

“Elise’s leadership was critical in laying the foundation for what today is Nielsen’s industry-leading privacy efforts,” said Eric J. Dale, Nielsen’s Chief Legal Officer and Secretary and Director of the Nielsen Foundation.  “We are so excited about the FPF fellowship in her honor, which allows Elise’s legacy of excellence and development in privacy to continue for years to come.  We are thrilled that the Nielsen Foundation is a founding sponsor of the Elise Berkower Memorial Fellowship.”

The Nielsen Foundation is a private foundation funded by Nielsen. It seeks to enhance the use of data by the social sector to reduce discrimination, ease global hunger, promote effective education, and build strong leadership.

While at Nielsen, Elise was heavily involved in recruiting, training and mentoring interns, creating a pathway for a generation of students to enter the privacy field. Elise provided nearly 100 students with an opportunity to gain the experience and build the relationships needed to launch their careers in this field. The Elise Berkower Memorial Fellowship is designed for recent law school graduates and will honor Elise’s legacy by identifying and nurturing young lawyers interested in contemporary privacy issues with a focus on consumer protection and business ethics.

“Elise had a passion for privacy that was infectious,” said Farah Zaman, Senior Global Data Privacy Counsel, Colgate-Palmolive Company. “She could produce a practical, insightful, and thorough legal and technical solution off the top of her head, and educate interns, junior and senior legal colleagues, and even counsel across the table while doing so.  Elise was a remarkable person to work for not only because of her brilliance, but also because of her profound example of how to simultaneously be an effective in-house counsel, privacy advocate, and infinitely kind and thoughtful person. The legal profession and privacy community will be advanced by any attorney or intern who follows her example.”

The fellowship will be a one-year, public interest position for recent law school graduates committed to the advancement of responsible data practices. Candidates will be selected based on both academic qualifications and a commitment to the personal qualities exemplified by Elise – collaboration with co-workers, peers and the broader privacy community and a commitment to ethical conduct.

“She was one of the brightest stars in our privacy community,” said Zoe Strickland, Managing Director, Global Chief Privacy Officer, JPMorgan Chase. “She had such a deep understanding of privacy in that complicated intersection of data use, technology, tracking, and aggregation. And a quick and sharp wit as a bonus. This fellowship is a testament to her skill and legacy, and provides a wonderful avenue for students to follow in her big footsteps.”

“Elise was a quiet, but powerful, force in the privacy community,” said Trevor Hughes, President & CEO of the International Association of Privacy Professionals.  “For almost two decades, she worked for better outcomes for all involved. Her influence can be seen in many of the privacy standards that we work with today. She is terribly missed, and fondly remembered.”

“Elise was one of the quiet heroines of the earliest days of privacy compliance,” said Nuala O’Connor, President & CEO of Center for Democracy & Technology. “She was so much more than just a respected and valued colleague; she was a friend and a teacher and a partner to many. She was the best example of how to imbue privacy and data ethics into an organization.”

“Elise was one of the founding Board Members of the NAI, providing groundbreaking and effective leadership in developing standards and best practices for digital technology companies facing emerging privacy and public policy issues raised by complicated business models,” said Leigh Freund, President & CEO of the Network Advertising Initiative (NAI). “Elise’s incredible knack for diplomatic negotiation and collaboration with industry leaders at a critical point in the development of the internet economy resulted in a lasting legacy of effective and responsible self-regulation that we can all be proud of.”

“When I became Chief Privacy Officer at DoubleClick, I needed someone with a deep commitment to consumer protection to work with and educate thousands of companies and clients, teaching many of them for the first time the basics of internet privacy,” said Jules Polonetsky, FPF’s CEO. “Elise joined DoubleClick from her role as Chief Administrative Law Judge at the New York City Department of Consumer Affairs and played a critical role in shaping DoubleClick’s best practices as well as setting a standard for the industry.”

The Elise Berkower Memorial Fellow will focus on consumer and commercial privacy issues, from technology-specific areas such as advertising practices, drones, wearables, connected cars, and student privacy, to general data management and privacy issues related to ethics, de-identification, algorithms, and the Internet of Things.

To apply for the inaugural Elise Berkower Memorial Fellowship or to support our efforts, please visit https://fpf.org/2018/03/26/the-elise-berkower-memorial-fellowship.

 

### 

The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. 

Deciphering “Legitimate Interests”: Report based on more than 40 cases from practice

FPF and Nymity collaborated to compile a Report on actual cases from practice and relevant guidance from the Article 29 Working Party and individual Data Protection Authorities (DPAs) concerning the use of “legitimate interests” as a lawful ground for processing under EU data protection law. Our aim is to help organizations better understand how to use and apply legitimate interests as a lawful basis for processing, while at the same time contributing to enhanced personal data protection for individuals.

We have identified specific cases decided at the national level by DPAs and courts from the European Economic Area (EEA), as well as the most relevant cases in which the Court of Justice of the European Union interpreted and applied the “legitimate interests” ground. We looked at cases across industries and compiled them in two lists: one for uses of this ground that were found lawful and one for uses that were found unlawful.

The report discusses more than 40 cases from over 15 countries, representing a wide variety of data processing activities.

The case summaries contain useful examples of how the “balancing exercise” is conducted in practice and, in many instances, of the safeguards that were needed to tilt the balance and make the processing lawful. Two illustrative examples are included in the report.

READ REPORT

Facebook and Cambridge Analytica: Statement by Jules Polonetsky, FPF CEO

Yesterday, Mark Zuckerberg addressed concerns about the misuse of Facebook users’ data by Cambridge Analytica.

I think Mark addresses the key issues well. The biggest change was made back in 2014, when Facebook altered the platform to reduce data access by apps. But did any other apps collect a suspicious amount of data back then? Facebook will conduct a full audit of any app with suspicious activity and ban apps that misused personal information. We hope those will also be reported to relevant authorities when appropriate.

Facebook will also tell people who are affected by apps that have misused their data and will build a way for people to know if their data might have been accessed via the “thisisyourdigitallife” app that provided data to Cambridge Analytica. Moving forward, if Facebook removes an app for misusing data, they will tell everyone who used it.

Facebook will also turn off an app’s access to someone’s information if the person hasn’t used the app in three months, which makes sense.

Do you ever use Login with Facebook on other apps or sites? In the next version, apps will only be able to request your name, profile photo, and email address, unless they get special approval.

I just checked and my Facebook profile is linked to dozens of apps. I turned a number of them off. A few are super useful – when I use TripAdvisor, I love seeing reviews by my FB friends listed first, so I can assess whether to take the review seriously! But as a Facebook power user, even I had to poke around to find the setting that displayed that information. Going forward, Facebook promises to make these choices more prominent and easier to manage.

And finally, Facebook will expand its bug bounty program to reward people who report misuses of data by app developers.

These are clearly all useful steps forward and should help shut the door on shady apps misusing Facebook data.

Thinking more broadly, it seems clear that many of the issues raised by the Cambridge Analytica controversy are not exclusive to a particular platform, data practice, or policy.

Do we need baseline, comprehensive privacy legislation in the US, with common sense data protections for users and greater certainty for companies?

Should the FTC have authority over political organizations and other non-profits to police unfair and deceptive practices? The Commission currently lacks this authority. And Congress is typically reluctant to pass laws that affect campaigns and political parties – the organizations that help members earn re-election.

How can new targeting capabilities – across the online ecosystem – be made more transparent for elections and issue campaigns? Do we need standards for election ads in the 21st century? Can we have data standards that are clear about what behavior is appropriate and what is not when communicating through TV, radio, in print, and online? This is an issue in the US and abroad, as the Irish Data Protection Commissioner (who handled and helped resolve a few years ago the issues of Facebook apps grabbing too much data) explains: “the micro-targeting of social media users with political ads remains an ongoing issue today.”  Although GDPR does capture the activity of political actors in Europe, guidance in the form of a Code of Conduct for political advertisers would be welcome, according to a new opinion from the European Data Protection Supervisor.

How can we support more legitimate research – for transparency about platforms and their impact, and for broader needs of society and science?  Can we have research programs managed by companies that provide more controls, a good vetting process, better informed consent for research, corporate ethics review processes?

Can we have a sophisticated conversation about the risks and mitigation strategies regarding data portability? Worries about Cambridge Analytica’s data exfiltration implicate many of the same issues raised by data portability tools and GDPR Article 20.

We are pleased to see Facebook’s response, but are looking forward to understanding how to best address the broader issues for all stakeholders.  These issues are important to discuss – they are not going away.

Understanding Session Replay Scripts – a Guide for Privacy Professionals

Over the last few months, privacy researchers at Princeton University’s Center for Information Technology Policy (CITP) have published the results of ongoing research demonstrating that many website operators are using third-party tools called “session replay scripts” to track visitors’ individual browsing sessions, including their keystrokes and mouse movements. These “session replay scripts,” typically used as analytics tools for publishers to better understand how visitors are navigating their websites, were found on 482 of the 50,000 most trafficked websites, including government (.gov) and educational (.edu) websites, and websites of major retailers.

As the research demonstrated, session replay scripts can raise serious privacy concerns if implemented incorrectly, causing security vulnerabilities and the potential for inadvertent collection of personal data (e.g. credit card numbers, health information, or other sensitive data). Therefore, privacy professionals should be involved in decisions related to whether and how to use these kinds of tools, and should carefully consider their usefulness and potential risks. With the right privacy and security safeguards in place, however, limited implementation of session replay scripts can be part of a range of ordinary, useful third-party web analytics tools.

FPF has developed a three-page guide for privacy professionals, who can in turn assist website marketing and design teams with decisions about whether and how to implement these types of analytics scripts. In this guide, we define and describe the term “session replay scripts,” and provide a checklist of privacy tips to use when deciding how best to implement them. In deciding whether and how to implement third-party scripts, privacy professionals should evaluate script providers’ terms and privacy policies, carefully select which pages within a site may or may not be appropriate for their use, and continue to assess the strength of technical safeguards — such as automated and manual redaction tools — over time.
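To make the idea of automated redaction concrete, the sketch below shows one way a site could scrub sensitive values from captured events before they reach a replay recorder. This is a simplified, hypothetical filter for illustration only; the function names and patterns are our own, not the API of any particular session replay vendor, and the guide's checklist should drive any real implementation.

```javascript
// Hypothetical client-side redaction filter for session-replay events.
// Field-name patterns that commonly indicate sensitive data.
const SENSITIVE_FIELDS = /(card|cvv|ssn|password|health|dob)/i;
// Value patterns worth masking even when the field name looks benign,
// e.g. a 13-16 digit run that resembles a payment card number.
const CARD_LIKE = /\b\d{13,16}\b/;

function redactEvent(event) {
  // Work on a copy so the original event object is untouched.
  const safe = { ...event };
  if (SENSITIVE_FIELDS.test(safe.field || "") || CARD_LIKE.test(safe.value || "")) {
    safe.value = "[REDACTED]";
  }
  return safe;
}

// Example: only the sensitive entries are masked before recording.
const events = [
  { field: "search", value: "privacy guide" },
  { field: "card_number", value: "4111111111111111" },
  { field: "comment", value: "my number is 4242424242424242" },
];
const recorded = events.map(redactEvent);
```

In practice, pattern-based filters like this miss edge cases, which is why the guide also recommends manual page-by-page review and ongoing reassessment of the safeguards.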

Download the 3-page Guide here (link to PDF).

More Resources:

Taming The Golem: Challenges of Ethical Algorithmic Decision-Making

Omer Tene and Jules Polonetsky recently published, “Taming The Golem: Challenges of Ethical Algorithmic Decision-Making,” in Volume 19, Issue 1, of the North Carolina Journal of Law and Technology.

The prospect of digital manipulation on major online platforms reached fever pitch in the last election cycle in the United States. Jonathan Zittrain’s concern about “digital gerrymandering” found resonance in reports, which were resoundingly denied by Facebook, of the company’s alleged editing of content to tone down conservative voices. At the start of the last election cycle, critics blasted Facebook for allegedly injecting editorial bias into an apparently neutral content generator: its “Trending Topics” feature. Immediately after the election, when the extent of dissemination of “fake news” through social media became known, commentators chastised Facebook for not proactively policing user-generated content to block and remove untrustworthy information. Which one is it, then? Should Facebook have employed policy-directed technologies, or should its content algorithm have remained policy-neutral?

This article examines the potential for bias and discrimination in automated algorithmic decision-making. As a group of commentators recently asserted, “[t]he accountability mechanisms and legal standards that govern such decision processes have not kept pace with technology.” Yet this article rejects an approach that depicts every algorithmic process as a “black box” that is inevitably plagued by bias and potential injustice. While recognizing that algorithms are man-made artifacts, written and edited by humans in order to code decision-making processes, the article argues that a distinction should be drawn between “policy-neutral algorithms,” which lack an active editorial hand, and “policy-directed algorithms,” which are intentionally framed to further a designer’s policy agenda.

Policy-neutral algorithms could, in some cases, reflect existing societal biases and historical inequities. Companies, in turn, can choose to fix their results through active social engineering. For example, after facing controversy in light of an algorithmic determination to not offer same-day delivery in low-income neighborhoods, Amazon nevertheless recently decided to provide those services in order to pursue an agenda of equal opportunity. Recognizing that its decision-making process, which was based on logistical factors and expected demand, had the effect of facilitating prevailing social inequality, Amazon chose to level the playing field.

Policy-directed algorithms are purposely engineered to correct for apparent bias and discrimination or to advance a predefined policy agenda. In this case, it is essential that companies provide transparency about their active pursuits of editorial policies. For example, if a search engine decides to scrub results clean of opposing viewpoints, it should let users know they are seeing a manicured version of the world. If a service optimizes results for financial motives without alerting users, it risks violating FTC standards for disclosure. So too should service providers consider themselves obligated to prominently disclose important criteria that reflect an unexpected policy agenda. The transparency called for is not one based on revealing source code but rather public accountability about the editorial nature of the algorithm.
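The distinction between the two kinds of algorithms can be made concrete with a toy sketch modeled loosely on the same-day-delivery example above. All names and thresholds below are invented for illustration; they do not describe any real company's logic:

```javascript
// Toy illustration of the article's distinction (all values hypothetical).
// A "policy-neutral" decision: serve an area only if expected demand
// clears a cost threshold. There is no editorial agenda, but the rule
// can reproduce historical inequities baked into the demand data.
function neutralDecision(area) {
  return area.expectedDemand >= 100;
}

// A "policy-directed" decision: deliberately overrides the neutral
// result to pursue an equal-opportunity agenda. Per the article, such
// editorial choices should be publicly disclosed.
function directedDecision(area) {
  if (area.underserved) return true; // explicit policy override
  return neutralDecision(area);
}

const lowIncomeArea = { expectedDemand: 60, underserved: true };
```

Under the neutral rule the low-demand area is excluded; the directed rule serves it anyway. That override is precisely the kind of intentional editorial choice the article argues demands transparency, not at the level of source code, but as public accountability about the algorithm's editorial nature.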

The article addresses questions surrounding the boundaries of responsibility for algorithmic fairness and analyzes a series of case studies under the proposed framework.

READ ARTICLE

The Top 10 (& Federal Actions): Student Privacy News (November 2017-February 2018)

The Future of Privacy Forum tracks student privacy news very closely and shares relevant news stories with our newsletter subscribers. Approximately every month, we post “The Top 10,” a blog with our top student privacy stories. This blog is cross-posted at studentprivacycompass.org.

The Top 10: Federal Edition

We had so many major student privacy news stories over the past two months that we decided to split our usual Top 10 into federal and non-federal editions. Scroll down to see the non-federal Top 10 stories.

  1. President Trump’s budget proposes $3 million in funding for USED’s Privacy Technical Assistance Center (PTAC), “which serves as a valuable resource center to State and local educational agencies, the postsecondary community, and other parties engaged in building and using education data systems on issues related to privacy, security, and confidentiality of student records” (page 49). The budget also proposes to eliminate federal funding for Statewide Longitudinal Data Systems and Regional Educational Laboratories (page 51). The USED budget request documentation also notes that “One of the significant challenges that ED will work to address is that the large number of ED applications & IT systems currently externally hosted at contractor managed sites across the country are fully compliant w/ Federal & ED cybersecurity and privacy policy and guidelines” (h/t Doug Levin).
  2. The Department of Education (USED) released the first FERPA complaint findings letter to mention an ed tech company in January. FPF released a blog analyzing the decision, and Jim Siegl posted some very interesting thoughts about the decision at his blog. The two key takeaways:
    • Requiring a parent/eligible student to accept a third party’s Terms of Use when they sign up for a school constitutes a forced waiver of rights under FERPA if those Terms of Use are not compliant with FERPA’s school official requirements.
    • When a school requires that an ed tech service be used as a condition of enrollment, that service must either comply with FERPA’s school official exception requirements or parents must be given the right to opt out of its use.
  3. In addition to the Agora letter, USED also released a findings letter that clarifies how video recordings of multiple students should be treated under FERPA (see a write-up of the findings letter from Pullman & Comley).
  4. The FTC and USED held a “Student Privacy and Ed Tech” workshop on December 1st that examined how the FTC’s “Rule implementing the Children’s Online Privacy Protection Act (COPPA) applies to schools and intersects with the Family Educational Rights and Privacy Act (FERPA),” administered by USED. You can watch a recording of the workshop panels (at the bottom under “Video”), read write-ups via EdWeek and THE Journal, or read the comments filed before the workshop.
  5. On January 30th, the House Education and the Workforce Committee held a hearing, “Protecting Privacy, Promoting Policy: Evidence-Based Policymaking and the Future of Education” (watch the recording or read the EdWeek summary). The hearing was very similar to previous hearings in both 2016 and 2017 about protecting student privacy while allowing for ed research. It “remains up for debate whether increased protections for student data should come from updating the law, bolstering parental consent, improving technology or shaking up education policy research,” via Politico. Meanwhile, EdWeek reports that “The Tricky Dance of Researchers and Educators Gets Even More Complex,” in part due to privacy concerns.
  6. After the cyber threats made to schools last fall, the “FBI is asking school districts to make cyber security a priority” and the FBI and USED issued a “warning about cyber-extortion schemes focused on public schools.” This notice shared that the DarkOverlord hackers were responsible for “‘at least 69 intrusions into schools and other businesses, the attempted sale of over 100 million records containing personally identifiable information (PII), and the release of over 200,000 records including the PII of over 7,000 students due to nonpayment of ransoms.’” Meanwhile, “Cyberattacks Increasingly Target Student Data” (GovTech) and EdWeek reports that “Schools Struggle to Keep Pace With Hackings, Other Cyber Threats.” IT leaders are “likely underestimating the dangers they face;” only 15% of school technology leaders say “they have implemented a cybersecurity plan in their own district” according to a recent CoSN survey. FPF published an op-ed in The Hill with Bill Fitzgerald of Common Sense Media noting that “Securing student data is a challenge that requires sufficient funding,” not more legislation.
  7. The USED Inspector General is currently reviewing “whether [the] Office of the Chief Privacy Officer effectively oversees and enforces compliance with selected provisions of [FERPA] and [PPRA]” (page 12).
  8. The “Student Right to Know Before You Go Act” was introduced in the Senate. The bill “makes data available to prospective college students about schools’ graduation rates, debt levels, how much graduates can expect to earn and other critical education and workforce-related measures of success,” while protecting privacy by “requiring the use of privacy-enhancing technologies that encrypt and protect the data that are used to produce this consumer information for students and families” – namely secure multi-party computation. The ACLU seems to support the bill.
  9. The legislation to reauthorize the Higher Education Act, known as the PROSPER Act, has been introduced. It maintains the ban on a federal student unit record, but does allow for a feasibility study to investigate whether to expand the National Student Clearinghouse to set up a third-party data system to analyze student outcomes (via Inside Higher Ed). PoliticoPro reports: “Push for better student data runs into privacy worries in higher education bill.”
  10. “The Education Department isn’t doing enough to protect student data collected as part of financial aid programs, according to an audit from the Government Accountability Office,” via Politico.


The Top 10

  1. Former tech company employees are “forming a coalition to fight what they built.” Among other actions (including a livestreamed event on February 7th), Common Sense Media and the coalition plan “an anti-tech addiction lobbying effort and an ad campaign at [the] 55,000 public schools in the United States [that currently use the CSM Digital Citizenship Curriculum]… It will be aimed at educating students, parents and teachers about the dangers of technology, including the depression that can come from heavy use of social media” (via NYTimes). The focus on children in schools is because they “had little agency over whether they opted into or out of a technology platform because of pressure from both peers and educators handing out assignments.” In the meantime, there were several articles over the past two months on technology and children: The Wall Street Journal reported on “New guidance on children and technology makes the distinction between passive exposure and active play and learning with screens;” The Atlantic asked “Should Children Form Emotional Bonds With Robots?”; and a couple of Apple investors “called on the tech giant to take more steps to curb the ill effects of smartphones.” Axios reports that “The internet was created by adults for adults, but it has seen a sharp uptick in kid users over the past 10 years… Silicon Valley has bet its future on younger users, but has come under fire recently for building products that critics say aren’t safe for children.”
  2. Awesome new resource: the Utah State Board of Education has released a video that can be used to train administrators and educators on FERPA exceptions.
  3. Jim Siegl wrote a great blog about “Privacy Differences between Consumer Gmail and G Suite for Education,” and just released a Creative Commons-licensed document that districts can use to inform their community about the district’s G Suite for Education choices and provide information on the difference between Google’s consumer products and G Suite. In related news, some new features for G Suite for Education will “come with a fee.”
  4. Amazon Web Services released a whitepaper on “FERPA Compliance on AWS.”
  5. More news on privacy and personalized learning: EdWeek reports that “States [are taking] Steps to Fuel Personalized Learning,” and PoliticoPro notes that “A dozen states try out ‘personalized learning,’ but questions persist.” Inside Philanthropy asks “Is Personalized Learning the Next Big Thing in K-12 Philanthropy?” and Getting Smart discusses how “Chan Zuckerberg Backs Personalized Learning R&D Agenda” (read the blog on their approach to personalized learning). In the meantime, we have continued to see pushback against personalized learning over the past two months: Diane Ravitch, in an op-ed about “increasing misuse of technology in schools,” writes that “‘Personalized learning,’ …[is a] euphemism for computer adaptive instruction…parents want their children taught by a human being, not a computer.” The 74 shares “5 Ways to Talk (and Think) About Personalized Learning,” EdWeek reports that “Two Districts Roll Back Summit Personalized Learning Platform,” and A Long View on Education discusses “The Propaganda behind Personalised Learning.” But The Atlantic reports on “The Futile Resistance Against Classroom Tech,” noting that ed tech “is here to stay. It’s up to teachers, then, to build networks of learning, solidarity, mutual respect, and even trust.”
  6. Doug Levin has published the full results of his look at the security and privacy of state and (some) district public-facing websites. The primary reported finding is that “State and District Education Websites Fail to Disclose Ad Trackers” (see Doug’s Twitter thread of findings here and coverage from EdSurge). Another security researcher reported their findings that “Website operators are in the dark about privacy violations by third-party scripts,” including at least one ed tech provider described in the article.
  7. Common Sense Media announced that it will be releasing simpler “ratings” for ed tech apps on privacy in early 2018, based in large part on its previous evaluations. They explain more details about the new “roll-up score” apps will receive here. In the meantime, they released a blog on how “it’s often necessary to look in multiple places to find” privacy policies in order to evaluate software.
  8. A very interesting lawsuit: an autistic, nonverbal student wants “his school to let him wear a device that records his day, but the district says that would violate the privacy of other students.” In related news, a Virginia mom was “charged for putting recording device in daughter’s backpack to catch bullies.”
  9. Virginia has introduced a few bills after a progressive political group “filed FOIA requests [last fall] seeking the publicly available student directories to get student cell phone numbers at every one of Virginia’s 39 public colleges.” The first bill would simply “remove student cell phone numbers and email addresses from publicly available directories,” but was “scrapped” by the House committee (however, its counterpart in the Senate did pass). The second bill, which has passed the House, is much broader, and could cause unintended consequences in schools: it would change how FERPA’s directory information exception currently works to require an opt-in from students for sharing of their directory information instead of an opt-out (this article provides a good overview of the issue and bills). Virginia Lawyers Weekly advocates for a “scalpel” (the first bill) instead of a “sledgehammer” (the second bill).
  10. Audrey Watters, the “Cassandra of Ed Tech,” has published her end-of-year top ed-tech trends articles. I highly recommend reading them; while you might not agree with her perspective, her trend articles provide a fantastic overview of the major student privacy and ed tech stories from 2017. The most relevant to student privacy: “The Weaponization of Education Data,” “Robots Are Coming For Your Children,” and “Education Technology and the New Behaviorism.”

Image: “School days” by Rachel is licensed under CC BY-NC-SA 2.0.