Privacy and the Connected Vehicle: A Global Event, March 9 in Detroit

The Future of Privacy Forum and EY are hosting an event to advance the conversation around the management and use of personal information in the vehicle ecosystem. We will have a half day of panel discussions led by our team of privacy professionals and colleagues from the privacy and automotive space in the US and EU. If you work on automotive privacy, security, or compliance management in the connected car ecosystem, contact Lauren Smith at [email protected] to request an invitation.

Wednesday, March 9, 2016 from 8:00 AM to 12:00 PM (EST) 

Detroit, Michigan

The program will include:

Welcome Remarks

Panel 1: Legal and Self-Regulatory Standards for Automotive Privacy

This panel discusses steps companies are taking to adopt the Auto Alliance and Global Automakers Consumer Privacy Protection Principles, data issues in the automotive ecosystem, international considerations and practices, and the privacy impact of emerging technologies such as V2V and V2I.

Panel 2: Practical Considerations and Compliance

This panel discusses managing privacy in a distributed ecosystem, critical issues in managing the supply chain, responsibility for good privacy practices, and ensuring the Privacy Principles are integrated into new technologies.

Panel 3: Communications and Policy

This panel discusses managing consumer communications around privacy, security, and new features, the challenges of ensuring policymakers understand the complexity of data flows, legislative actions in the short term, and critical questions about autonomous technologies, ethics and privacy considerations.

This program is invitation-only and closed to press.

Important logistical information:

Time: Registration opens at 7:30 AM. Breakfast refreshments will be served. The program begins promptly at 8:00 AM. Lunch will be provided following closing remarks.

This live event will be available for videoconferencing in select EY offices throughout Europe: Sweden, Germany, France, and the UK.

Contact [email protected] to request an invitation.

The FBI and the iPhone in Your Pocket

Consider the data on your iPhone for a moment: emails, pictures, passwords, credit cards, location history, contacts, and more. Now imagine your phone unlocked in the hands of a criminal who snatched it, someone who wants to embarrass you and peeks at it, or a hacker who remotely accesses it.

Today, if you have a good password protecting your phone, none of this is easy to do. Encryption ensures that without a password your data is locked up, even if your phone is taken apart or attacked while it is booting up. Rate limiting ensures that it isn’t possible to brute force the phone by entering thousands of passwords: a small delay required between password attempts means thousands of attempts will take a very long time. Another security feature, if enabled, will delete all the data on an iPhone after 10 failed password attempts. With iCloud Activation Lock, as a deterrent for thieves, if the iPhone is remotely wiped and locked then the original owner’s username and password are required to reactivate the phone for use with a wireless carrier. Without these credentials the iPhone remains encrypted and locked, preventing anyone from using it. The iPhone is so secure that even Apple has no way to get into the phone, even if you bring it in to visit a Genius at a local Apple store or send it back to Apple headquarters.
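
To make the arithmetic behind this concrete, here is a minimal sketch in Python of why escalating delays between failed attempts, combined with a ten-attempt wipe threshold, make guessing a passcode impractical. The delay schedule and function names below are purely illustrative assumptions, not Apple’s actual parameters.

```python
# Illustrative sketch: why per-attempt delays plus a wipe threshold defeat
# brute-force passcode guessing. The delay schedule is hypothetical.

# Delay (seconds) imposed once the number of consecutive failures reaches N.
DELAY_SCHEDULE = {5: 60, 6: 5 * 60, 7: 15 * 60, 9: 60 * 60}

def delay_after(failures):
    """Delay imposed after a given number of consecutive failures (hypothetical values)."""
    applicable = [d for n, d in DELAY_SCHEDULE.items() if failures >= n]
    return max(applicable) if applicable else 0

def time_to_try(num_guesses, wipe_after=10):
    """Total seconds needed to make num_guesses attempts, or None if the device
    would erase its data first (when the wipe feature is enabled)."""
    total = 0.0
    for attempt in range(1, num_guesses + 1):
        if wipe_after is not None and attempt > wipe_after:
            return None  # data already erased; no further guesses are possible
        total += delay_after(attempt)
    return total

if __name__ == "__main__":
    # Without the wipe feature, exhausting all 10,000 four-digit passcodes
    # still takes over a year because of the escalating delays.
    seconds = time_to_try(10_000, wipe_after=None)
    print(f"All 4-digit codes, delays only: ~{seconds / 86_400:.0f} days")
    # With the wipe feature enabled, an attacker gets at most 10 guesses.
    print("With wipe enabled:", time_to_try(10_000))  # -> None
```

Even with this modest, made-up delay schedule, exhausting every four-digit passcode takes on the order of a year; with the wipe feature enabled, an attacker never gets more than ten guesses at all.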

These protections ensure that your phone is no longer an easy target for thieves. iPhone thefts have plummeted by as much as 50% in some cities since these features were introduced.

Unfortunately, the FBI now wants to put the safety of all iPhone users at risk. The FBI wants access to the iPhone of one of the San Bernardino killers. Phones used by the killers were destroyed, but a phone reportedly provided by an employer is in the hands of the FBI, locked. The FBI has the iPhone backups stored with Apple through late October, but it hasn’t been able to break into the iPhone itself to determine whether any clues are stored on the device. The FBI wants Apple to devise a method of breaking into iPhones that can be used here.

But if Apple does so, this method will be used by others. Once Apple creates a bypass, the methodology will be analyzed, studied, and exploited, by sophisticated criminals at first and then by others. There may or may not be any useful clues on the iPhone of the San Bernardino killers, but it is certain that many criminals in the future will find data they want on consumers’ iPhones if an exploit is created to bypass passwords, encryption, and other protections.

Of course, there are many other negative implications if iPhone security is circumvented. Repressive governments will seek to force Apple to unlock iPhones as they investigate and persecute those who oppose them. As Future of Privacy Forum Senior Fellow Peter Swire explained, “I wish there was some magic way to break security for exactly one phone, without breaking security for millions of smartphones. There isn’t.” A former White House Deputy CTO put it clearly, noting that once those back doors are there for the FBI, “all of our private communications become much more vulnerable to attack by malicious criminals and terrorists.”

The FBI needs every tool it can get to investigate terror attacks and to prevent future ones. But forcing Apple to create a tool that risks the security and privacy of every iPhone in the world is asking for a tool that will cause more harm than it prevents.

Jules Polonetsky is Executive Director of the Future of Privacy Forum.

Google Responds to Sen. Franken

Google provided a response this week to Senator Franken’s request for information on its policies and practices with regard to its Google Apps for Education (GAFE) suite of services. Ed Week reviewed Google’s letter and asked FPF to comment on the response.

Full article here.

ACLU, Tenth Amendment Center Join Forces on Data Privacy

“In consultation with the center—a think tank that advocates strict limits on federal power—the ACLU wrote model legislation that both organizations are urging legislators around the country to support. …

“The Future of Privacy Forum—a Washington-based think tank and a co-author of the Student Privacy Pledge, a commitment by ed-tech companies to safeguard data—offered a measured endorsement of the provisions in the ACLU’s model bill.

“The forum applauded the model legislation’s language on parental-release mechanisms and its calls for teacher professional development on basic data-privacy issues, but is worried that an overly strict definition of “personally identifiable information” and the risk of personal legal liability for teachers who make mistakes could undermine both the ed-tech industry and the work of classroom educators.”

Read full article here.

FPF Supports Connect Safely for Safer Internet Day

Our friends at Connect Safely have put together an amazing program to support student awareness and education as part of Safer Internet Day on Tuesday, February 9, 2016. Below is their writeup. We invite you all to participate and spread the word, particularly to the students in your life and their educational institutions. This is too good to miss!

They’ve built a stellar agenda for the 300+ students attending Safer Internet Day at Universal Studios Hollywood, and fortunately, those of us who can’t make it in person can watch the live stream. Along with a professional video crew, the LA Unified School District has arranged for student journalists to use Periscope as roving reporters to live stream the offstage action, including the students working in small groups to answer tough questions, interacting with the exhibits, and walking the red carpet.

There is an exciting lineup of great speakers assembled with help from the Yale Center for Emotional Intelligence and Facebook. Young panelists from Beyond Differences, #ICANHELP, and inspirED will discuss the topic “Rejecting Hate, Building Resilience & Growing the Good Online,” moderated by college filmmaker, Instagram personality, and transgender activist Leo Sheng.

Professional wrestler and reality TV star Mike “The Miz” Mizanin will speak about how it’s possible to play a bad guy on TV but be a gentle soul in real life, drawing on his own experiences with bullying.

LA Media personality Tshaka Armstrong, a parent and founder of Digital Shepards, will inspire the kids to take responsibility for making the world a better place.

Connect Safely’s newest team member, K-12 education director Kerry Gallagher, will lead the students as they break into small groups with the help of distinguished coaches and judges who will pick the top student proposals for special recognition.

Finally, Connect Safely brings in Zoë Quinn, founder of Crash Override, who works hard to fight the online harassment that she herself faced as “patient zero of Gamergate.”

Join the live stream and Periscope webcasts, and when you do, be part of the discussion by tweeting with #SIDUS16, #SID2016, and #SaferInternetDay.

The link for the live stream:  http://saferinternetday.us/livestream/

Congratulations and thanks to Connect Safely for their excellent work to keep children and students safe online. See you Tuesday!

SXSW Edu: FPF and Colleagues Take on "Trust" Question

I am heading to SXSW Edu to join US Department of Education Chief Privacy Officer Kathleen Styles, Common Sense Media’s Bill Fitzgerald, Data Quality Campaign’s Aimee Guidera, and several other distinguished education and privacy speakers for this session on student data privacy. Sponsored by CoSN, this is not just a panel but a full three-hour session to dig into the top issues, questions, and challenges of how to maximize the benefits of ed tech for our school children while protecting their data, their interests, and their future. That can only happen if we can find a way to inspire and maintain trust among all the players. As advertised, this session will cover:

“Concerns around the privacy of student data have been rising and educators are increasingly on the defensive as parents believe too much data is collected about their children, it is not secure and too often inappropriately used for commercial gain. Trust is at the heart of the privacy debate. Last year the Student Data Principles were endorsed by 40 education associations, and over 200 companies have signed a Student Privacy Pledge. Learn how we can build trust through better educator training, improved review of education apps, clearer vendor agreements and building a “seal” validating trusted learning environments. Participate in an interactive forum on what else is needed.”
The key word there is “participate”! This session is about listening, learning, and considering all points of view, all ideas, and all suggestions. If you can’t be there, reach out to one of the speakers in advance with your thoughts and inputs. We’ll take it all!
Here are some of my thoughts on the background for the discussion I hope we’ll be having:
At the point of impact, there are two primary actors: the school and the vendor. Schools and vendors generally want the same thing, good outcomes for students, and the DQC-sponsored Student Data Principles (values from the educator perspective) and the FPF/SIIA-led Student Privacy Pledge (commitments by ed tech companies) reflect that priority. Ideally, they work in partnership: the vendor provides a useful product or service, which then allows the school to offer strong educational opportunities. But as with any relationship, “it’s complicated.”
Which vendors should schools choose to incorporate, and how do they know whom to trust? Screening platforms, seals, pledges: all of these and more are available, but from the school’s perspective there is no silver bullet, no easy way through the wickets of writing a contract or selecting an app while being completely sure you’re doing the right thing.
Even before that point, though, schools are making choices. What counts as a vendor, and which products should schools be able to incorporate: an app the teacher tells everyone to download, or only an account the school requires the student to create? What role should school IT offices play, and what about the many, many districts where there simply aren’t resources for this type of oversight? What is the burden on teachers? Let’s try not to conclude they have to be IT specialists now too!
Even when schools and vendors work together efficiently toward a productive result, their job isn’t done. They operate as the agents, but the true “stakeholders” are the parents and students for whom this process takes place. They must be made aware of the decisions being made, the controls in place for them to exercise, and their own responsibilities.
Of course, we can never skip over the legal implications: some laws put the burden on schools, some on vendors, and in either case, how do we control the risks without limiting the ability of either to function effectively? New laws grow out of the public debate about how much limitation on student data is the right amount, who should have access and when, and what controls parents should have.
I’m sure you’ve noticed more questions than answers here, and I know, despite my immense respect for my co-speakers, that we will not solve them all in one session! But if trust is part of the answer, and I think we all believe it must be, then we have to start somewhere, and this session is a “next step” in this ongoing process to implement ed tech smartly, responsibly, and thoughtfully, to minimize risk and prevent harm so we can enjoy the tremendous benefits data and technology offer to our schools and our students.
Hope to see you there!