Promoting Innovation and Protecting Privacy in the Classroom

Today, FPF announces the release of two new student privacy-related papers. They are:

Who Is Reading Whom Now: Privacy in Education from Books to MOOCs (Jules Polonetsky and Omer Tene, October 7, 2014; Vanderbilt Journal of Entertainment & Technology Law, forthcoming; available at SSRN: http://ssrn.com/abstract=2507044), and

Student Data: Trust, Transparency and the Role of Consent (Jules Polonetsky and Joseph Jerome, October 2014; available at http://fpf.org/wp-content/uploads/FPF_Education_Consent_StudentData_Oct2014.pdf).

The first paper, Who Is Reading Whom Now: Privacy in Education from Books to MOOCs, discusses how education technologies present tremendous opportunities to make education more personalized, accountable, collaborative and engaging through social media, gamification and interactive content, yet also engender privacy concerns by combining enhanced data collection with highly sensitive information about children and teens.

The paper seeks to separate the core ed tech privacy issues from the broader policy debates surrounding education standardization, the Common Core, longitudinal data systems and the role of business in education. It argues that decision makers should unpack the distinct policy issues surrounding ed tech to facilitate levelheaded discussion and appropriate policy responses. It further separates privacy problems related to “small data,” the personalization enabled by optimization solutions that “read students” even as they read their books, from concerns about “big data” analysis and measurement, including algorithmic biases, discreet discrimination, narrowcasting and chilling effects.

The paper proposes a broad range of solutions, from deploying traditional privacy tools, such as contractual and organizational governance mechanisms, to building data literacy among teachers and increasing parental involvement. It warns against knee-jerk reactions to ed tech data practices, which may unwittingly accentuate a growing equity gap between the “haves” and “have nots.” It advocates enhanced transparency to increase parental engagement in, and understanding of, the role of technology in their children’s education. The paper builds on the authors’ previous work on balancing big data rewards against privacy risks while complying with applicable regulation.

The second paper, Student Data: Trust, Transparency and the Role of Consent, discusses how over the past decade, new technologies in schools have generated an “explosion of data” for public school systems to use and analyze. Accordingly, the Department of Education has identified the use of student data systems to improve education as a top national priority even as an increased focus on data has raised legitimate privacy concerns.

With parents worried that student data is being used for marketing purposes without appropriate contractual and legal safeguards, a “notice and choice” regime has had intuitive appeal.  However, the paper shows that providing parents more notice and choice may do little to further student privacy while at the same time unintentionally excluding children from necessary education services.

The paper discusses the practical implications of consent requirements; explores how existing federal laws protect student data; and compares the activities of data vendors and the role of individual consent in the health and financial sectors. It argues that parents should not have to resign themselves to opting out of new technologies because of legitimate privacy concerns. The paper concludes that policymakers must devise better ways to inform parents about how their children’s data is used, and to provide students and parents with better tools to inform learning.

Other FPF efforts on student privacy include a resource website at studentprivacycompass.org and the launch of a Student Privacy Pledge with SIIA and leading tech companies. See www.studentprivacypledge.org to sign on to the pledge or for more details.

To learn more about FPF’s ongoing student privacy work, contact Brenda Leong, FPF Education Privacy Counsel at [email protected].

White House Office of Science and Technology Policy blog

We are pleased to see the White House Office of Science and Technology Policy take note of the Student Privacy Pledge. Read more at Promoting Innovation and Protecting Privacy in the Classroom, October 9, 2014.

K-12 Student Privacy Pledge Announced

Today the Future of Privacy Forum (FPF) and the Software & Information Industry Association (SIIA) issued the following press release. Additional FPF resources and publications on this topic are listed below the announcement.

Leading K-12 School Service Providers Announce Pledge

To Advance Student Data Privacy Protection

New Effort Builds on Legal Protections to Enhance Confidence in the Handling of Student Personal Information; Addresses Sales, Retention, Security, Profiling, Advertising and More

Washington, D.C. – October 7, 2014 – The Future of Privacy Forum (FPF) and the Software & Information Industry Association (SIIA) today announced a K-12 school service providers Pledge to safeguard student privacy built around a dozen commitments regarding the collection, maintenance, and use of student personal information.

An initial leadership group of major school service providers is joining SIIA and FPF to introduce and sign the Pledge. The group is made up of some of the leading names in education technology, including Amplify, Code.org, DreamBox Learning, Edmodo, Follett, Gaggle, Houghton Mifflin Harcourt, Knewton, Knovation, Lifetouch, Microsoft, MIND Research Institute, myON (a business unit of Capstone), and Think Through Math. SIIA and FPF will continue reaching out to educators, parent groups and companies over the next several months to promote the Pledge and garner participation from other school service providers.

“These commitments clearly and concisely articulate a set of expectations that parents and education officials have for the safeguarding of children’s sensitive data,” said Jules Polonetsky, executive director and co-chair, Future of Privacy Forum. “The Pledge will enhance the trust between families, schools and third party service providers necessary to support the safe and effective use of student information for student, teacher and school success.”

“We introduce this Pledge as a clear industry commitment to safeguard the privacy and security of all student personal information,” said Mark Schneiderman, senior director of education policy, Software & Information Industry Association. “Current law provides extensive restrictions on the use of student information, and this industry pledge will build on and detail that protection to promote even greater confidence in the appropriate use of student data.”

The commitments are intended to detail ongoing industry practices that meet and go beyond all federal requirements and to encourage service providers to more clearly articulate these practices to further ensure confidence in how they handle student data. The Pledge would apply to all student personal information whether or not it is part of an “educational record” as defined by federal law, and whether collected and controlled by the school but warehoused offsite by a service provider or collected directly through student use of a mobile app or website assigned by their teacher. It would apply to school service providers whether or not there is a formal contract with the school.

The Pledge will make clear that school service providers are accountable for the responsible collection, maintenance, and use of student personal information.

The Pledge was developed by the FPF and SIIA with guidance from the school service providers, educator organizations, and other stakeholders following a convening by U.S. Representatives Jared Polis (CO) and Luke Messer (IN).

“The potential of using student data to drive effective instruction and personalize education is promising,” said Rep. Polis. “While there can be tremendous benefits from this data, we must ensure that there are appropriate safeguards to protect student privacy. I am pleased that these companies have taken an important step in making a commitment to parents, educators, and communities. This voluntary pledge can help address parents’ legitimate concerns about privacy issues and help keep us on track towards new and exciting educational developments for all students.”

“I applaud these companies for their commitment to protecting the privacy of sensitive student information,” said Congressman Luke Messer.  “Their willingness to comply with these standards and not misuse student information shows they care about protecting student privacy while leveraging technology to unleash each child’s talents and enhance their ability to learn and succeed.”  Messer, who serves on the House Committee on Education and the Workforce, added that the principles  “will better inform the debate about what, if any, legislative remedy may be needed to ensure that child privacy is protected and this information is only used for academic purposes.”

“While technology is a powerful tool for teaching and learning, it is imperative that students’ personal information is protected at all times,” said Otha Thornton, president of National PTA. “National PTA applauds K-12 school service providers that have pledged to safeguard student data and privacy and effectively communicate with parents about how student information is used and protected. We look forward to even more support going forward.”

“This industry-led pledge to honor student data privacy is an important step in the right direction,” said Thomas J. Gentzel, executive director, National School Boards Association. “Those vendors who opt to take the pledge are demonstrating their public commitment to responsible data practices in a manner that will help support school boards’ efforts to safeguard student privacy.”

School service providers support schools – including their teachers, students and parents – to manage student data, carry out school operations, support instruction and learning opportunities, and develop and improve products/services intended for educational/school use. In so doing, it is critical that school service providers effectively communicate with parents, teachers and education officials about how student information is used and safeguarded.

About FPF

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board of leading figures from industry, academia, law and advocacy groups. For more information, visit fpf.org.

About SIIA

SIIA is the leading association representing the software and digital content industries. SIIA represents approximately 800 member companies worldwide that develop software and digital information content. SIIA provides global services in government relations, business development, corporate education and intellectual property protection to the leading companies that are setting the pace for the digital age. For more information, visit www.siia.net.

About National PTA

National PTA® comprises millions of families, students, teachers, administrators, and business and community leaders devoted to the educational success of children and the promotion of parent involvement in schools. PTA is a registered 501(c)(3) nonprofit association that prides itself on being a powerful voice for all children, a relevant resource for families and communities, and a strong advocate for public education. Membership in PTA is open to anyone who wants to be involved and make a difference for the education, health, and welfare of children and youth.

About NSBA

NSBA represents state school boards associations and their more than 90,000 local school board members. We believe education is a civil right, and public education is America’s most vital institution.

Media Contacts

FPF Nicholas Graham, 571-291-2967 [email protected]

SIIA Sabrina Eyob, 202-789-4480 [email protected]

 

Further FPF materials can be found at:

The Pledge and more information about how to support it are available at http://studentprivacypledge.org/.

studentprivacycompass.org: FERPA|SHERPA aims to provide service providers, parents, school officials, and policymakers with easy access to resources that help guide responsible uses of students’ data.

The Ethics of Student Privacy: Building Trust for Ed Tech

 

Do Beacons Track You? No, You Track Beacons

 

BuzzFeed News today reports that phone booths in NYC are tracking people and can send them ads.

Let’s explain this rapidly spreading new technology we often see described inaccurately.

First, let’s step back and understand how your phone, and the apps you grant permission to access your location, are able to determine where you are.

In general, there are a number of useful reasons for your phone to know its location. Location can help find a lost phone, for example, or let apps provide location-related services like maps, finding friends nearby or location-specific offers.

Apps can access your phone’s location services – with your permission. iPhone apps must use Apple’s location services, while Google Play apps generally use Google’s location services. These location services leverage GPS signals and the location of nearby cell towers to estimate where you are. In recent years, these location services also began noting the MAC addresses of nearby Wi-Fi networks, adding additional precision. These can be accurate to within 10-12 feet, depending on your proximity to Wi-Fi routers.
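For readers who want to see what this looks like from the app’s side, here is a minimal sketch using Apple’s CoreLocation framework (the class and method names are real; the example itself is ours). Whether a fix came from GPS, cell towers or Wi-Fi is invisible to the app; it simply receives a coordinate with an estimated accuracy radius:

```swift
import CoreLocation

class LocationWatcher: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization() // the user must grant permission first
        manager.startUpdatingLocation()
    }

    // The OS fuses GPS, cell-tower and Wi-Fi signals behind this single callback.
    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let fix = locations.last else { return }
        // horizontalAccuracy is the estimated error radius in meters.
        print("Lat \(fix.coordinate.latitude), lon \(fix.coordinate.longitude), ±\(fix.horizontalAccuracy) m")
    }
}
```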

What’s new is the advent of low-powered beacon technologies, where stores, museums, airports or other spaces set up a device that broadcasts a unique code. If your phone has Bluetooth turned on and you download an app and give it permission to use Bluetooth and location, the app can detect that beacon. If the app has a relationship with the beacon provider, the app can determine more precisely where you are. This enables features like an app displaying information about a certain museum painting when you are in front of that painting, or a retailer, restaurant or other app you have enabled sending precisely targeted offers tied to your exact location – in front of a display at a store, or in this case, near a specific phone booth. Such offers are possible ONLY if you have given that specific app permission to contact you via notifications, and your phone allows you to revoke that access at any time.
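To make the mechanics concrete, here is a simplified sketch of beacon “listening” with Apple’s CoreLocation APIs; the UUID and identifier are hypothetical placeholders for values a real beacon provider would publish:

```swift
import CoreLocation

class BeaconListener: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    // Hypothetical values: a real deployment's beacon provider supplies the UUID.
    private let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "F7826DA6-4FA2-4E98-8024-BC5B71E0893E")!,
        identifier: "example-venue")

    func start() {
        manager.delegate = self
        manager.requestAlwaysAuthorization() // beacon detection in the background needs "Always"
        manager.startRangingBeacons(in: region)
    }

    // The phone hears the beacon's broadcast; nothing travels back to the beacon.
    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        for beacon in beacons {
            print("Beacon \(beacon.major)/\(beacon.minor), proximity bucket: \(beacon.proximity.rawValue)")
        }
    }
}
```

Note the direction of the data flow in the code: all the detection logic runs on the phone, which is the point of this post.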

Beacons themselves don’t collect any data, and they do not send marketing messages to your phone. They simply broadcast location markers that your phone, and the apps on it, can use to understand more precisely where you are.

Unlike mobile location analytics technologies, beacons do not detect or collect MAC addresses. For more info about mobile location analytics, go to smart-places.org.

In short, beacons are not tracking your phone. In fact, your phone is tracking them.

We don’t suggest that beacons or other location technologies raise no privacy issues. Apps using location must do an effective job of explaining what they collect and why. Apps “listening” for Bluetooth beacons in the background require that you grant them location permission “always.” In iOS 8, users who allow apps to always have location will be reminded by the operating system and invited to turn off location or allow the app to continue collecting it. Apps collecting location of any sort should have policies for deleting the history of a user’s location. There is real work to be done in this area to ensure that location services are used in ways that benefit users and properly manage information. A good way to start is by understanding the technology.

 

Photo: Phil Roeder / Flickr

Facebook Moves Forward with an Ethics Review Panel

Jules and Omer opine in The Hill on today’s announcement from Facebook, commenting on the “essential accountability mechanism for companies that are struggling to balance the risks and opportunities of big data.”

The Hill

“Facebook’s announcement — establishing guidelines, review processes, training and enhanced transparency for research projects — marks another milestone in the emergence of data ethics as a crucial component of corporate governance programs.

With the proliferation of personal data generated from smartphones, apps, social networks and ubiquitous sensors, companies have come under increasing pressure to put in place internal institutional review processes more befitting of academic philosophy departments than corporate boardrooms.”  Continue at:

http://thehill.com/blogs/pundits-blog/technology/219620-facebook-calls-in-the-philosophers

CDT's "Always On" – the Digital Student

FPF attended the “Always On – Digital Student” forum hosted by the Center for Democracy & Technology and the Data Quality Campaign on September 24, 2014. 

Pictured: Nuala O’Connor, President and CEO of CDT; Aimee Guidera, Founder and Executive Director, DQC; and Brenda Leong, FPF Education Policy Fellow.

The Digital Student Challenge

Jon Phillips of Dell opened the session with a challenge: we need to change the learning process as well as how we measure the learning that is occurring. He emphasized the need for students to be more active players in their own learning journey, with ownership of the goals and the process to achieve them. This is a growing refrain in privacy circles, recently addressed by FPF’s parent blogger, Olga Garcia-Kaplan.

Phillips emphasized four aspects of good digital design:

– helping the teacher and student orchestrate the learning environment

– creating a collaborative environment for meaningful learning to occur

– maintaining focus on the critical value of experiential learning

– applying assistive or adaptive tools so that technology enables learning for all students

Phillips’ key point is that true personalization should be the central tenet, while allowing students a personal voice and self-direction so they understand the relevance of the learning process for their own goals. He sees personalization as resting on digital literacy, data security, and integration. He also pointed out that in such a model, even with an increasing role for technology, the teacher’s impact remains dramatically important for success. Because teachers can be overwhelmed by technology opportunities, we should help deliver the relevant and effective tools they can use.

Phillips concluded with the challenge to create the freedom for teachers to take risks, and for those risks to be rewarded, and to find ways to bring students into the technology conversation and process.

Principles and Ethics

Following Phillips’ presentation, small groups of attendees addressed a variety of questions facing the edtech community, which were then pulled into a larger discussion with a panel of privacy experts. These addressed many of the key issues that FPF is currently working on and publicizing. First, participants agreed that there are many “basics” that most schools, companies, and parents agree on for privacy and data security protections: much of what schools have always done (grading work, taking attendance, disciplining students) is now digitized, and thus maintained, evaluated, and aggregated in ways not historically possible.

Most attendees agreed that after this “low hanging fruit” is addressed, the tougher ethical questions raised by previously inconceivable uses of data must be confronted. The role of parents, the possibility of certification processes for specific education apps, and the need to clarify what a student’s expectation of privacy really is, or should be, were all explored.

The panel also agreed that part of the way forward is to clarify the conversation: separating out not only “privacy” from “security” as individual requirements and challenges, but also privacy-specific concerns from the broader curriculum and educational reform movements. FPF recently published on the ethics of student privacy and the trust required between parents and schools.

Importantly, as pointed out by panelist Shannon Sevier, President of the National PTA, parents must be included in the conversation.  Since most parents will not be technology experts, much less privacy savvy, communication must include the what/why/how of data collection and use by schools, as well as clear descriptions of the benefits, rather than the rarer (but media-attractive) scare stories. A blended approach to classroom and homework is an important part of changing the way learning occurs, but safeguards are needed to ensure equity of access to the various technology portals, something not yet available for all students.

All agreed that today, use of technology is growing and changing faster than best practices and communication have adapted, so it is incumbent on companies and privacy advocates to help families and policymakers catch up, with outreach and programming that helps make clear what the existing rules are, and guides the conversation toward future enforceable standards.

At the end of the day, many voices agreed that citizenship, while moving into a digital age, is still a question of values and judgment and learned behaviors. Students must be taught their “digital citizenship” with its associated rights and responsibilities, just as they have been in more traditional forms in the past.  FPF continues its work with a group of education and industry stakeholders.  If you are interested in participating, contact us at [email protected].

Transparency About What Data is Used is Key for Parents and Students

Last week, the New York Times hosted a Room for Debate on Protecting Student Privacy in On-Line Learning. The question: “Is the collection of data from schools an invasion of students’ privacy?” Jules weighed in, making the key point that schools must continuously and effectively communicate with parents about what technology is being used, what data is being collected, and how that information is safeguarded. FERPA|SHERPA parent blogger Olga Garcia-Kaplan also participated, emphasizing that the focus must stay on the education and needs of individual students.

Thoughts on the Data Innovation Pledge

Yesterday, as he accepted the IAPP Privacy Vanguard award, Intel’s David Hoffman made a “data innovation pledge” that he would work only to promote ethical and innovative uses of data. As someone who only relatively recently entered the privacy world by diving headfirst into the sea of challenges surrounding big data, I think an affirmative pledge of the sort David is proposing is a great idea.

While pledges can be accused of being mere rhetorical flourishes, words do matter. A simple pledge can communicate a good deal and engage the public in a way that drives the conversation forward. Think of Google’s early motto, “Don’t be evil.” For years this commitment fueled a large reservoir of trust in the company. Every new product and service that Google releases is viewed through the lens of whether or not it’s “evil.” That places a high standard on the folks at Google, and for that, we should be pleased.

Of course, pledges present obligations and challenges. Using data only for good presents a host of new questions. As FPF explored in our whitepaper on benefit-risk analysis for Big Data, there are different aspects to consider when evaluating the benefits of data use, and some of these factors are largely subjective. Ethics is a broad field, and it also exposes the challenging philosophical underpinnings of privacy.

The very concept of privacy has always been a philosophical conundrum, but so much of the rise of the privacy profession has focused on compliance issues and the day-to-day reality of data protection. But today, we’re swimming in a sea of data, and all of this information makes us more and more transparent to governments, industry, and each other. It’s the perfect catalyst to consider what the value of “privacy” truly is. Privacy as an information on/off switch may be untenable, but privacy as a broader ethical code makes a lot of sense.

There are models to learn from. As David points out, other professions are bound by ethical codes, and much of that seeps into how we think about privacy. Doctors not only pledge to do no harm, but also to keep our confidences about our most embarrassing or serious health concerns. Questionable practices around innovation and data in the medical field led to review boards to protect patients and human test subjects, and reaffirmed the duty of every medical professional to do no harm.

Similar efforts are needed today as everything from our wristwatches to our cars is “datafied.” In particular, I think about all the debates that have swirled around the use of technology in the classroom in recent years. A data innovation pledge could help reassure worried parents. If Monday’s FTC workshop is any indication, similar ethical conversations may even be needed for everyday marketing and advertising.

The fact is that a host of different data uses could benefit from greater public confidence. A data innovation pledge is a good start. There is no question that companies need to do more to show the public how they are promoting innovative and ethical uses of data. Getting that balance right is tough, but here’s to privacy professionals helping to lead the effort!

-Joseph Jerome, Policy Counsel

FTC Wants Tools to Increase Transparency and Trust in Big Data

However we want to define “Big Data” – and the FTC’s latest workshop on the subject suggests a consensus definition remains elusive – the path forward seems to call for more transparency and the establishment of firmer frameworks on the use of data. As Chairwoman Ramirez suggested in her opening remarks, Big Data calls for a serious conversation about “industry’s ethical obligations as stewards of information detailing nearly every facet of consumers’ lives.”

Part of the challenge is that some Big Data uses can be “discriminatory.” Highlighting findings from his paper on Big Data and discrimination, Solon Barocas began the workshop by noting that the whole point of data mining is to differentiate and to draw distinctions. In effect, Big Data is a rational form of discrimination, driven by apparent statistical relationships rather than any capriciousness. When humans introduce unintentional biases into the data, there is no ready solution at a technical or legal level. Barocas called for lawyers and policymakers to have a conversation with the technologists and computer scientists working directly with data analytics – a sentiment echoed when panelists realized a predictive analytics conference was going on simultaneously across town.

But the key takeaway from the workshop wasn’t that Big Data could be used as a tool to exclude or include. Everyone in the civil rights community agreed that data could be a good thing, and a number of examples were put forth to suggest once more that data has the potential to be used for good or for ill. Pam Dixon of the World Privacy Forum observed that classifying individuals creates a “data paradox,” where the same data can be used to help or to harm that individual. For our part, FPF released a report alongside the Anti-Defamation League detailing Big Data’s ability to combat discrimination. Instead, there was considerable desire to understand more about industry’s approach to big data: FTC staff not only asked for more positive examples of big data use by the private sector, but also inquired what degree of transparency would help policy makers understand Big Data decision-making.

FTC Chief Technologist Latanya Sweeney followed up her earlier study (which suggested that web searches for African-American names were more likely than searches for white-sounding names to return ads suggesting the person had an arrest record) with a look at credit card advertising and website demographics. Sweeney presented evidence that advertisements for harshly criticized credit cards were often directed to the homepage of Omega Psi Phi, a popular black fraternity.

danah boyd observed that there is a general lack of transparency about how Big Data is being used within industry, for a variety of complex reasons. FTC staff and Kristin Amerling of the Senate Commerce Committee singled out the opacity surrounding the practices of data brokers when describing some of the obstacles policy makers face in trying to understand how Big Data is being used.

Moreover, while consumers and policy makers are trying to grapple with what companies are doing with their streams of data, industry is also placed in the difficult position of making huge decisions about how that data can be used. For example, boyd cited the challenges JPMorgan Chase faces when using analytics to identify human trafficking. She applauded the positive work the company was doing, but noted that expecting it to have the ability or expertise to effectively intervene in trafficking perhaps asks too much. The company doesn’t know when to intervene, or whether to contact law enforcement or social services.

These questions are outside the scope of their expertise, but even general use of Big Data can prove challenging for companies. “A lot of the big names are trying their best, but they don’t always know what the best practices should be,” she concluded.

FTC Commissioner Brill explained that she supports a legislative approach to increase transparency and accountability among data brokers, their data sources, and their consumers in order to help consumers and policy makers “begin to understand how these profiles are being used in fact, and whether and under what circumstances they are harming vulnerable populations.” In the meantime, she encouraged industry to take more proactive steps. Specifically, she recommended again that data brokers explore how their clients are using their information, take steps to prevent any inappropriate uses, and further inform the public. “Companies can begin this work now, and provide all of us with greater insight into – and greater assurances about – their models,” she concluded.

A number of legal regimes may already apply to Big Data, however. Laws that govern the provision of credit, housing, and employment will likely play a role in the Big Data ecosystem. Carol Miaskoff of the Equal Employment Opportunity Commission suggested there is real potential for Big Data to gather information about successful employees and use it to screen people for employment in a way that exacerbates prejudices built into the data. Emphasizing his recent white paper, Peter Swire suggested there are analogies to be made between sectoral regulation in privacy and sectoral legislation in anti-discrimination law. With existing laws in place, he argued that it was past time to “go do the research and see what those laws cover” in the context of Big Data.

“Data is the economic lubricant of the economy,” the Better Business Bureau’s C. Lee Peeler argued, and he supported the FTC’s continued efforts to explore the subject of Big Data. He cited earlier efforts by the Commission to examine inner-city marketing practices, which produced a number of best practices still valid today. He encouraged the FTC to look at what companies are doing with Big Data on a self-regulatory basis as a foundation for developing workable solutions to potential problems.

So what is the path forward? Because Big Data is, in the words of Promontory’s Michael Spadea, a nascent industry, there is a very real need for guidelines not just on how to evaluate the risks and benefits of Big Data but also on how to understand what is ethically appropriate for business. Chris Wolf highlighted FPF’s recent Data-Benefit Analysis and suggested companies are already engaged in detailed analysis of their uses of Big Data, though everyone recognized that business practices and trade secrets preclude making much of this public.

FTC staff noted there is a “transparency hurdle” to get over in Big Data. Recognizing that “dumping tons of information” onto consumers would be unhelpful, staff picked up on Swire’s suggestion that industry needs some mechanism to justify what is going on to either regulators or self-regulatory bodies. Spadea argued that “the answer isn’t more transparency, but better transparency.” The Electronic Frontier Foundation’s Jeremy Gillula recognized the challenge companies face in revealing their “secret sauce,” but encouraged them to look for more ways to give consumers more general information about what is going on. Otherwise, he recommended, consumers ought to collect big data on big data and turn data analysis back on data brokers and industry at large through open-source efforts.

At the same time, Institutional Review Boards, which are used in human subject research, were again proposed as a model for how companies can begin affirmatively working through these problems. Citing a KPMG report, Chris Wolf insisted that strong governance regimes, including “a strong ethical code, along with process, training, people, and metrics,” are essential to confront the many ethical and philosophical challenges that ran through the day’s discussions.

Jessica Rich, Director of the FTC’s Bureau of Consumer Protection, cautioned that the FTC would be watching; in the meantime, industry is on notice. The need for clearer data governance frameworks is clear, and careful consideration of Big Data projects should be both reflexive and something every industry privacy professional talks about.



iOS 8 and Privacy: Major New Privacy Features

iOS 8 includes several new privacy features founded on Apple’s core privacy principles of consent, choice, and transparency. With these principles in mind, Apple created increasingly granular controls for location, new opportunities for developers to communicate to users how and why they use data, and limits on how third parties can track your device.

Users now have greater visibility regarding application access to location information.

In previous versions of iOS, apps could prompt users for permission to use Location Services, and, once a user gave an app access, the app could access the user’s location any time it was running, including when the app was not on screen (i.e., in the background). In iOS 8, Location Services has two modes: “While Using the App,” whereby the app can access location only when it is on screen or made visible to the user by iOS turning the status bar blue, or “Always.” Apps have to decide which Location Services mode to request, and Apple encourages them to request “Always” permission only when users would “thank them for doing so.” In fact, iOS 8 will at times present a reminder notice to users if an app with “Always” permission uses Location Services while the app is not on screen.
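In developer terms, the two modes map to two distinct authorization requests introduced with iOS 8, each gated on a “purpose string” in the app’s Info.plist. A brief sketch, using current Swift names for these iOS 8-era APIs:

```swift
import CoreLocation

let locationManager = CLLocationManager()

// Requires NSLocationWhenInUseUsageDescription in Info.plist; location is
// available only while the app is on screen (or flagged with the blue status bar).
locationManager.requestWhenInUseAuthorization()

// Requires NSLocationAlwaysUsageDescription (the iOS 8-era key); Apple advises
// requesting this only when users would "thank them for doing so."
// locationManager.requestAlwaysAuthorization()
```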

Users will be able to limit access to their contacts.

In iOS 8, users can rely on a picker, controlled and mediated by iOS, to share a specific contact with an app without giving the app access to their entire address book.

Apps will be able to link directly to privacy settings.

With iOS 8, apps will be able to link directly to their settings, including their privacy settings, making it easier for users to control their privacy. Before, apps could only give instructions on how to go to the phone’s settings to change the privacy controls. This new feature makes control over privacy settings more accessible to users.
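The deep link is a single system-provided URL. A minimal sketch, using modern Swift names for what iOS 8 introduced as UIApplicationOpenSettingsURLString and openURL::

```swift
import UIKit

// Opens this app's own page in the Settings app, where its privacy toggles live.
if let settingsURL = URL(string: UIApplication.openSettingsURLString) {
    UIApplication.shared.open(settingsURL)
}
```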

Apple’s new Health app implements additional protections for users’ health data.

Apple’s new Health app and HealthKit APIs give third party health and fitness apps a secure location to store their data and give users an easy-to-read dashboard for their health and fitness data. Apple has implemented a number of features and safeguards to protect user privacy. First, a user has full control as to which apps can input data into Health and which apps can access Health data. Second, all Health data on the iOS device is encrypted with keys protected by a user’s passcode. Finally, developers are required to obtain express user consent before sharing Health data with third parties, and even then they may only do so for the limited purpose of providing health or fitness services to the user. These features and restrictions allow users to have control over their HealthKit data.
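As a rough sketch of that consent flow (step count is a stand-in for whatever types a real app would request; the permission sheet itself is presented by iOS, not the app):

```swift
import HealthKit

let healthStore = HKHealthStore()

// A stand-in type; a real app lists exactly the data types it needs.
guard let stepType = HKObjectType.quantityType(forIdentifier: .stepCount) else {
    fatalError("step count type unavailable")
}

// iOS shows the permission sheet; the user decides per type, and denials
// of read access are not even revealed to the app.
healthStore.requestAuthorization(toShare: [stepType], read: [stepType]) { granted, _ in
    // granted reflects whether the request completed; proceed using only
    // the access the user actually allowed.
}
```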

Apple requires apps accessing sensitive data to have a privacy policy disclosing their practices to users.

Apple requires apps that utilize the HealthKit or HomeKit APIs, offer third-party keyboards, or target kids to have a privacy policy, supporting industry standards and California law. App privacy policies should include what data is collected, what the app plans to do with that data, and, if the app plans to share it with any third parties, who they are. Users will be able to see the privacy policy in the App Store before and after downloading an app.

iOS 8 places additional emphasis on disclosure of why developers want access to data.

Apple strongly encourages developers to explain why their apps request a user’s data or location when a user is prompted to give an app access. Developers can do so in “purpose strings,” which are part of the notice that appears when an app first tries to access a protected data class.

Apple’s iOS encourages a “just in time” model, where users should be prompted for access after they take an action in an app that requires the data.  The “just in time” prompt and access flow is mediated by iOS and replaces consent models such as those consisting of strings of permissions that pop up after installation like a conga line or users having to give an app access to all data if they want to use an app. Moreover, Apple continues its practice of encouraging app developers to only ask for access to data when needed, and to gracefully handle not getting permission to access a user’s data.
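A sketch of the “just in time” pattern, using the Photos framework (also new in iOS 8) as an illustrative example; the permission request fires only when the user taps a feature that needs the data, and the app degrades gracefully on denial:

```swift
import Photos

// Called when the user taps "Attach Photo", not at app launch.
func attachPhotoTapped() {
    PHPhotoLibrary.requestAuthorization { status in
        if status == .authorized {
            // Present the photo picker.
        } else {
            // Handle denial gracefully: explain the limitation and
            // offer a path to Settings rather than failing silently.
        }
    }
}
```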

MAC address randomization makes it more difficult to track and individualize iOS devices.

Wi-Fi enabled devices generally scan for available wireless networks. These scans include the device Media Access Control (MAC) address, a 12-character string of letters and numbers required by networking standards to identify a device on a network and assigned by the manufacturer. Mobile Location Analytics companies have, at times, relied on these scans, and the fact that Wi-Fi devices’ MAC addresses do not change, to track individual mobile devices as they move around a venue.

In iOS 8, Apple devices will generate and use random MAC addresses to passively scan for networks, shielding users’ true MAC addresses until a user decides to associate with a specific network. Randomizing MAC addresses makes this kind of tracking much more difficult. However, your device can still be tracked when you are connected to a Wi-Fi network or using Bluetooth.  FPF’s Mobile Location Analytics Code of Conduct governs the practices of the leading location analytics companies and provides an opt-out from mobile location tracking. Visit Smart-Places for more details or to opt-out.

Summary

iOS 8’s new “prompting with purpose” disclosures, refined location settings, strict requirements for HealthKit, HomeKit, and kids apps, and MAC address randomization will provide greater transparency, protection, and control over privacy for iOS users.