As a privacy expert, if my local health department develops a mobile app for people with a COVID diagnosis to alert anyone they were near, will I use it?
Yes, I will. And I will urge friends, neighbors and colleagues to download such an app. I have an immuno-compromised family member in my household. I am also lucky to live one block from my senior citizen in-laws. If a health department app can inform me that I am possibly at risk, I can take measures to keep them safe from me. I want that app to be built with privacy protections in place, collecting only the data needed and deleting it as soon as possible. Today, Apple and Google launched new capabilities for health department apps, with strict technical privacy restrictions that let these apps scan for nearby devices while deleting the data they collect within 30 days.
In my home state of Maryland, Governor Hogan is seeking to quadruple the current staffing to 1,000 state employees and outside contractors supporting manual contact tracing, but hiring and training will take time. Contact tracing relies on interviewing people about who they may have come into contact with recently and then painstakingly finding the contact information needed to reach every one of those potentially exposed individuals. It also relies on people accurately remembering all of their interactions. Can you remember the people you stood next to in the long line at the grocery store last week?
Should my health department offer an app to supplement this process? I hope they will look closely at the way apps have been used by health departments for exposure notification around the world and decide whether it would be a useful supplement to the human contact tracing effort they are setting up.
In an ideal world, we would have a national response that deployed hundreds of thousands of human contact tracers, so that use of an app would be a very minor supplemental option. Exposure notification apps would be tested for efficacy in a carefully controlled study. The CDC would be working with the WHO to advise based on the results of studies of the app efforts in Singapore, Israel, Hong Kong, South Korea and elsewhere. We might learn whether they are helpful and what data they need. Do health department apps need precise location, despite the risks of revealing the private activities of individuals? Can the apps rely solely on information from Bluetooth about proximity to nearby phones and still be effective? Are the apps effective if they are voluntary and work in a decentralized manner? What is the risk of abuse of data collected in countries without strong data protection legislation or countries with dangerous human rights records? But we do not live in a perfect world, and timely preventive measures can save lives today.
I realize that the data may be imprecise, untested, imperfect. I will look to my reasonably competent health department for guidance. I realize I am privileged in this regard. If I get an alert, I can work from home and be paid. I can err on the side of caution. Many cannot. I realize that not everyone has a smartphone, so this is not a service that all can benefit from, but it is one of the most widely adopted technologies in the world. I hope we can find ways to ensure everyone can have access and that we can address economic and racial disparities.
I vote, donate and actively campaign for candidates who I hope will work to make society more just. I have served in government at the city, state and federal level and have been elected to office and have been appointed to office. But in an imperfect world and during an emergency, we all need to make the most ethical decision with the facts at hand. Relying on such apps is in my view a potentially helpful supplemental safety measure that fills a gap created by the current challenges.
Let’s turn to what Apple and Google should be doing to support local health departments. First, let’s note that Apple and Google didn’t invent the idea of using a phone for exposure notification or contact tracing during this pandemic. Health departments in countries that moved quickly to respond to the outbreak commissioned apps that used mobile phone location services, and sometimes Bluetooth capabilities, and promoted them to their local populations. But it turns out that, due to privacy settings and power limitations, mobile phones aren’t the most effective tool for the highly precise information collection needed for tracing. These privacy protections have been baked deep into the devices’ operating systems, the result of years of work to prevent misuse by human-rights-abusing governments, stalkers and criminals, and by advertisers and marketers.
- The first protection is one that everyone knows about – you have to give apps permission to access your location. But over time, apps found ways around that setting by using other signals, like Bluetooth beacons and WiFi networks, to infer location.
- Another protection is that apps are blocked from scanning passively and continuously for Bluetooth signals when they’re running in the background on your phone.
- Both of these protections exist to prevent apps from using Bluetooth signals, for example from beacons, to reverse-engineer your location and get around privacy settings.
- Another limitation is that apps (prior to now) could not passively and continuously send out their own Bluetooth signals to be picked up by others. If they could, it would be a major risk for all kinds of surveillance.
- Finally, there is the limitation of interoperability – Android and Apple phones could not easily interpret each other’s Bluetooth signals.
Another current interoperability problem that the Google-Apple API solves is that existing exposure notification apps are often not interoperable with each other. If a person downloads an app from one public health authority but then comes into contact with a user of an app from another jurisdiction, the apps often will not recognize one another. However, all apps using the Apple-Google API will recognize one another. This kind of scalability is essential to enable effective notifications, thereby beginning to enable society to cautiously reopen.
These are the limitations that public health authorities are facing in developing apps. The apps that have launched to date have usually relied on asking users to opt in to sharing their location, but precise location data can reveal intimate information – where you’re going, where you’ve been, your character, interests, habits, religion, political inclinations.
So health departments began looking to Google and Apple for better access than the limited Bluetooth APIs currently available allow. Remarkably, for two competitors who rarely cooperate, Apple and Google partnered on a new API that allows background sending and receiving of rotating Bluetooth identifiers. This gives apps access to information they couldn’t get before, but with limits on how it can be accessed and used. Only health departments will be approved to use the new API, to limit the sending of fake signals. Health departments are not sent information about individual users, as the app and device handle the communications locally.
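To make the idea of rotating identifiers and local matching concrete, here is a deliberately simplified sketch. It is an illustration of the general approach, not the actual Apple-Google protocol (which derives rolling identifiers cryptographically from daily keys); the function names and the use of plain random tokens are assumptions for readability.

```python
import secrets

def new_rolling_id() -> bytes:
    """Each phone periodically generates a fresh random identifier to
    broadcast over Bluetooth, so the phone cannot be tracked over time.
    (Simplified: the real protocol derives these from a daily key.)"""
    return secrets.token_bytes(16)

def find_exposures(heard_ids: set, diagnosed_ids: set) -> set:
    """Matching happens locally on the device: the phone compares the
    identifiers it overheard against identifiers published by users who
    reported a diagnosis. No location or identity leaves the phone."""
    return heard_ids & diagnosed_ids

# Usage: my phone overheard three rotating identifiers this week...
a, b, c = new_rolling_id(), new_rolling_id(), new_rolling_id()
heard = {a, b, c}
# ...and the health authority publishes identifiers of diagnosed users.
diagnosed = {b, secrets.token_bytes(16)}
matches = find_exposures(heard, diagnosed)
print(len(matches))  # 1 match -> a possible exposure, decided on-device
```

The design point this sketch captures is decentralization: the server only distributes identifiers of diagnosed users, and the comparison runs on each phone.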
Apple and Google did not create an app. It’s an API – a technical method for apps to get information off of the device. Public health authorities will create the apps that use this information, and will be responsible for how it is communicated, how users receive alerts and what those alerts say. Public health authorities will have options to determine who should be alerted, based on the Bluetooth signal strength and the duration of proximity needed to trigger an alert.
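The two knobs mentioned above can be sketched in a few lines. This is a hypothetical illustration of how a health department app might combine them; the threshold values and function name are assumptions, not real public health guidance or the actual API.

```python
def should_alert(rssi_dbm: float, minutes_near: float,
                 min_rssi_dbm: float = -65.0,
                 min_minutes: float = 15.0) -> bool:
    """Decide whether a contact event warrants an exposure alert.
    A stronger (less negative) Bluetooth signal suggests closer
    proximity; a longer exposure suggests higher risk. Both
    thresholds here are illustrative assumptions a health
    authority would tune for its own population."""
    return rssi_dbm >= min_rssi_dbm and minutes_near >= min_minutes

print(should_alert(rssi_dbm=-60, minutes_near=20))  # True: close and prolonged
print(should_alert(rssi_dbm=-80, minutes_near=20))  # False: signal too weak
print(should_alert(rssi_dbm=-60, minutes_near=5))   # False: contact too brief
```

Leaving these thresholds to the public health authority, rather than hard-coding them, is what lets each jurisdiction match the alerts to its own epidemiological judgment.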
But, if you are like me, and you want to protect those around you by being able to get and share these alerts, with minimal risk to privacy, health department apps that use the new API should be able to provide an additional tool in the effort to re-open society as we fight the pandemic.
For more privacy and data protection resources related to COVID-19, click here.