Fairness, Ethics, & Privacy in Tech: A Discussion with Chanda Marlowe
After beginning her career as a high school English teacher, Chanda Marlowe changed paths to become FPF’s inaugural Christopher Wolf Diversity Law Fellow. She’s an expert on location and advertising technology, algorithmic fairness, and how vulnerable populations can be uniquely affected by privacy issues.
What led you to the Future of Privacy Forum?
I was a high school English teacher, and I decided I wanted to be an advocate for student rights. I went back to school and earned a law degree and a Master’s in Communications from the University of North Carolina at Chapel Hill. That’s where I found I was really drawn to student privacy. I first came to FPF because I wanted to intern someplace that was leading on student privacy; I had cited FPF’s work in my papers and presentations. Also around that time, President Obama endorsed FPF’s Student Privacy Pledge, which was a big moment. I left FPF after my internship, and then about two years ago I came back as the Christopher Wolf Diversity Law Fellow.
What attracts you to privacy issues?
There is a lot of room for thoughtful discussion among very smart people in the privacy space because there is so much grey area. It’s special to be part of these conversations while technology is emerging so rapidly.
What areas of privacy have attracted you?
I came to FPF intending to work primarily on education issues, and I ended up with opportunities to work on much more, including location and ad practices, the Internet of Things, and algorithmic fairness. So now I follow those issues closely, and I really get into the granular details, like what it means to be compliant when the legal landscape is constantly changing with the GDPR, state laws, and federal legislation.
I’m also drawn to how privacy impacts vulnerable populations. I did my Master’s thesis on surveillance of students, and I’ve been fortunate to work on the issue of algorithmic fairness.
Increasingly, everything is being automated, and decisions that were once made by humans are now being made by machines. This creates challenges, because using machines doesn’t get rid of bias: there could be bias in the data set, problems with the machine itself, or bias on the part of the programmer. Those are three ways bias can be injected into a machine learning system.
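To make the first of those three sources concrete, here is a minimal, hypothetical Python sketch (the lending scenario, group labels, and numbers are invented for illustration and are not from the interview) showing how a model trained on historically skewed decisions simply reproduces the skew:

```python
from collections import Counter

# Hypothetical historical loan decisions: group "A" was approved far more
# often than group "B" for otherwise similar applicants (invented numbers).
history = (
    [("A", "approve")] * 80 + [("A", "deny")] * 20
    + [("B", "approve")] * 40 + [("B", "deny")] * 60
)

def learn_rates(data):
    """A naive 'model' that just learns each group's historical approval rate."""
    totals, approvals = Counter(), Counter()
    for group, outcome in data:
        totals[group] += 1
        if outcome == "approve":
            approvals[group] += 1
    return {g: approvals[g] / totals[g] for g in totals}

print(learn_rates(history))  # {'A': 0.8, 'B': 0.4}
```

Nothing in this code is broken in an engineering sense; the disparity lives entirely in the training data, which is why auditing data sets matters as much as auditing code.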
I’ve been working very closely with Lauren Smith on a paper that maps out the potential harms that can arise from automated decision-making, and considers whether existing laws are adequate to address them.
I also enjoyed working on an FPF project that considers how the Internet of Things can have unique privacy impacts on people with disabilities. It was amazing to be part of a project where we convened academics, consumer advocates (including disability organizations), and companies that create products for people with disabilities, ranging from startups to large platforms, for a conversation about privacy concerns. We were able to work very closely with the American Association of People with Disabilities to draft a paper that not only explores the nuances of privacy considerations for people with disabilities using IoT devices and services, but also provides recommendations to address them, including looking at what mechanisms for informing people with disabilities are making their way into products. For example, the Amazon Echo added auditory cues so that people who are blind aren’t expected to react to a light. We want to encourage more thoughtful approaches like that. It was an amazing experience to collaborate with so many people in the disability community.
What do you see as up-and-coming privacy issues?
I’m happy to see that there’s more talk about what privacy means for vulnerable populations. There has been increased attention to bringing in groups that have not previously been invited to the table to talk. Privacy isn’t just in one small bucket anymore, separate from broader conversations.
It’s really important that FPF is taking deliberate efforts to make sure the next generation of privacy professionals is diverse. The creation of the Christopher Wolf Diversity Law Fellowship embodies a commitment to valuing diversity in all fields that involve privacy.
Your fellowship is scheduled to end in a few months. What’s next for you?
This has already been a wonderful experience, and I have more to accomplish before it ends. Recently, I had the amazing experience of speaking before the Congressional Black Caucus Institute about privacy legislation. That’s something I never would have had the opportunity to do if I had stayed in North Carolina.
I’m interested in staying in the privacy law and policy space. I have learned so much and have been given so many opportunities that I cannot wait to launch the next steps of my privacy career.
FPF will host our next Privacy Book Club on April 24 at 2:00 PM ET. Join us to discuss Habeas Data: Privacy vs. the Rise of Surveillance Tech by Cyrus Farivar. Sign up for the book club here.
We hope you will join us at our 10th Anniversary Celebration on April 30. Buy your ticket here.