Privacy's Zeitgeist Moment

This second guest post from FPF Advisory Board member Professor Danielle Citron is cross-posted with Concurringopinions.com, one of the top legal blogs. Check out Concurringopinions.com for regular legal and policy commentary from top privacy scholars, including, among others, FPF Advisory Board members Daniel Solove, Frank Pasquale and Danielle Citron.

Privacy’s Zeitgeist Moment

By Danielle Citron

Privacy has seemingly taken center stage. Companies like Google, Microsoft, and eBay have joined forces to support a federal law that would impose uniform standards for the collection, use, and transfer of information across the private sector. Activists and officials hope to update the Privacy Act of 1974 for the twenty-first century. Senator Leahy has renewed his interest in data breach legislation, proposing the Personal Data Privacy and Security Act in July. The American Recovery and Reinvestment Act of 2009, the stimulus bill, includes a data breach notification requirement for health providers. And the Federal Trade Commission recently published its final rule on data breach notification for e-health records.

Strengthening the nation’s commitment to privacy is crucial. But, as Paul Schwartz’s engrossing Preemption and Privacy essay (Yale Law Journal) illuminates, a unitary federal information privacy statute should give us pause. Today’s information privacy law landscape mainly comprises federal sector-specific statutes and stronger state regulation. Schwartz makes a compelling case for remaining on that course, rather than adopting a uniform federal privacy statute. As Schwartz underscores, a uniform federal approach would likely preempt stronger state law rules, eliminating successful experimentation at the state level. California exemplifies this trend: its privacy innovations include, among others, allowing consumers to freeze their credit in the face of identity theft. New York and Connecticut are now considering bills that would set limits on companies that track consumers across websites to deliver targeted advertisements based on their online behavior. A uniform federal law would likely extinguish such state-driven innovations, whereas most federal sectoral privacy laws, such as the Gramm-Leach-Bliley Act, provide only a federal floor for information privacy and security, not a ceiling. Schwartz highlights the possibility that a comprehensive information privacy law may ossify, making the loss of state experimentation all the more grave. The piece also spearheads an important discussion about whether the centralizing forces at work today undermine the contributions of competitive federalism.

Schwartz’s piece is a must-read. Here is the abstract for Preemption and Privacy:

A broad coalition, including companies formerly opposed to the enactment of privacy statutes, has now formed behind the idea of a national information privacy law. Among the benefits that proponents attribute to such a law is that it would harmonize the U.S. regulatory approach with that of the European Union and possibly minimize international regulatory conflicts about privacy. This Essay argues, however, that it would be a mistake for the United States to enact a comprehensive or omnibus federal privacy law for the private sector that preempts sectoral privacy law. In a sectoral approach, a privacy statute regulates only a specific context of information use. An omnibus federal privacy law would be a dubious proposition because of its impact on experimentation in federal and state sectoral laws, and the consequences of ossification in the statute itself. In contrast to its skepticism about a federal omnibus statute, this Essay views federal sectoral laws as a promising regulatory instrument. The critical question is the optimal nature of a dual federal-state system for information privacy law, and this Essay analyzes three aspects of this topic. First, there are general circumstances under which federal sectoral consolidation of state law can bring benefits. Second, the choice between federal ceilings and floors is far from the only preemptive decision that regulators face. Finally, there are second-best solutions that become important should Congress choose to engage in broad sectoral preemption.

Ensuring that We Leave Children Behind

This guest post from FPF Advisory Board member Professor Danielle Citron is cross-posted with Concurringopinions.com, one of the top legal blogs. Check out Concurringopinions.com for regular legal and policy commentary from top privacy scholars, including, among others, FPF Advisory Board members Daniel Solove, Frank Pasquale and Danielle Citron.

Ensuring that We Leave Children Behind

October 29, 2009 by Danielle Citron

Talk about children, their education, and their security abounds. Politicians declare their devotion to children’s issues. Singers and actors assure us that “children are our future.” Books enlist villages to raise them. But when the rubber meets the road, we routinely fail children in so many ways, including privacy. Today, Joel Reidenberg’s Center on Law and Information Policy (CLIP) released a report attesting to our utter inability to protect the privacy of children’s educational records. Reviewing publicly available information from all 50 states, the CLIP study found that states collect information far in excess of what the law requires, including data about pregnancy, mental illness, family wealth, jail sentences, and Social Security numbers. Despite the sensitive nature of the information collected, state databases have weak privacy protections. The study found that the flow of information from local schools to state departments of education often failed to comply with the privacy requirements of the Family Educational Rights and Privacy Act.

This appalling state of affairs cannot stand. Such databases are ripe targets for identity thieves and hackers, who would happily plunder the Social Security numbers. They can also lead to discrimination based on inappropriately shared health information. The CLIP study offers a number of wise recommendations, including the minimization of data collection, adoption of clear retention policies, and maintenance of audit logs. It also suggests anonymizing data through the use of dual database architectures, though I wonder whether Paul Ohm’s important work on the myth of anonymity would call that approach into question. In any event, this study must be read and heeded.
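For readers unfamiliar with the term, a dual database architecture separates identifying information from substantive records, linking the two stores only through an opaque token, so that neither store alone reveals whom a record is about. Here is a minimal sketch in Python; the schema and names are hypothetical illustrations, not drawn from the CLIP report:

```python
import secrets

# Hypothetical sketch of a dual database architecture for student records:
# identities and substantive data live in separate stores, linked only by a
# random token, so neither store alone ties a name or SSN to a sensitive record.
identity_db = {}  # token -> identifying info (name, SSN, ...)
records_db = {}   # token -> sensitive educational data

def enroll(name, ssn, record):
    token = secrets.token_hex(16)  # opaque link between the two stores
    identity_db[token] = {"name": name, "ssn": ssn}
    records_db[token] = record
    return token

def deidentified_records():
    """What a state-level analyst would see: records stripped of identities."""
    return list(records_db.values())

enroll("Jane Doe", "000-00-0000", {"grade": 9, "counseling_referral": True})
print(deidentified_records())  # no names or Social Security numbers appear
```

As Ohm’s work suggests, this kind of separation reduces re-identification risk but does not eliminate it: quasi-identifiers left in the records store, such as birth dates or ZIP codes, can still single out individuals.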

Facebook announces new privacy principles for app developers

Here are Facebook’s revised privacy principles for app developers. They make some very good points, and many apps are going to have to do some work to comply: many apps don’t even have a privacy policy, something now required to be easily available.

Keeping on top of a million developers globally is going to be quite a task. Everyone in the ecosystem will need to help!

Principles

  1. Be trustworthy
    • Respect privacy
    • Don’t mislead or surprise users
    • Don’t spam – encourage authentic communications
  2. Create a great user experience
    • Build social and engaging applications
    • Give users choice and control
    • Help users share expressive and relevant content

Policies

  1. Presenting Your Policies
    1. You must provide a link to your privacy policy and any other applicable policies in the Info section of your application’s Profile page and on every page of your application.
  2. Features and Functionality
    1. You must not confuse, mislead, surprise, or defraud anyone.
    2. You must not violate any law or the rights of any individual or entity, and must not expose Facebook or Facebook users to harm or legal liability as determined by us in our sole discretion.
    3. You must not use a user’s session key to make an API call on behalf of another user.
    4. You must not include functionality that proxies, requests or collects Facebook usernames or passwords.
    5. You must not circumvent our intended limitations on core Facebook features. For example:
      1. You must not notify a user that someone has removed the user as a friend.
      2. You must not track visits to a user’s profile, or estimate the number of such visits, whether aggregated anonymously or identified individually.
    6. You must not significantly alter the purpose of your application such that users would view it as entirely unfamiliar or different.
    7. To change the name of your application, you must use one of the following formats for 30 days before completely switching to your new application name: “New name (formerly ‘old name’)” or “New name (renamed).” For example, “App 2 (formerly App 1)” or “App 2 (renamed).”
  3. Storing and Using Data You Receive From Us
    1. You must not store or cache any data you receive from us for more than 24 hours unless doing so is permitted by the offline exception, or that data is explicitly designated as Storable Data. [See the sketch after this list for one way an app might honor this rule.]
    2. You must not give data you receive from us to any third party, including ad networks.
    3. You must not use user data you receive from us or collect through running an ad, including information you derive from your targeting criteria, for any purpose off of Facebook, without user consent.
    4. Unless authorized by us, your ads must not display user data – such as users’ names or profile photos – whether that data was obtained from us or otherwise.
    5. You cannot convert user data you receive from us into Independent Data (e.g., by pre-filling user information with data obtained from the API and then asking the user to save the data).
    6. Before making use of user data that may be protected by intellectual property rights (e.g., photos, videos), you must obtain permission from those who provided that data to us.
    7. You must not give your secret key to another party, unless that party is an agent acting on your behalf as an operator of your application, but you must never give your secret key to an ad network. You are responsible for all activities that occur under your account identifiers.
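The 24-hour caching rule in Policy 3.1 is probably the one most likely to require engineering changes. As a rough illustration of one way an app might honor it (this is a hypothetical sketch, not Facebook platform code; the class and names are invented for illustration), consider a time-stamped cache that refuses to serve entries older than 24 hours:

```python
import time

# Hypothetical sketch of honoring the 24-hour caching rule (Policy 3.1):
# data received from the platform expires after 24 hours, after which the
# app must re-fetch it rather than serve a stale copy. Nothing here is part
# of any real Facebook API.
CACHE_TTL_SECONDS = 24 * 60 * 60  # the 24-hour limit

class UserDataCache:
    def __init__(self):
        self._entries = {}  # user_id -> (fetched_at, data)

    def put(self, user_id, data):
        self._entries[user_id] = (time.time(), data)

    def get(self, user_id):
        """Return cached data only if it is less than 24 hours old."""
        entry = self._entries.get(user_id)
        if entry is None:
            return None
        fetched_at, data = entry
        if time.time() - fetched_at > CACHE_TTL_SECONDS:
            del self._entries[user_id]  # stale: caller must re-fetch
            return None
        return data
```

An app would consult the cache first and fall back to a fresh API call whenever get returns None; data covered by the offline exception or explicitly designated as Storable Data would live outside this 24-hour store.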

FTC Commissioner Thomas Rosch Reflections on Behavioral Advertising

In his speech to the US Chamber of Commerce, FTC Commissioner Thomas Rosch provides an interesting analysis of the Commission’s legal authority in the area of online behavioral advertising. He calls non-personal behavioral tracking a “vexing” policy issue, but suggests that there are “some hard legal issues that need to be resolved even if one thinks, as a policy matter, that undisclosed online behavioral tracking should be prohibited.”

Connecticut NPR Interview with Christopher Wolf

Thursday, October 22, 2009

http://www.cpbn.org/program/colin-mcenroe-show

Understanding Will Breed Trust…

Consumers need to understand more about what is being done online. The understanding will breed trust, and the trust will breed a more viable advertising solution.

We agree with Jeff Hirsch, CEO of AudienceScience. But talk is cheap. Will industry really deliver on user trust? We hope so, but we also think that many aren’t yet engaged in the hard work needed to move the online ad system in the right direction. In addition to FPF’s ongoing efforts involving consumer testing of the best words and symbols to educate users about online personalization, this process requires serious effort to fix the broken opt-out process, to define which sensitive profiling categories should be restricted, and to set maximum retention times for clickstream profiles. And let’s not forget giving users access to their profiles.

We created a leading practices gallery to highlight leaders in offering consumers transparency and genuine choice. Unfortunately, too many are sitting back, waiting for the problem to go away.

"We just connect"

Steve Lohr’s “Bits” column in The NY Times today gave us a chuckle. A senior official at Adchemy, a privately held online marketing company, said this about his firm’s expertise in “statistical personalization”:

“We don’t hold any data. We just connect to 30 or 40 data sources,” Mr. Nukala said.

Businesses today should support their data use by being accountable, being transparent, and putting users in control. But this comment reminded us of the late Sen. Russell Long’s old saying about taxes: “Don’t tax you, don’t tax me, tax that fellow behind the tree.”

A privacy ombudsman for the smart grid?

http://www.treehugger.com/files/2009/10/smart-grid-privacy-issues-big-brother.php

Brits express privacy concerns with smart meters

http://web.archive.org/web/20091108150053/http://www.smartmeters.com:80/the-news/656-brits-express-privacy-concerns-with-smart-meters.html

Update from the UK: The Department for Energy and Climate Change (DECC) expressed privacy concerns within its impact assessment, stating that there “is theoretically scope…for using the smart metering communications infrastructure to enable a variety of other services, such as monitoring of vulnerable householders by health authorities or social services.”

“Information from smart meters could also make it possible for a supplier to determine when electricity or gas was being used in a property and, to a degree, the types of technology that were being used within the property,” the assessment reads. “This could be used to target energy efficiency advice and offers of measures, social programmes etc. to householders.”

Europe and Online Profiling…

The Council of Europe proposes rules for online profiling and wants your opinion.

Learn more here!