Not to be outdone, Vice President Al Gore issued a statement hailing P3P as a powerful tool “for giving consumers greater control over their personal information.” And to prove its point, the White House set up a P3P policy for the White House home page.
But P3P isn’t technology, it’s politics. The Clinton administration and companies such as Microsoft are all set to use P3P as the latest excuse to promote their campaign of “industry self-regulation” and delay meaningful legislation on Internet privacy. At least that’s the claim being made by two of the most trusted names in Internet privacy, the Electronic Privacy Information Center and Junkbusters, both of which have called P3P a Trojan horse.
The privacy organizations are right: P3P lacks the power to create a new privacy-rich Internet. That’s because P3P is nothing more than an optional labeling standard. And as long as the protocol remains optional, organizations will have few incentives to create P3P labels that can help consumers.
Still, the privacy organizations needn’t turn their backs on P3P. P3P could actually be a wonderful tool to help promote meaningful privacy protections on the Internet — both technical protections and legislative ones.
P3P has its roots in the World Wide Web Consortium’s (W3C) much-maligned Platform for Internet Content Selection (PICS), a protocol that was created to enable censorship. Conceived in 1996, during Congress’ first attempt to regulate pornography on the Internet, PICS was designed to prevent the children of protective parents from being able to view pornography on the Web. The grand idea behind PICS was that every Web site on the Internet would have a pornography rating, and that Web browsers would consult those ratings before displaying the Web pages. If a Web site had pornography and the computer was configured not to show pornography, no inappropriate images would appear on young Jimmy’s screen.
Privacy activists looked at the PICS technology and said to themselves, “You know, if technology can do this for pornography, perhaps we could build a similar system to protect privacy.” The technology would be similar. If you clicked into a Web site whose policy for dealing with personal information didn’t match your preferences, your browser would throw up a warning and prevent you from going further.
P3P policies are written in XML, which looks a lot like HTML, the language of the Web, but allows developers to create special tags that have particular meanings. For example, the tags <RETENTION><indefinitely/></RETENTION> mean that a Web site holds on to its personally identifiable information forever. The tags <RECIPIENT><ours/></RECIPIENT> mean that a Web site collects data but doesn’t share it with others. Your browser would check these tags and give you a warning if the Web site’s policies disagreed with your preferences.
Basically, P3P is a rather straightforward system for translating human-readable privacy policies, which are filled with legalese, into a compact block of data that can be understood by your computer.
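To make that concrete, here is a sketch of what such a machine-readable policy looks like under the P3P 1.0 vocabulary; the entity name, contact address and namespace details are illustrative, and real deployments may differ:

```xml
<!-- Sketch of a P3P 1.0 policy. "Example Corp." and the discuri
     are placeholders, not any real site's policy. -->
<POLICY xmlns="http://www.w3.org/2002/01/P3Pv1"
        discuri="http://www.example.com/privacy.html">
  <ENTITY>
    <DATA-GROUP>
      <DATA ref="#business.name">Example Corp.</DATA>
    </DATA-GROUP>
  </ENTITY>
  <STATEMENT>
    <PURPOSE><admin/><develop/></PURPOSE>      <!-- why data is collected -->
    <RECIPIENT><ours/></RECIPIENT>             <!-- shared with no one else -->
    <RETENTION><indefinitely/></RETENTION>     <!-- kept forever -->
    <DATA-GROUP>
      <DATA ref="#dynamic.clickstream"/>       <!-- pages visited -->
      <DATA ref="#dynamic.http.useragent"/>    <!-- browser identification -->
    </DATA-GROUP>
  </STATEMENT>
</POLICY>
```

A browser reading this block can learn, without parsing a word of legalese, that the site keeps clickstream and browser data forever but shares it with no one.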
The P3P standard also allows you to specify different policies for different parts of your Web site. For example, you might have some Web pages that collect a lot of personal information, while others might not collect any. This feature, it turns out, is central both to P3P’s promise and to its possible failure.
This flexibility is critical: large Web sites need to be able to say that different rules apply to different areas. But it also makes it possible to restrict P3P policies to tiny segments of a site and leave everything else unlabeled.
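P3P handles this scoping through a separate policy reference file that maps URL patterns on a site to particular policies. A sketch, with illustrative paths and policy names, might look like this:

```xml
<!-- P3P policy reference file: maps parts of a site to policies.
     The file paths and fragment names here are hypothetical. -->
<META xmlns="http://www.w3.org/2002/01/P3Pv1">
  <POLICY-REFERENCES>
    <!-- One policy covers only the home page ... -->
    <POLICY-REF about="/w3c/policy.xml#homepage">
      <INCLUDE>/index.html</INCLUDE>
    </POLICY-REF>
    <!-- ... while another covers the rest of the site, minus an exclusion. -->
    <POLICY-REF about="/w3c/policy.xml#general">
      <INCLUDE>/*</INCLUDE>
      <EXCLUDE>/search/*</EXCLUDE>
    </POLICY-REF>
  </POLICY-REFERENCES>
</META>
```

Nothing in the standard forces the INCLUDE patterns to cover the whole site — which is exactly the loophole that lets an organization label a single page and still claim P3P compliance.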
The Clinton administration, being a strong proponent of self-regulation, hopes that the P3P technology will make it easier for consumers to rate the privacy policies of Web sites. If consumers demand more privacy, the thinking goes, they will flock to the Web sites that protect consumer privacy and thereby force the market to do the right thing. And if the market does the right thing all by itself, there will be no need to pass and enforce privacy-protecting laws.
In reality, P3P technology won’t do much of anything to guarantee your privacy, and you need go no further than the White House’s own Web site to understand why. Any company or organization can create P3P policies and then issue self-congratulatory press releases gushing about their P3P-compliance. They look like good guys. But they might not be good guys at all.
Consider the White House Web site’s much-touted P3P policy. I took a close look at the original White House policy when it was first unveiled in June. Translated from XML back into English, it said that the organization named “the White House” had the contact e-mail address email@example.com, a phone number of 202-456-1414 and a place of business at 1600 Pennsylvania Ave. N.W., Washington, D.C., 20500. It said that the site recorded clickstream information, the name of the Web browser that every visitor uses and the previous Web site that each visitor came from, and that such information would be retained indefinitely and used for administrative purposes, development and operations. But the policy covered only the Web site’s home page and simply ignored the other 4,000 pages on the Web server.
I called the White House and asked for the reason for the apparent disparity. Spokespeople told me that the P3P policy for the home page was merely a demonstration, and that they were doing a redesign in a few weeks and would do a better job then.
The redesigned Web site, ironically, doesn’t have a P3P policy at all.
By creating so much fanfare about its P3P policy and then lazily delivering so little, the White House is simply playing into the hands of P3P critics. Its behavior alone makes it clear that P3P doesn’t give consumers “control over their personal information,” as Gore says it does. Only federal data protection legislation — laws that would force the White House to get all of its pages properly coded to tell visitors what data they’re handing out and what it will be used for — could have that result. In the meantime, the ability of P3P policies to carve out huge exceptions makes P3P a pretty ineffective tool for “self-regulation” as well.
The beauty of P3P is that it reduces the legalese of privacy policies to unambiguous XML statements. Some future Web browser could translate these statements into any human-readable language, allowing a person who speaks only Japanese to make sense of privacy statements posted in English, for example. Search engines could automatically display a site’s policy along with its links. Economists researching the field of privacy could rapidly survey the policies of 10,000 Web sites and show how they change over time. That’s the promise of P3P.
Ultimately, though, Americans shouldn’t be put in the position of having to decide whether or not they want to give up their privacy in order to partake in the pleasure of viewing pages on the Internet: We should have base-level privacy protections in law. We do this in other areas, such as food, drugs and the environment. Likewise, there should be certain privacy guarantees that are fundamental to our society, such as the right to see information that a company has collected on you and the right to have erroneous information expunged.
P3P can’t create these rights and it can’t enforce them. But P3P will make it easier to cut through the legalese and tell the difference between Web sites that are truly committed to protecting privacy and those that are information sharks — provided, of course, that both kinds of Web sites post P3P policies that are comprehensive and accurate. And, of course, there’s a fat chance of that happening without meaningful legislation.