The first congressional consideration of Total Information Awareness, President Bush’s superpowered spy initiative, was remarkably brief and uncomplicated. Indeed, it wasn’t even a debate.
On Jan. 23, the Senate unanimously passed a measure requiring the government to produce a detailed report on the TIA system’s goals, costs and consequences. The senator sponsoring the amendment, Ron Wyden, D-Ore., said he was worried that TIA — a Defense Department research project that aims to identify terrorists by analyzing personal data collected in computer databases — was being developed without congressional oversight; his plan, he said, would cause the program to be “respectful of constitutional protections and safeguards, while still ensuring that our nation can continue to fight terrorism.”
But while Congress asks for reports, TIA is already steaming forward. According to people with knowledge of the program, TIA has now advanced to the point where it’s much more than a mere “research project.” There is a working prototype of the system, and federal agencies outside the Defense Department have expressed interest in it.
Most alarmingly, an examination of the research that has been conducted so far into TIA reveals that even while the project has been charging ahead, only token attention has been paid to perhaps its most critical aspect — privacy and civil liberties protection. The Defense Advanced Research Projects Agency (DARPA), which is taking the lead on TIA, has stymied the efforts of outside groups to find out more about TIA’s protections.
Critics of TIA applauded the Senate’s move as putting the kibosh on what they see as the most frightening example of government Big Brother-ism yet invented. But even if Wyden’s amendment does become law, that doesn’t mean that TIA opponents can relax. The federal government is unlikely to stop doing research into how to glean information from credit card databases, driver’s license registrations and every other point at which human lives meet the computer.
Sept. 11, 2001, led many Americans to expect some loss of individual liberties in the interest of improved security. But how far is too far? TIA is the most ambitious government attempt ever to gather information on ordinary citizens, and, if realized, it will be a tool that could easily be abused by those who have (or can steal) the keys to use it.
“It’s hard to come up with an appropriate analogy for this,” says Marc Rotenberg, executive director of the Electronic Privacy Information Center, a Washington-based advocacy group. “We’re talking about the most massive surveillance program ever tried by the federal government, and you ask them what they’re doing for privacy and you get one 26-page paper. I mean, let’s get serious.”
TIA’s lack of attention to privacy has politicians across the political spectrum riled up. The fact that TIA is headed by John Poindexter, the former Reagan administration official who was closely involved in the Iran-Contra affair, also hasn’t helped the program’s public image.
What is the Defense Department doing to make TIA safe? Not very much. It is on the verge of granting $1 million for the development of what DARPA calls a “privacy appliance” to be used in the system. But participants in a recent study DARPA commissioned to look into the security of databases containing private information — a study that Poindexter has often said proves his office is serious about privacy — don’t believe the agency is looking closely enough at protections. Some of them say that TIA will never be safe. “I was stunned,” said one participant, describing her reaction to a TIA briefing from Poindexter. “I’m too stunned to say anything.”
During the past year, many people working on TIA, including Poindexter, have given a number of briefings to prominent engineers and other VIPs. (Poindexter has given only a few interviews to the press and did not respond to an e-mail interview request from Salon.) DARPA officials also presented their scheme at DARPA Tech, the agency’s technology conference, held in Anaheim, Calif., in August. The briefings were not classified; people who attended them — including experts from privacy groups — are free to talk about what was said, and slides from the DARPA Tech presentations are available online.
The 18th slide out of 20 in Poindexter’s standard presentation is titled “Privacy.” A subtitle says, “Taking several actions to address.” And the first bulleted item on his list is “ISAT summer study.”
ISAT is DARPA’s Information Systems Advanced Technology panel. Last year, ISAT commissioned a study of technological methods to protect private data in information systems. The study was not meant to focus on TIA; the participants, several of whom were willing to speak on the condition they were not quoted, took a more general look at methods to protect individual data contained in commercial and government databases. But TIA was discussed at the group’s meetings, and not often in the most flattering of terms.
The ISAT group held five meetings during the spring and summer of 2002; those were attended by computer scientists, privacy experts, government officials and computer industry executives — sometimes as many as 30 to 40 people discussing privacy. The panel talked about technological issues, not matters of policy; in its final report, the group outlined a few areas of research for DARPA to look into further.
One way to make sure that private information in a database is kept safe, the panel suggested, is “selective revelation” — or the idea that when an analyst searches the databases, the computer should respond in a need-to-know fashion, giving out information only if the analyst has obtained certain legal clearances. “For example,” the ISAT report says, “an analyst might issue a query asking whether there is any individual who has recently bought unusual quantities of certain chemicals, and has rented a large truck. The algorithm could respond by saying yes or no, rather than revealing the identity of an individual. The analyst might then take that information to a judge or other appropriate body, seeking permission to learn the individual’s name, or other information about the individual. By revealing information iteratively, we prevent the disclosure of private information except when a sufficient showing has been made to justify that revelation.”
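The iterative, need-to-know disclosure the report describes can be sketched in a few lines. This is an illustrative toy, not anything from TIA or the ISAT report; the record fields, clearance levels, and threshold are all invented for the example.

```python
# Toy sketch of "selective revelation": at the lowest authorization
# level, a query returns only whether a match exists; identities are
# released only after a higher clearance (e.g., a court order) is shown.
# All names, fields, and thresholds below are invented for illustration.

RECORDS = [
    {"name": "Alice", "chemicals_kg": 120, "rented_truck": True},
    {"name": "Bob", "chemicals_kg": 2, "rented_truck": False},
]

def selective_query(records, min_chemicals, clearance_level):
    """Answer yes/no at clearance 0; reveal names only at clearance 1+."""
    matches = [r for r in records
               if r["chemicals_kg"] >= min_chemicals and r["rented_truck"]]
    if clearance_level < 1:
        return {"match": bool(matches)}  # reveal existence only
    return {"match": bool(matches),
            "names": [r["name"] for r in matches]}

print(selective_query(RECORDS, 100, 0))  # {'match': True}
print(selective_query(RECORDS, 100, 1))  # {'match': True, 'names': ['Alice']}
```

The point of the design is that the same query yields progressively more information only as legal authorization escalates, rather than exposing identities on the first pass.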
The ISAT group also proposed that DARPA consider building databases that keep track of what has been asked of them, allowing for later inspection of possible abuses. If such a system has been used nefariously — if an analyst has leveraged other people’s personal information for his own gain, for example — investigators could always find an “audit trail” of the bad deeds. “We need to ‘watch the watchers,’” the group says in its report — though it concedes that building such audit mechanisms is technologically very complex, and they raise problems of their own: What if a foreign government got hold of a list of queries U.S. authorities were asking?
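A minimal version of the “watch the watchers” idea is simply a tamper-evident log wrapped around every query. The sketch below is an assumption-laden illustration, not the group’s design; the analyst IDs and query strings are invented.

```python
# Hypothetical audit-trail wrapper: every query against the database is
# recorded with the analyst's identity, the query text, and a timestamp,
# so investigators can later reconstruct who asked what and when.
import datetime

AUDIT_LOG = []

def audited_query(analyst_id, query_text, run_query):
    """Record who ran a query and when, then execute it."""
    AUDIT_LOG.append({
        "analyst": analyst_id,
        "query": query_text,
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    return run_query(query_text)

result = audited_query("analyst-7", "chemicals>100", lambda q: "no matches")
```

As the report itself concedes, the log is a double-edged sword: it enables after-the-fact accountability, but the query history becomes sensitive data in its own right.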
The group’s report was intended to outline broad areas of study; the technology required to enable those areas requires money and time to produce, the group said. The report was also meant to be confidential. But last November, the Electronic Privacy Information Center filed a request under the Freedom of Information Act — or FOIA — asking the Defense Department for all information on the privacy implications of TIA; in December, the agency released only the ISAT study in its response to that request.
EPIC was not satisfied by this response. It had asked for all documents relating to Total Information Awareness, and the ISAT report explicitly states that it is “not a review of Total Information Awareness.” Rotenberg, who directs the privacy group, says that while he believes some of the ideas in the ISAT report are “interesting,” the report “doesn’t begin to answer the larger issues. I think it’s important to know that we’re operating in an area that needs legal, technological and policy dimensions.” On Jan. 16, EPIC won a lawsuit against the Pentagon regarding its FOIA requests; it now expects more information from DARPA.
Nor were participants in the ISAT study happy that DARPA released only their report in its response to EPIC. “I thought it was appropriate for them to turn this over,” said one member of the study, “because if you look at the language of the request it’s fairly broad. But it seems unlikely that this is the only document they had which relates to the topic of the request.”
Barbara Simons, a computer scientist who is the co-chair of the U.S. Public Policy Committee of the Association for Computing Machinery, a worldwide computer science group, and who attended all the ISAT study meetings, is more critical. “It’s really astounding that our study was the only thing they gave in response to the EPIC FOIA,” she says. “If that’s all they have on the issue of privacy, then all those people who are saying, ‘Don’t worry about TIA,’ they’re wrong. We should be worrying.”
Simons first heard of TIA last May, when the ISAT group was briefed by Poindexter. “And I just couldn’t believe what he was saying,” she says. “Because he was talking about catching terrorists before they strike — and I just didn’t think that was possible.”
In a follow-up e-mail, she added: “I’m just not convinced that the TIA will give us tools for catching terrorists that we don’t already have or that could be developed with far less expensive and less intrusive systems.”
“Even if one were able to construct a system which did protect privacy in some sense, we certainly have not been very successful with building humongous databases that are secure,” says Simons. “And think about all the people who have access to this thing. How many people are going to be authorized? What was the name of the guy in the CIA?” She was referring to Aldrich Ames, the CIA official who was convicted of spying for the Soviet Union. “What about someone like that? This takes huge amounts of data — tremendous amounts of data about people are being stored. You run the risk of rogue agents, of bribery, of blackmail.” She said she did not believe that the privacy technologies included in the ISAT report would do much to solve such problems.
Does DARPA’s response to EPIC mean that the only thing the agency has done on privacy in Total Information Awareness is hold five meetings and produce one 26-page report?
All public information seems to point in that dismal direction. “If you look at some of the presentations that the TIA program managers have made, you’ve seen they’ve all referenced privacy safeguards,” says Lee Tien, an attorney at the Electronic Frontier Foundation. “But my recollection is that there isn’t anything they talked about that wasn’t bulleted out in the ISAT study. So if I’m a TIA project manager, I can say what I said after I read the ISAT study and look like I’m doing something good on privacy. And at this point the jury’s out about whether they’ve actually done any more than that.”
TIA has not unveiled any actual technology it’s created with the specific aim of protecting private information collected by TIA. But in February, the agency is expected to approve its first grant for such technology; the contract has been promised to Teresa Lunt, a computer scientist at the Palo Alto Research Center, who is a veteran in the world of computer security.
Lunt will work on the privacy system for Genisys, the part of TIA that will actually access personal data. Genisys gets its information from third-party databases, such as the computer at your credit card company or driver’s license agency. “We envisioned that these companies would want to have some serious control over how the government accessed that data,” Lunt says, “so we proposed this idea of a privacy control — a ‘privacy appliance,’ is what the government calls it. The appliance is a physically separate box that could be inspected by third parties for accuracy and strength.”
The privacy appliance would perform the “selective revelation” functions suggested in the ISAT study. The technology would not reveal any identifying data to analysts doing a search at the lowest level of authorization. It would block out names, credit card numbers, Social Security numbers — “any unique or close-to-unique data field would be withheld. We start out with the assumption that it is withheld,” Lunt says.
But the systems would go beyond that by blocking out any of the remaining fields that could be used together to “infer” a personal identity. For example, a clever investigator who has determined enough information about you — how much money you make, how many cars you have registered to you, whether you like to pay off your credit cards each month or prefer to keep a balance — could search through all this information for those characteristics, and possibly ferret out more information. Lunt’s “inference analysis” tools are designed to prevent that sort of misbehavior.
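The two steps Lunt describes, withholding directly identifying fields by default and then suppressing combinations of remaining fields that single a person out, can be sketched roughly as follows. The field names and the rarity threshold are assumptions made for this example; Lunt’s actual inference-analysis tools are far more sophisticated.

```python
# Illustrative sketch of the two filtering steps described above:
# (1) withhold unique or near-unique identifying fields by default, and
# (2) suppress rows whose remaining "quasi-identifier" combination is
#     rare enough to pick out one individual.
from collections import Counter

IDENTIFYING = {"name", "ssn", "credit_card"}  # assumed field names

def redact(record):
    """Step 1: drop directly identifying fields."""
    return {k: v for k, v in record.items() if k not in IDENTIFYING}

def suppress_rare(records, quasi_keys, min_group=2):
    """Step 2: hide rows whose quasi-identifier combo matches < min_group people."""
    counts = Counter(tuple(r[k] for k in quasi_keys) for r in records)
    return [r for r in records
            if counts[tuple(r[k] for k in quasi_keys)] >= min_group]

people = [
    {"name": "A", "ssn": "1", "income": 50, "cars": 2},
    {"name": "B", "ssn": "2", "income": 50, "cars": 2},
    {"name": "C", "ssn": "3", "income": 90, "cars": 5},  # unique combination
]
safe = suppress_rare([redact(p) for p in people], ["income", "cars"])
print(safe)  # only the two indistinguishable records survive
```

The record with the unique income-and-cars combination is withheld because, even without a name attached, that combination alone could identify its owner.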
Many computer scientists said that although they thought Lunt’s ideas would be difficult to pull off, they’re fundamentally sound. Still, some people in the computer security community, who declined to comment on the record, were dismayed that Lunt appears to be the only recipient of a privacy technology award from DARPA — and some wondered whether part of the reason she received that award was the fact that she once worked for the agency.
Lunt brushed off the suggestion that DARPA picked her for any reason other than her technology. “I don’t see any connection between the two,” said Lunt, “except the security research field is a very small field. There’s maybe a community of a few hundred doing work in this kind of security. And how many are doing database security?”
She also said that DARPA was in fact taking a risk in going with her project, because nobody knew if such a system would actually work. “It may not allow the data to be useful,” Lunt said. “If you start withholding a lot of this stuff it might make it hard for the analysts to see the relationships between people.”
Experts involved in the ISAT study are dismayed at the size of Lunt’s grant — $1 million a year, for three years. This is small compared to the entire TIA program, estimated to cost tens of millions. (Any dollar figures more specific than that are in dispute.) Lunt agrees that the grant is small; she wishes DARPA would put more money toward privacy research. The million dollars, she said, is “enough for us to succeed on our project, and I think our project goes a long way toward getting them there. DARPA sometimes funds research areas that are leading to some strategic thing — but they could have a research program on privacy which doesn’t have a specific application in mind, one that is just designed to improve privacy tools.”
Experts involved in the ISAT study echoed these thoughts. Wouldn’t it be nice if, in addition to spending millions on security tools, the federal government made a large outlay toward privacy research? Something huge, dramatic, on the order of TIA itself?
But that’s not happening. The sad fact about TIA is that, Lunt’s research notwithstanding, the privacy schemes in it appear to be tangential, token ideas, added on to the system as a political afterthought.
Bob Barr, the former Republican congressman from Georgia, is among TIA’s harshest critics. Barr, who lost his seat last year after redistricting in Georgia, was known as one of the House’s most conservative members: He is famous for having introduced a resolution to impeach Bill Clinton in 1997, a few months before anyone had ever heard of Monica Lewinsky.
Barr calls TIA “one of the worst invasions of privacy I’ve ever seen,” and he says that the only thing he regrets about the Senate’s action is that it did not kill TIA entirely. On his Web site, Barr congratulates Russ Feingold, one of the Senate’s most liberal members, who has proposed legislation that would stop the TIA program. Nothing short of that would satisfy him, Barr says, because “I suspect that based on my experience with the executive branch they will keep pushing and pushing and pushing, hoping Congress will not pay attention. And I think we’ll see this program resurrected in other avenues.”
Barr’s stance proves that privacy is not a left-right issue; everyone should fight for strong privacy protections, he says, and he believes many members of Congress have been hearing that from their constituents. “And I think the hue and cry against these invasions of privacy will be heard much more in this Congress.”
Last year, Barr sponsored a bill that would have required the federal government to produce a “privacy impact statement” each time it launches programs that could threaten civil liberties. The bill enjoyed broad support; both the ACLU and the NRA thought it was a good idea. In October, the bill passed the House, but it didn’t get very far in the Senate. “Last year,” Barr says, “it was so difficult to get members to focus on doing anything that would appear to be not giving the government anything it wanted to do as far as terrorism was concerned. The most we thought we could do was get it through the House.”
Barr’s idea is good policy; currently, federal agencies are not required to consider how their ideas affect civil liberties. After they tell a fearful public that a certain program is necessary to prevent our annihilation, officials seem to have a free pass to build programs like TIA, promising nothing on privacy other than “Taking actions to address.” Thankfully, Wyden’s proposal will prompt a cost-benefit calculation for TIA, but other security programs could also do with such an analysis — the Transportation Security Administration’s Computer Assisted Passenger Prescreening System II, which profiles airline passengers, or the USA PATRIOT Act, passed without much deliberation a few weeks after 9/11.
Recently, Ralph Nader and tech activist Jamie Love have taken up the cause to promote privacy impact statements. On Jan. 7, the two met for the second time with Mitch Daniels, the White House budget director, in an effort to persuade the administration to embrace the idea of putting a “value” on “privacy and dignity.”
“Benefits to national security are measured and assigned monetary values [in the budget process],” Love says. “So we’re asking to assign values on privacy and dignity — that’s something people care about, and there’s benefit in that.” Love added that he didn’t think the Bush administration was averse to the idea of preparing such impact studies; the White House has always said that it believes in a “strong regulatory review,” relying heavily on cost-benefit analyses to determine the value of government projects.
And a cost-benefit analysis of privacy, Love says, one in which some “nonzero” value is assigned to the loss of privacy that comes from such programs as TIA, would lead to better government, and a more efficient fight on terrorism — government money would only go to projects that really worked.
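Love’s arithmetic point can be made concrete with a toy comparison. All the figures below are invented; the only claim is the structural one, that pricing privacy at zero changes which program “wins” a cost-benefit comparison.

```python
# Toy illustration of the cost-benefit argument: when a program's
# privacy cost is priced at zero, any security benefit can make an
# intrusive program look superior; a nonzero privacy cost can reverse
# the ranking. All dollar figures are invented for illustration.

def net_benefit(security_benefit, dollar_cost, privacy_cost):
    return security_benefit - dollar_cost - privacy_cost

# Privacy priced at zero: the intrusive program (A) looks better.
assert net_benefit(100, 60, 0) > net_benefit(50, 20, 0)

# Privacy given a nonzero value: the less intrusive program (B) wins.
assert net_benefit(100, 60, 70) < net_benefit(50, 20, 5)
```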
A spokesman for the Office of Management and Budget said that no specific decisions had come of Daniels’ meeting with Nader and Love, but that the White House was interested in “smart regulation,” which it believes comes from a rigorous analysis of costs and benefits of government programs.
Ever since TIA was unmasked, the media’s main source for information about it has been the program’s Web site, which contained a lode of documents on the system’s scope and design, as well as information about the people working on it. Much of what the public knows about TIA comes from the site, which was adorned with a notoriously creepy logo — the eye atop the Masonic pyramid, familiar from the dollar bill, looking out over the world — and a Latin slogan, “Scientia est Potentia”: Knowledge is power. During the last couple of months, however, perhaps in response to public opinion, DARPA has quietly removed some Web documents. The logo and the slogan are gone. So are the short biographies of the personnel involved.
One hopes that with the Senate action, Congress will compel TIA’s managers to be more responsive than that. “I hope the Congress engages in some serious oversight,” Barr says, “to make sure the executive branch doesn’t resurrect this thing. I think that chances are overwhelming that they will — they always do.”
Farhad Manjoo is a Salon staff writer and the author of True Enough: Learning to Live in a Post-Fact Society.