It’s true that snooping has been with us from the earliest gossip grapevines linking our grass huts. And we’ve endured modern forms of surveillance such as wiretaps and bugs. But those methods tended to be obvious—picture a couple of bored cops sitting over their coffee, listening on their earphones, transcribing tapes, filing folders. It was labor-intensive and tedious. But what we have today is different by several leaps and bounds. Old spying had to be selective, targeted. New spying is like a high-powered vacuum cleaner; one that scoops up every bit of information it comes across, even the extraneous or incomprehensible stuff.
In the pre-Snowden world, the government surveillance programs I wrote about had different names, such as the NSA’s Echelon and the FBI’s Carnivore, but the pattern was the same: a reckless disregard for Americans’ right to privacy in the name of our security. Less than a year later came the September 11 attacks, the perfect rationale to dramatically ramp up both domestic and international spying to a magnitude that portends nothing less than the end of this nation’s grand experiment in democracy.
Now, nearly a decade and a half later, we are clearly losing the continuing battle between individual freedom and collective conformity that has marked the history of human society. The right to be left alone—to think, to experiment, to contemplate—has been essential to the development of individual personality. The private space, protected from the intimidating observation of others, is the sacred ground where self-discovery occurs, and to the degree that external forces intrude, whether of the state, church, or marketplace, the sense of self will wither.
The information revolution that has been properly celebrated for irrevocably destroying the parochial restraints of language, society, and religion, while exposing much of the world’s population to a boundless world of universally shared information, is at the same time stripping both passive and active participants of their privacy in ways most don’t comprehend.
No dictator or intelligence agency, with its network of spies, could ever have hoped to obtain such access to the thoughts and aspirations of its subjects as today’s off-the-shelf information technology so easily provides. This is the great contradiction of our time: the unprecedented liberating power of the supercomputer combined with the worldwide Internet—credited with facilitating such dramatic expansions of the democratic experience as the Arab Spring—also contains the seeds of freedom’s destruction, because of the awesome power of this new technology to support a surveillance state that exceeds the wildest dream of the most ingenious dictator.
Do I exaggerate? Don’t take my word for it; let the toolmakers tell you. In a burst of public honesty, Google executives Eric Schmidt and Jared Cohen wrote in April 2013: “Despite the expense, everything a regime would need to build an incredibly intimidating digital police state—including software that facilitates data mining and real-time monitoring of citizens—is commercially available right now. . . . It’s the digital analog to arms sales.”
Of course, none of this is a complete surprise. Privacy is inevitably linked to technology, because anything that extends the reach of the eye, ear, or nose threatens intrusion into another’s personal space. The first primitive telescope allowed a snooper to check on your bedroom antics, for example, and it was the ability of the first camera to record your activities—and of the press to publish those images for the entire world to see—that initiated the first sweeping defense of privacy in the United States.
That defense came in a forceful affirmation of the right to privacy in a Harvard Law Review article by future United States Supreme Court justice Louis Brandeis in 1890. It vastly extended the reach of the Fourth Amendment, which protects one’s personal freedom in the home, in papers, and in movement, but it also assumed a static and immediate point of intrusion, as befitted the limited technology of the day.
The freedom to be left alone, simple though it may sound, embodies the most basic of human rights. If the individual cannot find sanctuary in his home and person, he is easy prey for avenging governments, rapacious mobs, and exploitive robbers. Some call it the right to privacy, which puts the emphasis on seclusion for personal activity, but the framers of the Constitution used the stronger concept of security, as in creating a strong barrier to the intrusions of outside power. That’s why the Constitution’s protection of individual space in the wording of the Fourth Amendment stresses physical security, but its intent, certainly as interpreted by the courts, is broader: to form a cocoon of safety around the individual, which over the years we have come to associate with the word privacy.
Brandeis was concerned about technology-driven threats to one’s privacy that by the late nineteenth century were becoming more invasive, particularly the use of the telephone. His article titled “The Right to Privacy,” co-authored with Samuel Warren, is widely considered to be the decisive factor in establishing a broadened constitutional protection. “Political, social, and economic changes entail the recognition of new rights, and the common law, in its eternal youth, grows to meet the demands of society,” wrote the lawyers. “Thus, in very early times, the law gave a remedy only for physical interference with life and property. . . . Gradually the scope of these legal rights broadened.”
Brandeis went on to join the Supreme Court, where his famous dissenting opinion in Olmstead v. United States (1928) provided a landmark warning on the privacy threats posed by technological advances: The government was identified as a potential privacy invader because “discovery and invention have made it possible for the government, by means far more effective than stretching upon the rack, to obtain disclosure in court of what is whispered in the closet,” he wrote.
Thanks to advances in technology that Brandeis could not have imagined, the reach of government and private corporations now extends far beyond the whisper in the closet. The basic economic model of the Internet—the sticky adherence of advertising to one’s comments and curiosities—leaves exposed, for example, not only purchase histories but also random keystrokes and scattered drafts of one’s flights of fancy. Overhearing conversations that the participants are lulled into assuming are private, when they are anything but, is the advertising model of the Internet.
Facebook, as far back as July 2010, according to USA Today, had over 500 million members who would upload more than 25 billion pieces of content each month. Twitter has more than 200 million active users, and the Library of Congress recently announced that it will be acquiring and permanently storing the entire record of public Twitter posts. As the New York Times noted on July 19, 2010, in reporting that factoid, “the Web means the end of forgetting.” Whether it also means the end of privacy depends on what control individuals have, and are able to regain, over how their personally generated data is compiled and disseminated.
It wasn’t until the first week of June 2013, however, that the nation’s media finally caught up with the news that for a decade Americans had abjectly surrendered their historic respect for individual privacy to the chimera of national security. That week, the NSA files leaked by Snowden revealed that the federal government has full access to all phone records and to the vast trove of presumably private personal data posted on the Internet.
The most sacred tenet of American individualism, the right to be left alone, had been squandered, almost without notice. In the wake of the 9/11 attacks, a snooper state had come to be accepted as the norm in which an already robust corporate intrusion into our privacy came to be conjoined with the heightened obsession with security at all costs. A new military-intelligence complex had come into its own based on the unprecedented power of spy agencies, unleashed by the urgency of a so-called War on Terror, fused with the previously unimaginable intrusion into private space wrought by the Internet revolution. The Constitution’s Fourth Amendment guarantee of the sovereignty of the individual—“The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated”—was being treated as an irrelevant relic of a bygone civilization.
It was a perfect storm. The snoopers, both state and corporate, now had the capacity for massive storage and search functions, able to move far beyond the level of surveillance ever deemed permissible, or even possible, by the constitutional framers. Now the government, rationalizing the excessive use of state power with a hysterically defended claim that the very existence of our country was threatened by terrorism, merged the private and the public data fields in order to create instant, absurdly detailed portraits of any citizen “of interest.” One’s most intimate habits—from private correspondence and book pages read to lists of friends and phone conversations—can be seamlessly merged with a detailed map of an individual’s DNA, both biological and social.
The radical transformation of technology today has turned the NSA “into the virtual landlord of the digital assets of Americans and foreigners alike,” a New York Times editorial aptly stated in response to the disclosures of the monitoring of commercially based data by the nation’s most invasive security agency. The mainstream media had finally begun to focus seriously on the loss of privacy, which was old news to a brave group of government whistleblowers who had attempted to sound the alarm and been severely punished for their efforts to hold their government accountable.
The boldest justification for unfettered government intrusion in our privacy came back in October 2001, when, without hearings or even serious discussion, Congress rushed to pass “an Act to deter terrorist acts in the United States and around the world, to enhance law enforcement investigatory tools, and for other purposes.” With less consideration than two months of fleeting thought in an atmosphere of panic could provide, two centuries of constitutional law had been swept aside with only sixty-six dissenting votes in the House and one in the Senate. President Bush signed the measure into law on October 26, 2001. Laws regarding wiretapping, searches of homes, examination of mail, the right to due process, and even freedom from arbitrary and secret arrest were suddenly neutered.
The power of the Patriot Act to undermine our freedoms was evident as a nagging thought even in the minds of some members of Congress who voted for its far-reaching provisions. They assuaged their consciences by scheduling the act to expire in four years. In March 2006, however, President Bush signed new legislation passed by Congress that reauthorized the most controversial provisions, and on May 26, 2011, President Obama, while traveling in France, used an autopen to extend the reach of onerous provisions of the act another four years.
As when the colonists championed the principles that became the Fourth Amendment during their war of independence against the British Crown, individual rights are most severely tested when foreign policy and national security are at the fore. Hunts for spies or subversive plots become more compelling under such circumstances and are generally backed by wartime laws based on the idea that the present crisis demands special measures.
The targets may be scapegoated citizens with some “foreign” attachment, as in the case of Japanese Americans interned in camps during World War II or, today, those Americans accused of terrorist ties or aspirations—usually Muslim—who can be arrested and even executed abroad without the restraint of due process or any sort of judicial procedure.
In July 2014, Human Rights Watch released a devastating report on the anti-Muslim bias in terrorism prosecutions. The lengthy report, written in collaboration with Columbia University Law School’s Human Rights Institute, showed that the FBI entrapped Muslims—including people with mental and intellectual disabilities—into appearing to be engaged in planning terrorist actions for which they were then arrested. “The report clearly shows, in many respects, the American public is being sold a false bill of goods,” said Andrea Prasow, one of the Human Rights Watch report’s authors. “To be sure, the threat of terrorism is real,” she said. “But in many cases we documented, there was no threat until the FBI showed up and helped turn people into terrorists.”
Wartime, and that includes any period of extreme fear of an enemy, including the irregular forces of terrorism or other subversion, is in every nation the great enabler of suppression of freedom—and it begins with the invasion of privacy, in order to collect the evidence of attacks, plots, or rebellions. It is also a time when governments can act with the impunity afforded by the secrecy that wartime demands, as the concealment of government’s purposes and evidence becomes formalized through classification of documents and communications. That, in turn, becomes the excuse for denying the sovereign rights of the individual.
As in other nations whose citizens have traded freedom for security, throughout our history our safety has been the excuse to weaken or destroy the freedoms of speech, press, and assembly our Constitution asserts as fundamental rights. And what that venerable yet resilient document could not anticipate is both the immense promise and the chilling threat to those freedoms posed by life in the digital age.
Suddenly, people anywhere, or at least those in the wired world, could view any document challenging the government’s case for going to war. Consider the attack of 9/11 as a marker in that information revolution, when, like no time before, the public could be riven by debates over what it takes to destroy a skyscraper, the political and religious motives of its destroyers, and the history of their connections with various governments. And when the Bush administration chose to invade Iraq in response to that attack, its claims of a connection between that country and al Qaeda, as well as the presence of weapons of mass destruction, were quickly exposed by documentary evidence, never before so readily and widely available, that contradicted the official narrative.
Of course, fomenting healthy debate is only the first step to making good decisions, and in this case the Bush administration, with the cover of powerful Democrats like Senators Hillary Clinton and John Kerry, pushed through a war regardless of the facts. Meanwhile, the mainstream media, never consistently brave in wartime, seemed cowed by the blind rage spawned by 9/11 and provided only minimal resistance to the tsunami of cherry-picked evidence Bush officials Dick Cheney, Colin Powell, and others used to intimidate Congress, the UN, and a good chunk of the public. (Or, as in the infamous case of Judith Miller, then of the New York Times, the press touted sightings of Iraqi weapons of mass destruction that turned out to be nonexistent.)
At the same time, and in the media’s defense, the Internet had, indirectly but devastatingly, weakened the traditional press, long led in content, if not audience, by daily and weekly print publications. Without glamorizing an often sordid, rancorous, or erratic industry, it must be remembered that the United States has an established and venerated tradition of muckraking, independent journalism going all the way back to the barbed commentary the founders themselves had squirmed under but which they nonetheless felt was necessary to protect. Undermined economically by everything from Craigslist (no more classifieds!) to the flattening power of search engines and news aggregators, the biggest reporting machines were suddenly operating under much-reduced budgets with smaller readership and reach, and competition from nontraditional—and free—news and content recycling sources like blogs and Twitter.
But two other factors also emerged—factors of great consequence in weakening the traditional media’s ability to cover national security issues. One was the government’s much-increased ability to use new media to propagate its message, bypassing the tough questions from traditional media. The other was that the modern technology of surveillance allowed the government to spy on independent reporters and their sources to a degree never before imagined, even in overtly totalitarian societies. This confluence of new surveillance technologies with the hyping of a never-ending but shadowy terrorism “war” ushered in a dangerous threat to freedom—particularly that of the press.
One of the best surveys of that transformation came in October 2013, in the form of the aforementioned report for the Committee to Protect Journalists written by former Washington Post executive editor Leonard Downie, Jr. Downie was a leading journalist who had steered the Post through some of that paper’s most legendary investigative exposés. His report was informed by interviews with some of the most accomplished veterans of that sort of journalism. It provides a chilling marker in the deterioration of press freedom in the surveillance age.
The report’s conclusions also delineated how scrambled the political spectrum had become regarding the protection of personal and press freedom. Although the report is quite tough in its criticism of Republican President Bush for seizing upon the trauma of the 9/11 attack to justify a vast increase in the prerogatives of the national security state, his administration was actually less damaging to press freedom than that of his Democratic Party successor, according to Downie. Writing under the subheading “Leak Investigations and Surveillance in post-9/11 America,” Downie argued:
U.S. President Barack Obama came into office pledging open government, but he has fallen short of this promise. Journalists and transparency advocates say the White House curbs routine disclosure of information and deploys its own media to evade scrutiny by the press. Aggressive prosecution of leakers of classified information and broad electronic surveillance programs deter government sources from speaking to journalists.
As we have seen, even before the Snowden bombshell there had been a series of cases brought by the Obama administration against reporters and their whistleblowing sources that illustrated just how effective massive electronic surveillance was in preventing them from conducting private conversations unless they deployed extraordinary encryption tools. These captured communications were already showing up as evidence in government legal proceedings against journalists and their sources with such regularity as to clearly reflect the use of a new—and massive—surveillance mechanism. The Snowden revelations, Downie wrote, confirmed this, adding a chilling caution to the relationship between reporter and source.
Excerpted from "They Know Everything About You: How Data-Collecting Corporations and Snooping Government Agencies Are Destroying Democracy" by Robert Scheer. Published by Nation Books. Copyright 2015 by Robert Scheer. Reprinted with permission of the publisher. All rights reserved.