Author: Betty Medsger
The expanded use of invasive surveillance was made possible by the passage, shortly after 9/11, of the
Patriot Act, which considerably loosened restraints on the FBI. As the Bush administration was about to leave office, Attorney General
Michael Mukasey, a former federal judge in New York, issued new attorney general guidelines for the FBI. With his rewriting of the guidelines, first put in place in 1976 by Attorney General Levi and modified repeatedly since then, the FBI could operate with even fewer restrictions.
Agents could target individuals without a clear basis for suspecting they were planning a crime.
Mukasey validated the casting of wide nets that pulled people into investigations on the basis of their race, religion, or political activities, as long as they met any additional criteria that drew suspicion. It was widely assumed that
Eric Holder, attorney general in the Obama administration, would revise Mukasey's FBI guidelines and restore some of the restrictions.
Instead, the Obama administration loosened some FBI guidelines further. It also deepened secrecy about national security policies and practices, including secrecy about the policies and laws governing its greatly increased use of unmanned drones overseas to kill suspected terrorists and conduct surveillance.
AS THE BUREAU STRUGGLED
to transform itself after 9/11, it was forced to embrace new methods and new tools in order to fulfill its greatly expanded intelligence-gathering mission. New data collection and analysis systems were installed. The new equipment and data systems were to be used by agents who, because of wrongheaded decisions by directors in the 1980s and 1990s, had little familiarity with even minimal computer use. The computers in their offices were so old that agents could not send or
receive email, and they could not do basic searches of bureau files from their computers, let alone conduct searches on the Internet. Millions of dollars were spent on failed attempts to update and integrate the bureau's computer systems.
Current technology was not the only important element missing at the FBI in 2001.
So were crucial language skills. Thousands of hours of Arabic-language phone conversations recorded by the bureau before the 9/11 attacks had not been translated because the bureau had few translators who understood Arabic.
Three years after 9/11, a report by the inspector general revealed that so many recordings believed to be relevant to terrorism had not been translated that, as Senator
Charles E. Grassley, Republican from Iowa, said at the time, the FBI was still drowning in information about terrorism activities. Even with a great increase in the number of translators of Arabic, Farsi, and other languages, the influx of new material from wiretaps and other intelligence sources vastly outpaced the bureau's translation capacity. By 2004, more than 123,000 hours of audio recordings collected since 9/11 in languages associated with terrorism had not yet been translated. For all languages, nearly half a million hours of audiotapes, or 30 percent of what had been collected since 9/11, had not been translated. A bureau rule required that audio recordings directly related to active
al-Qaeda investigations be transcribed within twelve hours of interception, but because of the backlog, such transcriptions routinely were not made until at least a month after they were recorded. Computer problems aggravated the backlog further. Without FBI officials realizing it, as computer drives filled with recordings to be translated, older recordings, including many that had not been translated, were automatically deleted, never to be heard. In an understatement, the inspector general's report said the bureau faced “significant management challenges” in developing quick and accurate translations.
The bureau got more money, more agents, more informers, more translators, more analysts, more computers, and much, much more data to use in its effort to prevent another terrorist attack, an effort that by now has continued for more than a decade.
As this growth took place, the bureau became part of the vastly expanded national security system that has burgeoned since 9/11 in government and private industry, a system that Washington Post journalists Dana Priest and William M. Arkin call “Top Secret America.”
In their series and book based on their extensive examination of the expansion and quality of the nation's security system a decade after 9/11, they concluded that “the top-secret
world the government created in response to the terrorist attacks of September 11, 2001, had become so large, so unwieldy and so secretive that no one knows how much money it costs, how many people it employs, how many programs exist within it or exactly how many agencies do the same work.”
The influx of information became enormous. Priest and Arkin reported in 2011 that the National Security Agency, the government's major surveillance system, “now ingests 1.7 billion pieces of intercepted communications every twenty-four hours – telephone calls, radio signals, cell phone conversations, emails, text and Twitter messages, bulletin board postings, instant messages, website changes, computer network pings, and IP addresses.”
As large digital haystacks have grown in the FBI, and in other intelligence agencies, from a constant inward rush of new data from collection systems, other government agencies, and private industry, so has frustration about developing the capacity to extract valuable information from them. The expanding flow of data, and the consequent greater demand for analysis, was meant to increase the bureau's capacity to discover and piece together crucial information that, if acted upon, might prevent another attack. But some people question whether that endless flow may instead decrease the bureau's capacity to succeed at that task. The sheer volume is overwhelming at times, much of it duplicative and much of it ignored or never seen. A system meant to send alerts to its users often instead produces a numbing effect.
Assessing the impact of the volume of undifferentiated information intelligence agencies receive daily, Richard Clarke, who served as chief counterterrorism adviser on the National Security Council under both President Bill Clinton and President George W. Bush, said in a 2013 interview, “More is good. A hell of a lot more can be bad.”
THERE HAVE BEEN
alarming signs that the systems put in place have not worked well at crucial times.
For instance, available clues were not connected, or were not taken seriously, in two major attacks, one that happened and one that failed: the rampage shootings by an Army psychiatrist at Fort Hood that resulted in the deaths of thirteen people, and the failed attempt by a Nigerian radical,
Umar Farouk Abdulmutallab, to bomb Detroit as the commercial jet in which he was a passenger landed there on Christmas Day 2009.
In the latter case, the would-be attacker's father had informed officials at the American embassy in Lagos, Nigeria, about his son's radicalization and interest in attacking the United States. In the end, no steps
were taken to prevent him from entering the country. His effort to bomb Detroit failed because he was tackled by a passenger who observed him trying to ignite explosives hidden in his underwear. Explicit information about the Christmas bomber's plans was missed, Priest and Arkin reported, because, as an official admitted to them, “the system had gotten so big that the lines of responsibility had become hopelessly blurred.” As far as the FBI was concerned, the information in its own files about Abdulmutallab was a secret, a secret inaccessible to the FBI itself.
After the Senate Intelligence Committee investigated the handling of the failed Detroit bombing, it issued a report that was a sweeping indictment. The problems included failure of agents to communicate with one another as well as mistakes in computer programming. A counterterrorism analyst at the FBI never received relevant information sent to her about Abdulmutallab because the incorrect configuration of her computer profile blocked reception of the information. Commenting on the glitches that prevented intelligence agencies from stopping the Christmas bomber before he nearly succeeded, former Senator
Christopher S. Bond, Republican from Missouri, then the vice chair of the Senate Intelligence Committee, summarized the persisting problem:
“We cannot depend on dumb luck, incompetent terrorists and alert citizens to keep our families safe.”
In another Senate report, “A Ticking Time Bomb” (this one an investigation of intelligence agencies' handling of the Fort Hood attack by the Senate Committee on Homeland Security and Government Affairs), the committee found that crucial information that might have averted the attack was mishandled as it moved among FBI offices and in exchanges between the FBI and military intelligence offices.
JUST AS
Hoover's secret FBI's emphasis on conducting political surveillance and dirty tricks (along with his emphasis on solving easy crimes, stolen cars and bank robberies, while neglecting the crimes that damaged society most, organized crime and government corruption) distorted the mission and competence of the bureau while he was director, the difficulty of accessing and analyzing the enormous data collections of Top Secret America has threatened bureau competence since 9/11. When the bureau does not know what dots it has, it cannot connect dots.
The new FBI secrecy is potentially dangerous to the public in two ways: first, in regard to what the FBI needs to know but cannot find, and
second, in regard to what the FBI does not need to know but has been collecting since 9/11 and storing in the vast databases shared across the nation's seventeen intelligence agencies. That information about millions of Americans (much of it needlessly collected, much of it duplicative, much of it irrelevant to the mission of the FBI or any other intelligence agency, much of it so overwhelming in volume that it is ignored) sits there, available, as Hoover's secret files were, for potential abusive use at any time, including invasion of privacy and inhibiting
civil liberties.
Since 9/11, many people in the FBI have engaged in near-heroic efforts to keep the bureau an effective and law-abiding agency in the face of major operational changes and massive internal and external pressures to succeed in its assignment as the nation's chief bulwark against another terrorist attack in the United States. To some degree, struggles inside the bureau reflect conflicting public concerns. Many Americans are very afraid of another attack taking place in the United States and want maximum protection from the bureau, even if liberties must be sacrificed. At the same time, many other Americans feel conflicted. They too are afraid, but they do not want fear of enemies, external or internal, to overwhelm either the bureau's capacity to protect the country or its ability to function as a lawful intelligence and law enforcement agency that also protects privacy and civil liberties.
WHILE THE FBI
was spending millions of dollars in the years immediately after 9/11 on repeated attempts to create a useful bureau-wide computer system, one that would make it possible for agents to be able, finally, to send email messages to one another and do basic searches of the bureau's own files, the
National Security Agency (NSA) was executing a far more advanced high-tech plan. During that time the agency put in place sophisticated equipment and software programs that allowed it to monitor and absorb the world.
The most startling aspect of that expansion was the NSA's decision less than a month after the 9/11 attacks to aim its powerful electronic surveillance equipment at Americans' communications and take it all in, literally. Until then, the mission of the NSA, the nation's largest intelligence agency, had been limited to surveillance of enemies overseas. Now it was targeting law-abiding Americans, hoping to find the few terrorists among them, and also conducting blanket surveillance of the citizens of some of America's closest allies.
Phone calls by landline or cell, email messages, Internet searches, text messages, Facebook messages, audio messages, video streaming: the NSA was
accessing all of it and storing it.
As the NSA exponentially increased its surveillance powers, so did the FBI. As the
NSA's main partner in surveillance operations, the FBI now had access to the vast array of domestic intelligence retrieved by the NSA. This data was the primary source of the huge new haystacks that grew inside the FBI after 9/11. With its capacity expanded by the NSA's cutting-edge surveillance equipment, the bureau was now able to sweep up more information than the late FBI director J. Edgar Hoover could have imagined.
The extent and nature of the NSA's expansion since 9/11 had only been hinted at before June 2013, when a former NSA contractor, Edward J. Snowden, released to the journalists Laura Poitras, a documentary filmmaker, and Glenn Greenwald, then of The Guardian, NSA files that provided extensive evidence of the vast expansion and how it operated. The evidence in the files raised profound questions. In addition to important personal and political questions about the impact of mass surveillance of Americans and the residents of other countries, the penetrating capacity of the new NSA, and by extension of the new FBI, raised the possibility that
Internet freedom was being seriously damaged by its use by powerful intelligence agencies as a giant surveillance machine. Equally important were questions about the Internet now being seen by the government as a mechanism for new forms of warfare and, indeed, as a new landscape for war. As one top-secret memo put it, cyber operations will be turned into “
another capability alongside air, sea and land forces.”