By Other Means

Automatic for the People

How to end Obama's culture of secrecy in just a few lines of code.

George Washington University National Security Archive

Some dilemmas are unresolvable. Into this category, I’d place that old chestnut, "How much government secrecy is compatible with democracy?" There’s no "answer" beyond "some, but not too much." Insofar as debates about U.S. drone policy or Justice Department leak investigations are debates about secrecy and democracy, we tend to go round and round in circles without arriving anywhere.

But other dilemmas have relatively straightforward solutions. Take this one: How can we reduce wholly unnecessary government secret-keeping? How can we prevent government documents from being classified when they should be unclassified, or from being classified "top secret" when they should be classified "confidential" — and how can we ensure that classified documents are promptly unclassified when the reasons for their original classification cease to exist? This is a thorny dilemma, but far from unresolvable — and thanks to several recent studies, both governmental and non-governmental, we have a pretty good idea of how to get moving on reforms.

The problem of over-classification isn’t new. My first government job was at the State Department, during the waning days of the Clinton administration. During that period, I recall reading top-secret memo after top-secret memo, all signed by the same senior colleague, all dealing with a particular issue.

I’d tell you what the issue was, but then, as we say, I’d have to kill you. Believe me, that’s not because revealing it would do the slightest damage to U.S. national security. I’m keeping mum for one reason only: Long ago, I signed a bunch of terrifying papers swearing to give Eric Holder, his predecessors, and his successors my firstborn child if ever I should reveal the contents of a classified document.

Aside from that, there’s no reason for secrecy. Suffice it to say that these memos contained no information worthy of a top-secret designation. I was young and naïve, though, so at the time I assumed I must be missing something. I finally worked up the courage to ask my senior colleague directly: "Why," I wondered tentatively, "are all these memos classified top secret? Is this issue more sensitive than meets the eye?"

His response was brisk. "No, not really, but if I don’t put ‘top secret’ on my memos, people will think they’re not important, and no one will read them."

Since then, I’ve seen the same phenomenon over and over. To mix metaphors, classified information is the currency of the realm inside the national security sausage-making machine. Increasingly, it’s the only way to be special.

You don’t have a security clearance? You’re no one. You have a secret-level clearance? I’m sorry, a top-secret clearance is required for you to be part of this meeting. You have a top-secret clearance? Regrettably, this document is part of a compartmented special-access program and you’re not read-in. In fact, it’s part of a waived, unattributed special access program that only I and four other people know about! Sorry ‘bout that.

As the national security bureaucracy has expanded and more and more classified documents are produced, more and more people need security clearances in order to do their jobs. But as more and more people receive security clearances, the iron law of supply and demand kicks in, and the value of clearances goes down.

According to a 2010 Washington Post series on "Top Secret America," an estimated 854,000 people hold top-secret clearances. That’s not a very exclusive club: Any secret held by 854,000 people isn’t much of a secret. Throw in the people with lower-level clearances and we get up to more than 4 million, or nearly 2 percent of the adult population of the United States. Who let those guys into the club?

As a result, we keep finding new ways to distinguish between levels and types of access, and more and more documents are (often reflexively) given a high classification, even when there’s really no secret to keep. The U.S. government’s Information Security Oversight Office reported that 92 million decisions to classify information were made in 2011, representing a 20 percent increase in classification decisions from 2010 and a 40 percent increase from 2009.

And as I said, this problem isn’t new. An excellent 2011 report by the Brennan Center for Justice offers some choice glimpses into history. By 1956, only a decade and a half after an executive order signed by FDR launched the modern classification system, a DOD panel was already warning that "overclassification has reached serious proportions." In 1970, a Defense Science Board task force reported that "the volume of scientific and technical information that is classified could profitably be decreased by perhaps as much as 90 percent." In 1985, yet another DOD committee concluded sadly that "too much information appears to be classified." In 1994, a joint CIA-DOD commission found that "the classification system… has grown out of control."

Fast-forward to the 9/11 Commission’s report, which was every bit as damning: "Current security requirements nurture overclassification and excessive compartmentation of information among agencies." Even more recently, the congressionally established Public Interest Declassification Board (PDIB) concluded, in a November 2012 report, "The system is compromised by over-classification and, not coincidentally, by increasing instances of unauthorized disclosures."

Documents over-classified by U.S. government employees range from the embarrassing and illegal to the merely goofy. According to the Brennan Center report, FBI efforts to "completely discredit" Martin Luther King Jr. "as the leader of the Negro people" were classified, as were government radiation experiments involving human subjects. Ditto, oddly, a 1940s Navy study on sharks attacking human beings. A 2003 report by the National Security Archive notes that during a 1999 declassification review, a CIA employee even redacted a humorous portion of a 1974 memo dealing with a (wholly fictitious) effort by an "organization of uncertain makeup, under the name of ‘Group of the Martyr Ebenezer Scrooge,’…to sabotage the annual courier flight of the government of the North Pole. Prime Minister and Chief Courier S. Claus has been notified."

Fictional threats against "S. Claus" notwithstanding, some classified material needs to be kept secret. Even the most ardent WikiLeaks supporter might accept that launch codes for U.S. nuclear weapons don’t belong in the public domain, for instance, and the claim that we need to protect "intelligence sources and methods" isn’t a frivolous one. Some of the intelligence the United States collects comes from individuals who pass information to us at great personal risk. Even when the information itself has little value to us, revealing that we have it can lead foreign organizations to identify our source — sometimes with lethal consequences.

The problem, of course, is that from the outside there’s just no way for a reporter or a citizen to know for sure: Is access to a document restricted because it contains information that, if revealed, could truly harm the nation? Or is access restricted to cover up fraud or stupidity, or simply to make an uninteresting issue seem more important or help a mid-level bureaucrat gain status?

Closely linked to the problem of over-classification is the failure of government officials to declassify documents when the need for secrecy is over. In theory, most classified documents are classified for 10 years or less, after which they are "automatically" declassified. Under a Clinton-era executive order, those exceptional documents classified for longer periods must be "automatically" declassified after 25 years.

But as report after report has demonstrated, "automatic" declassification is usually anything but. Agencies may first review documents subject to "automatic" declassification to check whether they fall into any of the exemptions from declassification, and agencies lack the authority to declassify information initially classified by other agencies. As a result, the "automatic" declassification process can take years. In the era of electronic information, documents of all kinds have proliferated, worsening the backlog still more.

But here’s some good news: None of these problems is insoluble. To a significant degree, simple technological changes can address both the over-classification and declassification problems.

The trick is to shift the default. At the moment, the default is to over-classify rather than risk under-classifying. Erring on the side of over-classification is perceived as safer — those who inadvertently disclose classified information can lose clearances, jobs, and even their freedom, but those who over-classify face virtually no consequences in the current system. Over-classifying is also easier: In most existing secure email systems, for instance, the system defaults to repeating the highest classification setting of any email in a chain. As a result, even unclassified comments can end up inadvertently being classified.
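
The chain-inheritance default is easy to picture in code. Here is a minimal, purely hypothetical sketch — the level names and function names are my own invention, not any real system's API — contrasting the legacy default, in which a reply inherits the highest marking anywhere in the chain, with a reformed default that starts every new message unclassified and makes the author affirmatively raise it:

```python
from enum import IntEnum

class Level(IntEnum):
    """Hypothetical classification levels, ordered by sensitivity."""
    UNCLASSIFIED = 0
    CONFIDENTIAL = 1
    SECRET = 2
    TOP_SECRET = 3

def default_reply_marking_legacy(chain: list) -> Level:
    """Legacy default: a new reply inherits the highest marking anywhere
    in the chain, so one TOP SECRET message taints every later comment,
    even a wholly unclassified one."""
    return max(chain)

def default_reply_marking_reformed(chain: list) -> Level:
    """Reformed default: a new reply starts UNCLASSIFIED regardless of
    the chain; the author must deliberately raise the marking."""
    return Level.UNCLASSIFIED
```

Under the legacy rule, an unclassified scheduling note appended to a top-secret thread gets stamped top secret by default; under the reformed rule, the stamp has to be a conscious choice.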

The Public Interest Declassification Board’s report made 14 recommendations for addressing over-classification. Key suggestions included simplifying the multitiered classification hierarchy into only two categories, adding a requirement that classifiers specify the nature of the harm that is likely if particular information is revealed, and requiring classifiers to identify information with only "short-lived" sensitivity (e.g., certain tactical information) and tag it, in advance, for truly automatic declassification without further review once the short-term need for secrecy is over.

Both the 2012 PDIB report and the 2011 Brennan Center report note that much of this can be semi-automated. In our increasingly paperless world, most classification is done using electronic templates. Changing these templates can help shift the default from over-classification to classifying only to the minimum degree necessary. If, before a document can be finalized, classifiers are required to describe both the specific, identifiable harm likely to result from release of information and the specific harm likely to result from insufficient sharing of information, they’re less likely to reflexively over-classify. Similarly, computer-generated documents (or portions of documents) can be tagged for automatic electronic declassification and release once the date specified in the original classification has passed, without the need for cumbersome human review.
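
As a rough illustration only — the field names and rules below are hypothetical, loosely modeled on the report's suggestions rather than on any actual government system — a template that refuses to finalize a classified document without both harm statements and a sunset date, paired with truly automatic date-based release, might look like this:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ClassifiedDocument:
    """A hypothetical electronic classification template."""
    title: str
    level: str = "UNCLASSIFIED"
    harm_if_released: str = ""   # specific, identifiable harm release would cause
    harm_if_withheld: str = ""   # harm likely from insufficient sharing
    declassify_on: Optional[date] = None  # tagged at creation, not on later review

    def finalize(self) -> "ClassifiedDocument":
        """Refuse to finalize a classified document unless the classifier
        has justified both sides of the trade-off and set a sunset date."""
        if self.level != "UNCLASSIFIED":
            if not self.harm_if_released.strip():
                raise ValueError("describe the specific harm release would cause")
            if not self.harm_if_withheld.strip():
                raise ValueError("describe the harm of insufficient sharing")
            if self.declassify_on is None:
                raise ValueError("tag a declassification date up front")
        return self

def auto_declassify(docs: list, today: date) -> None:
    """Truly automatic declassification: once the tagged date has passed,
    the marking flips with no human review queue in the way."""
    for doc in docs:
        if doc.declassify_on is not None and today >= doc.declassify_on:
            doc.level = "UNCLASSIFIED"
```

The point of the sketch is the shifted default: the path of least resistance is now an unclassified document, and classifying one costs the author three deliberate keystrokes of justification instead of zero.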

I don’t mean to suggest that implementing such reforms will be easy. It won’t be. Antiquated and incompatible agency computer systems, budget restrictions, bureaucratic sloth, and passive resistance will plague any reform effort. But reform is far from impossible, given sufficient political will.

As the PDIB has argued, reform will require high-level White House leadership. President Obama has gotten off to a good start, signing the "Reducing Over-Classification Act of 2010" and issuing several executive orders to the same end. But he could do more, including acting on the PDIB’s recommendation that he establish a White House-led Security Classification Reform Steering Committee to oversee implementation of PDIB-suggested reforms.

After all, some information needs to be released if we are to preserve our American values. I mean: Don’t we owe it to the world to reveal the sinister 1974 plot against S. Claus and the government of the North Pole?

Rosa Brooks is a law professor at Georgetown University and a senior fellow with the New America/Arizona State University Future of War Project. She served as a counselor to the U.S. defense undersecretary for policy from 2009 to 2011 and previously served as a senior advisor at the U.S. State Department. Her most recent book is How Everything Became War and the Military Became Everything.