The Intelligence Community (IC) is undergoing the largest technological transformation in its history as a team of hundreds works to build a computer system that links together nearly all of the 17 intelligence entities through a series of classified servers. To call it an ambitious project might be an understatement. The architects of the undertaking aim to have an initial version running by the end of the year.
The chief information officers at the most prominent agencies of the Intelligence Community were assigned the mission last summer when Director of National Intelligence James Clapper began to brace for budget cuts that would hit the community hard. For the first time in a decade, the IC would be forced to downsize under the strain of a budget that could no longer maintain the expansive growth the community had experienced since the terrorist attacks of September 11, 2001.
By last fall, Clapper was paraphrasing a favorite quote from New Zealand physicist Ernest Rutherford who, in the midst of his own country’s budget deliberations in 1927, said, “We’re running out of money so we must begin to think.”
Clapper had heard talk for years about inefficiencies in the community’s information technology systems. Each of the agencies relied on its own IT system – problematic for information sharing, but comforting for information security. The traditional thinking seemed to be that the tighter the hold on the information, the easier it was to keep secure. When the budget issue presented itself, Clapper saw an opportunity.
“It was sort of a perfect storm kind of thing,” said Clapper, who had kept an eye on cloud computing technology and came to realize that getting the agencies aboard a single cloud might yield not just significant cost savings over time but also information sharing benefits in the shorter term.
What is a Classified Cloud?
The idea of putting the country’s sensitive intelligence information in a cloud may not sound all that safe since top intelligence and law enforcement officials have warned that the cyber threat facing the nation may surpass the threat of terrorism. To understand the risk, you have to understand the cloud concept.
Imagine the Internet as essentially a wide array of servers scattered around the world. They talk to each other, and they can store data. A cloud is where that data lives.
Businesses and individuals can opt to store information on those vast servers because doing so offloads the storage and processing burden from their own local machines. For a business, it means being able to run complicated programs without having to pay for the expensive technology needed to run them. It’s kind of like renting the space.
The CIA has been using cloud technology for years, but there’s a key difference between public servers and the Intelligence Community’s cloud. In the IC, the cloud does not live on a public server. The challenge until now was figuring out how to make the community’s clouds interoperable. The model the community is now moving toward tasks individual agencies with acting as service providers to the rest of the community – essentially creating their own IT support structure. This is a closed cloud concept. The hope is that by making the cloud internal to the IC only, it will mitigate the risk of intruders accessing sensitive or classified information.
Still, Clapper knew that selling such a concept across intelligence agencies that pride themselves on independence would require a different skill set entirely.
The Man for the Job
Getting all of the agencies on board with any initiative can be tricky. Clapper needed someone who would walk in the door with credibility. He turned to his principal deputy, Stephanie O’Sullivan, who herself hailed from the most protective of the intelligence agencies, the CIA. Years earlier, O’Sullivan had met Al Tarasiuk, who served as CIA’s chief information officer for five years. He was on assignment in Europe when O’Sullivan called.
Tarasiuk’s new mission: to sell IC leaders on an initiative requiring them to surrender a measure of their independence.
He says he knew what challenges were ahead.
“It’s about enabling integration and enabling information sharing,” Tarasiuk said. “The big cultural change is that we agreed to a new business model on how we would manage this.”
Three months into execution, hundreds of people are involved in the effort. Their ultimate mission will also require the creation of a single desktop software program shared across the community. When analysts or operators or engineers log on to their computers, they will have the same set of tools at their fingertips.
It’s kind of like a super secret version of social media that the architects hope will enable missions that now take months to plan to be done in a matter of minutes. But with reward comes risk.
Risk and Reward
Intelligence operatives are used to taking risk. Less so, perhaps, are the analysts.
A recent study by the non-profit Intelligence and National Security Alliance (INSA) took a hard look at the way the community plans to implement the cloud concept. INSA researchers specifically looked at recent mission failures and determined that many were caused either by the way intelligence information had been compartmentalized into ‘data silos’ or by the way people had been managing documents.
“In most cases,” the INSA report said, “the critical piece of information necessary for mission success was already possessed. The failure was not in obtaining the information but in locating and applying it to the mission.”
In a case like that, the cloud – and more importantly the bigger IT infrastructure change – could really help improve efficiencies if it works the way it is intended.
But there are other, more significant risks to consider as the transformation develops over the next several years. Tarasiuk says he’s aware of the challenges.
“We accepted some risk in moving out very quickly,” he said.
He won’t talk in detail about those risks, but does point out that the Director of the Cyber Command, General Keith Alexander, calls this a more defensible infrastructure. The idea: if there is only one door into a house, that door is far easier both to monitor and to protect. He also argues that with the new infrastructure, once a problem is detected, it’s easier to address.
“When it’s centralized you can implement patches,” Tarasiuk said. “If you have a malware issue, a vendor comes out and says, ‘Hey, we’ve got a code problem here,’ and they do a patch right away. Today, (in the IC) the way we have to do a patch, is it’s distributed to every agency. They do it on their own; we’re not involved.”
There are some increased risks that Tarasiuk won’t talk about but the Government Accountability Office (GAO) has identified.
On Tuesday, the GAO released a report identifying what it called key supply chain-related threats: the installation of defective hardware or software; counterfeit hardware or software; the intentional installation of harmful hardware or software; the failure of critical products; and the reliance on malicious or unqualified service providers for technical support.
By setting up its own IT force inside the IC, the community might mitigate that last one. But all are concerns that the architects say they have considered.
Of course, there is one threat that may be far harder to mitigate.
The Bad Actor Scenario
Even a trusted insider with appropriate clearances can become a risk.
A ‘bad actor,’ as they are called – a person who spies on his or her own country – could theoretically have a field day with the new system. It’s a nightmare scenario for the architects.
One need only look at the WikiLeaks disaster, which the military says involved just one person accessing data and passing it along to the whistleblower website – which then published sensitive government cables.
“Insider threat is always something that we watch,” Tarasiuk said. “I know that private companies are very worried about the same kind of issues, and we have ways to deal with that. I think we as a government have to be sure that we put a lot of emphasis on safeguarding.”
The architects hope to help mitigate that risk by tagging data. The NSA already does this: it takes note of who accesses what data and, perhaps more importantly, what they do with it. Tagging the data is critical to keeping the cloud and the new architecture safe.
“We’re also tagging people,” Tarasiuk said. “When they authenticate themselves into the system, very much like we do today, the system will recognize that they have certain attributes and will allow them to see certain data.”
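The mechanism Tarasiuk describes – tagged data matched against tagged, authenticated users – is a form of attribute-based access control. Here is a minimal sketch of the idea in Python; the names, attributes, and audit step are invented for illustration and do not reflect the IC's actual, non-public implementation.

```python
# Illustrative sketch of attribute-based access control ("tagged data,
# tagged people"). All names and tags below are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class Document:
    title: str
    required_tags: frozenset  # tags a user must hold to see this document

@dataclass(frozen=True)
class User:
    name: str
    tags: frozenset  # attributes recognized when the user authenticates

audit_log = []  # the system also records who tried to access what

def can_access(user: User, doc: Document) -> bool:
    # Access is granted only if the user holds every tag on the document.
    allowed = doc.required_tags <= user.tags
    audit_log.append((user.name, doc.title, allowed))
    return allowed

cable = Document("field-report", frozenset({"TS", "HCS"}))
analyst = User("analyst", frozenset({"TS"}))
officer = User("officer", frozenset({"TS", "HCS"}))

print(can_access(analyst, cable))  # False: analyst lacks the "HCS" tag
print(can_access(officer, cable))  # True: officer holds all required tags
```

The audit log mirrors the article's point that tagging is as much about recording who touched what data as it is about blocking access in the first place.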
Richard “Dickie” George spent 41 years at the National Security Agency. He now works as a senior cyber expert at the Johns Hopkins Applied Physics Laboratory, which works on government projects. George says you have to be careful who is allowed to see what when you’re offering a view of the cloud.
“We have authorities to deal with different information. So if you’re putting it in a big cloud, you really have to be extremely careful,” he said.
George cites legal requirements that control who is allowed to see what, and the challenge of keeping that straight in a cloud architecture.
“It makes it easier to share because everybody has access to more information,” George said. “You have to be careful that people don’t inadvertently gain access to information they aren’t supposed to have and that data tagging – that labeling of information to ensure that only people with the proper authorities have access to it – that’s a critical part of the game.”
But as the man tasked to get this done, Tarasiuk has one more concern.
“My biggest worry is that we sold the big picture,” he said. “Theoretically it’s great. But now it’s making those things work together on the scale we are talking about. So I have a little fear on the technology side. Not that we can’t do it, but that it will take a little longer than we think it should.”