Intelligence agencies open doors to long-awaited cloud marketplace, invite analysts and developers to tinker with commercial technologies.
The U.S. intelligence community has just opened a new marketplace for cloud applications. The idea is to let analysts and developers test-drive thousands of commercial data analytics tools for a pittance, without waiting for their agencies to commit large amounts of time and money through the usual government procurement channels.
The marketplace, which opened last week, is integrated into the private cloud that Amazon Web Services built for the Central Intelligence Agency as part of the Intelligence Community Information Technology Environment (ICITE). The classified storefront is modeled on the one Amazon uses for its public cloud, which lets developers and users “pay by the drink” while they’re evaluating various software tools, development platforms and even entire operating systems. IC staff will get access to those tools instantaneously rather than having to wait for the government acquisition process to procure them one by one.
“The ability to use the marketplace to try out, essentially for pennies, something upwards of 2,000 IT products is going to be a huge benefit,” said Tom Hall, the technical director and chief data officer in the Office of the Director of National Intelligence. “You can drop in a dime and play with something all you want.”
Hall declined to say precisely which tools the IC plans to include in its classified marketplace.
“But it’s basically any database system that anybody’s invented for market analytics in the real world that could also be used in the intelligence business,” he told a forum organized by the Intelligence and National Security Alliance and Defense One.
Although the marketplace project has been in the works for months, officials said it's too early to predict exactly how IC employees will use it, since it has only just come online. Product offerings are sparse for now, and intelligence agencies are using the marketplace primarily for market research, said Thomas Husband, the chief of the mission transformation task force at the Defense Intelligence Agency.
“We’ll be developing teams that will look at what capabilities these commercial tools provide, and the focus is on the capability — not on any individual company,” Husband said. “We’ll apply whatever analytic problem it is we’re trying to resolve, and then we’ll evaluate how each characteristic of these tools might apply to our problems. Then, through feedback from our users who can tell us whether they’ve developed better intelligence insights as a result of using those, we can make sound judgments about exactly the products and services we’re going to need in the future. I think we’re also going to have a lot of new software development as a result of that feedback.”
The broader ICITE project, a major priority for Director of National Intelligence James Clapper, is premised on the notions that the overall intelligence community must cut its IT costs during a time of declining budgets, but that it can also increase the quality of its analytics if each of the 16 intelligence agencies shares both its computing resources and its data.
ICITE, officials said, is well into its execution stage. A common desktop, jointly developed by the National Geospatial-Intelligence Agency and the Defense Intelligence Agency, has grown from 17,000 users one year ago to 50,000 as of this month. The project has also led agencies to consolidate hundreds of one-off software licenses into 14 enterprise licensing agreements that span the entire IC.
ICITE’s implementation focus is increasingly shifting toward the commercial cloud computing project, led by the CIA and the NSA, because officials see that component of the plan as critical to the “intelligence integration” Clapper has made his mantra since taking office as DNI in 2010.
Whether data lives in the Amazon-operated cloud or in the government-run variant first built by the NSA, the ICITE framework aspires to make it available to every agency, with access governed only by an individual's security authorization to see a given piece of information, not by which agency signs his or her paycheck.
That is a major technological change from the way intelligence agencies have historically managed their data, but the bigger challenges involve a workforce whose traditional habit is to treat intelligence information as the exclusive property of one agency, one office or even one person.
But Sally Holcomb, the deputy CIO at the National Security Agency, which began its migration to the government-operated cloud environment four years ago, said employees’ attitudes tend to change after seeing that other agencies’ intelligence can improve the quality of their own intelligence analyses.
“Working with folks to show them what can happen when the data is correlated has always gone very well in terms of outcome,” she said. “When you have all of these data sources combined and in a common format, analysts who may or may not be authorized to access the raw data can still gain value from that information when the analytic tools run across it. The analytic gets the access, puts out the result, and the analyst can combine it with their own information. That’s what cloud does for us because of the way it’s built. We can do very granular protection right down to the object level, protect the raw data from people who aren’t cleared to see it, but still let them use that information to correlate data between agencies. That’s not something we’ve been able to do before.”
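What Holcomb describes amounts to object-level access control: each raw record carries its own visibility label, analytics run with broad authorizations, and only the derived result is released at a marking the analyst is cleared to see. The sketch below is a minimal, hypothetical illustration of that pattern; the labels, record contents and function names are invented for the example and are not the IC's actual markings or the NSA cloud's real interfaces.

```python
# Hypothetical sketch of object-level protection: raw records carry visibility
# labels, an analytic service runs with broad authorizations, and the derived
# result is released at a wider marking an analyst without raw-data access can use.
from dataclasses import dataclass

@dataclass(frozen=True)
class LabeledObject:
    data: dict
    visibility: frozenset  # authorizations required to read this object

def can_read(obj: LabeledObject, authorizations: set) -> bool:
    """An accessor may read an object only if it holds every required authorization."""
    return obj.visibility <= authorizations

# Raw reporting from two hypothetical agencies, protected at the object level.
raw_records = [
    LabeledObject({"subject": "ship-42", "port": "A"}, frozenset({"AGENCY_X_RAW"})),
    LabeledObject({"subject": "ship-42", "port": "B"}, frozenset({"AGENCY_Y_RAW"})),
]

def correlation_analytic(records, service_auths):
    """Runs with the analytic service's broad authorizations, correlates raw
    objects, and emits a derived result carrying a wider releasability label."""
    readable = [r for r in records if can_read(r, service_auths)]
    ports = {r.data["port"] for r in readable if r.data["subject"] == "ship-42"}
    return LabeledObject(
        {"subject": "ship-42", "ports_visited": sorted(ports)},
        visibility=frozenset({"IC_ANALYST"}),  # result releasable more broadly
    )

service_auths = {"AGENCY_X_RAW", "AGENCY_Y_RAW", "IC_ANALYST"}  # the analytic's access
analyst_auths = {"IC_ANALYST"}                                   # no raw-data access

result = correlation_analytic(raw_records, service_auths)
assert not any(can_read(r, analyst_auths) for r in raw_records)  # analyst can't see raw data
assert can_read(result, analyst_auths)                           # but can use the result
print(result.data)  # {'subject': 'ship-42', 'ports_visited': ['A', 'B']}
```

The design point is that the protection travels with each object rather than with the system or agency that holds it, so enforcement does not depend on which organization runs the query.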
But NSA’s experience also showed that moving users to common desktops and convincing them to experiment with new cloud-based tools is a non-trivial matter — and not just because of tribalism between intelligence agencies. It’s also because learning new IT systems is not necessarily at the top of an average analyst’s priority list, given his or her large daily workload.
“Transitioning away from the way you’ve been doing your work and into a new set of IT tools and searching for information can be overwhelming in and of itself,” Holcomb said. “So what we did to break through that was to set up what we called Fat Tuesdays. The FAT stood for Foreign Architecture Transition. Once a week, analysts were only allowed to use our cloud tools for their normal work. They may not have gotten the same number of reports done as they would have on a normal day, but they got used to what the new IT process was going to be like. You have to build in time for workforce familiarization and make time for that change to occur.”
Getting the workforce’s buy-in to the new tools being deployed via ICITE is critical, since ODNI and the agencies it oversees have already invested billions of dollars in the new IT approach.
Hall said the IC’s leadership views ICITE in general, and its cloud plans in particular, as the only way for U.S. intelligence technology to keep pace with commercial industry, and as its best bet for maintaining an intelligence advantage over potential adversaries.
“In typical federal acquisition fashion, we were obsolete before we even started ICITE,” he said. “So one of the key commitments we made from the start was to buy commercial and stay commercial, stay at the edge of technology and not to do a classic government systems acquisition. If we bought our cloud that way, it would be obsolete in a week. That’s not what we did. We didn’t buy a system, we bought a vendor who’s obligated to stay up with the state of the art, drive the state of the art, and keep us as close as humanly possible to the edge of technology. We’re not telling the vendor what to build, we’re telling the vendor to build for us what they’re building for the rest of the world, and they’re measured on how quickly they can do that.”
Jared Serbu is deputy editor of Federal News Network and reports on the Defense Department’s contracting, legislative, workforce and IT issues.