As a sprawling agency, the Department of Health and Human Services has equally varied cloud computing needs. HHS encompasses research, medical operations, regulation and medical testing, and finance and verification as a third-party payer. It performs these functions – lab to bedside – on a very large scale.
“We have 11 operating divisions across the nation,” said George Chambers, executive director of application and platform solutions at HHS, at Federal News Network’s Cloud Exchange 2023.
Chambers said the pandemic response “became a convergence of all those things, and the need for real-time systems to deal with them at pace during the escalation.”
“Needless to say, our ability to use data stores within the cloud, exchanges that were cloud based, and the tools we provided for analytics were critical in our ability to respond to the pandemic,” Chambers said.
If any good came from the pandemic, then, it included acceleration of cloud adoption and maturity, Chambers said. He said cloud computing doesn’t necessarily save money, but for the same money “you have a broader spectrum, your reach increases, your ability to interact with internal and external organizations becomes paramount.” He added that HHS already had a cloud-first approach, and the pandemic “bolstered and reinforced that need.”
Data, application boundaries
Management of data becomes a challenge in the cloud, Chambers said, because of how cloud changes system boundaries. Typically, he said, a system boundary follows geographic lines. Or it may be associated with a specific function or domain – in HHS’s case, research, operations or direct patient care. Traditionally, data has remained associated with an application.
But because data has become more of a resource for multiple applications, “you’ll see an emerging practice of data exchanges, protected data exchanges, in the cloud, where you can in fact put data out there that has already been culled, already been curated, and is available for consumption,” Chambers said.
He’s also watching technology development supporting a data fabric, as opposed to having all data sources gathered into a data lake.
“In my mind, you want to move toward the notion of a distributed data set of sources,” Chambers said. In that notion, applications draw on an authoritative data source, “but use those tools that can normalize the data, that can curate the data, without data having to be moved to a central environment. I don’t want to necessarily have to move data in order to normalize it, to curate it, and to make it usable for others to be consumed.”
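The distinction Chambers draws – querying distributed, authoritative sources and normalizing data in flight, rather than copying everything into a central lake – can be sketched roughly as follows. This is an illustrative toy, not HHS’s architecture; all source names, fields and unit conversions here are hypothetical.

```python
# Hypothetical data-fabric-style query: each division keeps its
# authoritative records in place, and a thin normalization layer maps
# them to a common schema at read time, so nothing is copied into a
# central lake. All names and fields are illustrative.

# Records as they live in two hypothetical division systems, each with
# its own field names and units.
RESEARCH_DB = [{"subject": "A-17", "result_mg_dl": 98}]
CLINICAL_DB = [{"patient_id": "a-17", "glucose": 0.98, "unit": "g/L"}]

def normalize_research(rec):
    # Research data is already in mg/dL; only field names change.
    return {"id": rec["subject"].lower(), "glucose_mg_dl": rec["result_mg_dl"]}

def normalize_clinical(rec):
    # Clinical data arrives in g/L; convert (1 g/L = 100 mg/dL).
    return {"id": rec["patient_id"].lower(),
            "glucose_mg_dl": rec["glucose"] * 100}

SOURCES = [(RESEARCH_DB, normalize_research),
           (CLINICAL_DB, normalize_clinical)]

def query(record_id):
    """Pull matching records from every authoritative source,
    normalized in flight rather than pre-centralized."""
    return [fn(rec) for db, fn in SOURCES for rec in db
            if fn(rec)["id"] == record_id]

print(query("a-17"))
```

The point of the sketch is that curation logic lives next to each source, so consumers get one schema without the data ever moving to a central environment.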
When it comes to applications and supporting services, HHS’s multi-cloud environment requires special management attention.
“There are many facets to cross-cloud, multicloud management,” Chambers said. For applications, “one of the first technologies that we’re using are around the management of [application programming interfaces], where we have, instead of formal integrations taking place between applications, we’re pushing toward the use of APIs,” he said.
He said the APIs are written for multiple platforms and multiple cloud environments, meaning the tracking of numerous APIs is itself a focused activity. He said HHS employs a platform to ease management of the library of APIs “to be consumed by the development and missions application groups, and not have to be replicated. APIs can be a shared service.”
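A shared catalog of the kind Chambers describes – one registry the operating divisions consult instead of each rebuilding the same integration – might look, in a deliberately simplified form, like this. Every class, URL and division name below is invented for illustration.

```python
# Minimal sketch of a shared API catalog: a division registers an API
# once, tagged with its cloud environment, and other teams look it up
# rather than re-implementing the integration. Purely illustrative.

class ApiCatalog:
    def __init__(self):
        self._apis = {}

    def register(self, name, base_url, cloud, owner):
        # Duplicates are rejected: an existing API should be consumed
        # as a shared service, not replicated.
        if name in self._apis:
            raise ValueError(f"{name} already registered; reuse it")
        self._apis[name] = {"base_url": base_url, "cloud": cloud,
                            "owner": owner}

    def lookup(self, name):
        return self._apis.get(name)

    def by_cloud(self, cloud):
        # Tracking APIs across multiple clouds is itself a focused
        # activity, so the catalog supports filtering by cloud.
        return sorted(n for n, a in self._apis.items()
                      if a["cloud"] == cloud)

catalog = ApiCatalog()
catalog.register("grants-lookup", "https://example.invalid/grants",
                 "cloud-a", "Division1")
catalog.register("facility-status", "https://example.invalid/facility",
                 "cloud-b", "Division2")
print(catalog.by_cloud("cloud-a"))
```

In practice this role is played by an API management platform rather than hand-rolled code, but the registry-and-lookup shape is the same.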
Because HHS is so highly federated, with each component having its own IT capabilities, Chambers said his office acts as a provider of “shared services that can be consumed by any of the operating divisions.”
He added, “In essence, they have their own expertise that will allow them to achieve their mission. It’s our job to not only provide oversight, so things don’t go completely off the rail, but also to enable shared services.” As such, the application and platform solutions group provides toolkits and a knowledge base of information to help the other divisions prepare their applications and data for cloud hosting. Chambers said he and his team give careful consideration to what is suitable for migrating to the cloud, and in what priority order, with the proviso that some assets will remain inside HHS data centers.
“And there are reasons for not going to the cloud, whether it’s proprietary protection or specific use,” he said. “For labs, sometimes you have a proximity requirement as well.”
Still, HHS has what Chambers characterized as a mountain of legacy applications the IT groups are chipping away at. He said high-value systems in use by multiple HHS agencies – applications that are well documented – are higher on the priority list for rationalization.
“My team has been responsible for moving in the past year and a half, two years, probably 35 to 40 systems to the cloud,” Chambers said. For the most part, he added, they could be cloud-enabled without needing a full rewrite.
Beyond a tally of applications moved, Chambers said it’s important, and challenging, to have performance metrics in place “to understand how your technology has achieved this goal or not.”
While there’s no standard set of metrics, Chambers named four metrics his group applies to cloud applications for which it has oversight. They measure cycle time, throughput, quality and resources needed to make the process happen. Quality measures things like errors, number of reboots, data integrity and total cost of ownership.
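The four measures Chambers names – cycle time, throughput, quality and resources – could be computed from a basic log of completed work items along these lines. The record layout, numbers and the choice of error rate as the quality proxy are all hypothetical, not HHS’s actual reporting.

```python
# Hypothetical computation of the four metrics named above from a
# simple log of work items. Field names and values are illustrative;
# quality is proxied here by errors per item.

items = [
    {"start_day": 0, "end_day": 4, "errors": 1, "cost": 1200},
    {"start_day": 1, "end_day": 3, "errors": 0, "cost": 800},
    {"start_day": 2, "end_day": 7, "errors": 2, "cost": 1500},
]

window_days = 7  # observation window for the throughput figure

# Cycle time: average days from start to finish per item.
cycle_time = sum(i["end_day"] - i["start_day"] for i in items) / len(items)

# Throughput: items completed per day over the window.
throughput = len(items) / window_days

# Quality: average errors per completed item.
error_rate = sum(i["errors"] for i in items) / len(items)

# Resources: total spend over the window.
resources = sum(i["cost"] for i in items)

print(f"cycle time: {cycle_time:.1f} days")
print(f"throughput: {throughput:.2f} items/day")
print(f"errors/item: {error_rate:.2f}")
print(f"resources: ${resources}")
```

With numbers like these in hand, Chambers’ point follows directly: a change to a process shows up as a measurable shift in one of the four figures.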
“These fundamentals are challenging, but meaningful,” Chambers said, “because that’s how you can make a change.”
He added, “When you know what your cycle time is, what your throughput is, what your quality is, what your investment is, you could say, from a technology standpoint, we’ve achieved our goal.”