The National Cancer Institute is keeping its IT modernization strategy pretty simple. There is cloud one and there is cloud two. All new development will reside in one of those two instances.
Jeff Shilling, the chief information officer of NCI, part of the National Institutes of Health, said the agency’s cloud-first approach is all about improving how his office provides services to the mission area and reducing the burden of technology.
“The different clouds allow us to put things into a security structure so that we’re really doing security from the get-go because the environment is already set,” Shilling said on Ask the CIO. “When staff and researchers and the contractors they work with go in with that knowledge, they’re able to understand the limitations of the clouds. I think the vendors have been very, very good about securing their products through the government’s FedRAMP process. They’re very innovative and they’re moving new products in there all the time. I do really applaud the federal government’s move to make FedRAMP less expensive and to make it more streamlined. And understanding that if we’re going to do this, the cloud has to be less expensive, it has to be faster.”
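As a rough illustration of that idea, the sketch below shows how a pre-approved, two-environment policy might be expressed as a simple intake check. The environment names and the public-facing rule are hypothetical, not NCI’s actual configuration.

```python
# Illustrative only: environment names and rules are hypothetical. The point is
# "security from the get-go" -- new work must land in one of two pre-approved,
# FedRAMP-authorized cloud environments rather than anywhere ad hoc.
APPROVED_ENVIRONMENTS = {
    "cloud-one": {"public_facing": True, "fedramp_authorized": True},
    "cloud-two": {"public_facing": False, "fedramp_authorized": True},
}

def select_environment(public_facing: bool) -> str:
    """Pick the approved environment whose guardrails match the workload."""
    for name, rules in APPROVED_ENVIRONMENTS.items():
        if rules["fedramp_authorized"] and rules["public_facing"] == public_facing:
            return name
    raise ValueError("No approved environment fits this workload; escalate for review.")

if __name__ == "__main__":
    # A new publicly facing data-sharing app would be routed to cloud-one here.
    print(select_environment(public_facing=True))
```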
Shilling said NCI moved many of its virtual machines into Cloud One and is trying to use the services native to that specific cloud. He’s not a big fan of lifting and shifting legacy technology into the cloud, so the agency still has applications and workloads on premises.
“We tried to move most new stuff there since much of our new stuff is publicly facing. We’re driven a lot by this new federal mandate for scientific data sharing,” he said. “Most of my legacy that we’re going to move forward with, we’ll use modern tools and move it to the cloud when we reengineer the product. I would say it’s 80% redesign work with the customers, the NCI staff. We run a service model, so the NCI staff will work with us to know what is possible today. We will work through some of those minimum viable products, where you have to showcase some things to them. They’re very smart, very inventive people, and they will really start to come up with things, and we have to build that infrastructure to make that happen.”
One example of that is a centralized events website NCI built to bring together current and potential grantees.
Shilling said that previously each office custom-built these event pages, taking time and resources away from the mission.
“Now almost all the events are hosted on that event website. It’s in the cloud and it’s secure. My biggest thing is that IT doesn’t burden the scientists,” he said. “We gave them a template and told them, just give us your language and then we’ll whip it up there for you. Something as simple as that lowers the complexity and minimizes what they need to know about the technology.”
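The sketch below illustrates the kind of template-driven workflow Shilling describes, where an office supplies only its language and IT renders the standard page. The template, field names and URL are hypothetical stand-ins, not the actual events site.

```python
# Illustrative only: a stand-in for the kind of shared template the events site
# could use. Field names (title, date, summary, registration_url) are hypothetical.
from string import Template

EVENT_PAGE = Template("""\
<article class="nci-event">
  <h1>$title</h1>
  <p class="when">$date</p>
  <p class="summary">$summary</p>
  <a href="$registration_url">Register</a>
</article>
""")

def render_event_page(submission: dict) -> str:
    """Turn a program office's plain-language submission into a standard page."""
    return EVENT_PAGE.substitute(
        title=submission["title"],
        date=submission["date"],
        summary=submission["summary"],
        registration_url=submission["registration_url"],
    )

if __name__ == "__main__":
    # A program office only supplies its language; IT owns the template and hosting.
    print(render_event_page({
        "title": "Grantee Data Sharing Workshop",
        "date": "2024-06-12",
        "summary": "An overview of data sharing requirements for current and potential grantees.",
        "registration_url": "https://example.gov/register",
    }))
```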
It’s also an example of how the cloud is making technology better and more successful. Shilling said 10 years ago many IT projects failed for an assortment of reasons, but today the underlying technology, the use of iterative or agile development and improved processes have reduced the risk of failure quite a bit.
“Now what we have, because there was so much risk of failure, is a lot of shadow IT or duplicate IT or multiple solutions. We have to start reining that model in now. Our enemy isn’t success or failure. Our enemy is too many options,” he said. “What does that do to my organization? It gives me data strewn all over the place, meaning we struggle to do analytics. I can’t just say, ‘How long did it take us to do this process?’ I have seven systems that do that, so I have to collect all that data. And then a year later, we’ve finally answered that one simple question. The other thing is it’s impossible to do prospective analytics.”
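To make that analytics pain concrete, here is a minimal sketch of the collection step Shilling describes: before anyone can answer “how long did this process take?”, durations first have to be gathered and normalized from every system that holds a piece of the record. The export layout and field names are hypothetical.

```python
# Illustrative only: the export files and field names are hypothetical. The point
# is the overhead of answering one simple question when the records live in
# several systems, each of which first has to be exported and normalized.
import json
import statistics
from datetime import datetime
from pathlib import Path

def load_durations(export_dir: Path) -> list[float]:
    """Gather process durations (in days) from each system's export file."""
    durations: list[float] = []
    for path in export_dir.glob("*.json"):          # one export per system
        for record in json.loads(path.read_text()):
            started = datetime.fromisoformat(record["started"])
            finished = datetime.fromisoformat(record["finished"])
            durations.append((finished - started).days)
    return durations

if __name__ == "__main__":
    durations = load_durations(Path("exports"))
    if durations:
        print(f"Median time to complete the process: {statistics.median(durations)} days")
```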
A big benefit of the cloud is bringing all of those disparate systems, and therefore their data, together. Shilling said the vendors need to make it easier to share data between cloud instances.
“We haven’t really done a lot of that sharing. Instead, we’ll duplicate data in the cloud because that’s where the researchers’ analysis tools are built and they are used to working in there. We’ll put the data there and we’ll duplicate it, and that will cost us more,” he said. “At the National Cancer Institute, we are trying to plan for the future, where we see all the value in scientists sharing their data. We’re trying to create these cloud platforms and capabilities. I was at a meeting [in March] about the Cancer Research Data Commons, which is basically a way for the scientists to not have to worry so much about how they share their data. They just give it to the Cancer Research Data Commons, and it gets annotated and metadata is added.

“There’s a lot of complexity with data; as we all know, the cleaner you collect the data, the better it’s going to be in the long run, but we don’t really have the data standards that people are using in many other fields, and cancer is a very complex thing to study. There are lots of standards. Our new director is very, very, very much focused on the cancer moonshot, and we really see this as low-hanging fruit for discovery, which is us being able to share patient data, clinical trial data and research data.”
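As a rough sketch of the “give it to the commons and it gets annotated” step Shilling describes, the toy pipeline below attaches basic metadata to a submitted dataset. The schema and field names are hypothetical, not the Cancer Research Data Commons’ actual data model.

```python
# Illustrative only: a toy version of the annotation step. The schema and field
# names are hypothetical, not the Cancer Research Data Commons' actual model.
import hashlib
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class Submission:
    submitter: str
    data_type: str            # e.g. "clinical_trial", "patient", "research"
    records: list[dict]
    metadata: dict = field(default_factory=dict)

def annotate(submission: Submission) -> Submission:
    """Attach the metadata other researchers need to find and reuse the data."""
    payload = json.dumps(submission.records, sort_keys=True).encode()
    submission.metadata = {
        "received_at": datetime.now(timezone.utc).isoformat(),
        "record_count": len(submission.records),
        "checksum_sha256": hashlib.sha256(payload).hexdigest(),
        "data_type": submission.data_type,
    }
    return submission

if __name__ == "__main__":
    raw = Submission(
        submitter="example-lab",
        data_type="clinical_trial",
        records=[{"trial_id": "T-001", "arm": "A", "outcome": "partial_response"}],
    )
    print(json.dumps(asdict(annotate(raw)), indent=2))
```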
Jason Miller is executive editor of Federal News Network and directs news coverage on the people, policy and programs of the federal government.