The Food and Drug Administration has come a long way in terms of modernization, but officials said the agency needs to find a way to use the data it collects more efficiently.
Since 2009, the FDA has worked to improve its systems not only through data center optimization, but also through a heavy focus on virtualization, giving the agency more computing power and better “up-time” in its offices.
“There’s been a lot of advances in data analytics, and really in data generation. If you think about things like genomic sequencing or [the] ability to basically take DNA and turn it into ones and zeroes and then use that for computational analysis, that ends up creating a large data set, and we will use those heavily,” Brad Wintermute, the FDA’s deputy chief information officer, said during IT Modernization Month. “Plus, we’re an agency that likes to hold onto everything in case we need to go back and reference that. So that certainly adds to the challenge.”
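The idea of turning DNA into ones and zeroes can be made concrete. A common approach (illustrative only, not the FDA's actual pipeline) is a two-bit encoding: with four bases, each of A, C, G and T maps to a two-bit code, so a sequence packs tightly into binary for computational analysis.

```python
# Illustrative sketch: DNA has four bases, so each one fits in two bits.
CODES = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def encode(seq: str) -> int:
    """Pack a DNA string into an integer, two bits per base."""
    bits = 0
    for base in seq:
        bits = (bits << 2) | CODES[base]
    return bits

def decode(bits: int, length: int) -> str:
    """Recover the DNA string from the packed integer."""
    bases = "ACGT"
    out = []
    for _ in range(length):
        out.append(bases[bits & 0b11])  # peel off the last two bits
        bits >>= 2
    return "".join(reversed(out))

packed = encode("GATTACA")
print(bin(packed))        # 0b10001111000100
print(decode(packed, 7))  # GATTACA
```

At two bits per base instead of one byte per character, the same sequence takes a quarter of the space, which is part of why whole genomes still "end up creating a large data set."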
Data centers house a large volume of information. Optimizing these centers to better serve the agency is difficult, and Wintermute said two paths need to run in parallel for the system to work.
“We call it high-performance computing, that we’re doing … like some genomic sequencing … kind of heavy-duty data analytics,” he said. “Processing those larger data sets … takes a lot of parallel machines.”
That work can involve anywhere from 1,000 to 10,000 computers working together. Wintermute said it requires a rigorous review process and a workflow that moves automatically from step one to step two to step three.
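The pattern Wintermute describes — split a large data set across parallel workers, then feed the results through a workflow that advances automatically from step to step — can be sketched in a few lines. This is a minimal illustration, not FDA code; a thread pool stands in for the fleet of machines, and the step functions are placeholders.

```python
# Minimal sketch of a fan-out/fan-in pipeline: a thread pool stands in
# for many parallel machines; each step_* function is a placeholder.
from concurrent.futures import ThreadPoolExecutor

def step_one(chunk):
    # Per-chunk analysis, e.g. a partial computation over one slice.
    return sum(chunk)

def step_two(partials):
    # Combine the partial results from all workers.
    return sum(partials)

def step_three(total):
    # Produce the final output of the workflow.
    return f"total={total}"

def run_pipeline(data, n_workers=4):
    # Split the data set across workers, process in parallel, then
    # move automatically from step one to step two to step three.
    chunks = [data[i::n_workers] for i in range(n_workers)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = list(pool.map(step_one, chunks))
    return step_three(step_two(partials))

print(run_pipeline(list(range(1000))))  # total=499500
```

The key property is that no one has to hand work from one step to the next: once the parallel stage finishes, the later steps run on their own.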
“During that process, there’s a lot of interaction that happens with the various documents … that needs to be reviewed,” he said.
The agency’s reform efforts thus far have not gone unnoticed. The FDA has equipment in its optimized data center that helps ensure the process runs smoothly. Other advances include a better workflow that pulls files more quickly and accurately.
“We have all your networking and your processing together, and we’ve found that close [proximity] has allowed us to move that data in and out of … storage to processing faster, which actually helps the user get whatever it is they’re trying to get … more rapidly,” Wintermute said.
Like many federal agencies, the FDA is working toward virtualizing what was once strictly hardware architecture through software-defined networking.
Wintermute said the FDA has focused heavily over the last couple of years on how to migrate its workflow to the cloud.
“We’ve had to go through just like other government agencies and make sure that we have proper security controls,” he said.
For example, if a drug company has a new product it has spent the last decade developing, the information first has to be turned in to the FDA. The agency checks the math and temporarily holds the intellectual property during the review process. So part of the issue with the cloud rests in how much security the FDA will have.
“Right now, all of our data that comes in and out of the internet is going through, of course, the internet connection,” Wintermute said. “It would be nice to have maybe a direct connection to the cloud vendor so that we would not have mixed traffic.”
The agency worries that a bottleneck could appear somewhere in the process. Currently, FDA information is encrypted within the cloud to ensure the agency stays in control of its data.
Future goals center on exploring shared services to head off a potential bottleneck down the road. The FDA and the Department of Health and Human Services already work together because much of their information is similar.
“I think there are certain areas where we could share services,” Wintermute said. “Some of the other operating divisions … have special expertise and maybe there’s a way we could use that expertise and benefit from what they’re doing.”
One plus, the deputy CIO said, is that the agency doesn’t currently run outdated applications. One drawback: some applications were built to complete specific tasks, so changing them to reflect new laws, new codes and the like would be challenging.
“There may be a new law or … [new] commitments we have to make [that] sometimes change when going in, and making those changes in those applications ends up as kind of a long lead … namely through the testing process, or just because there’s a lot of functions that are all built in and kind of intertwined over the course of the years,” he said.
Wintermute said building new modules on existing applications will require keeping the data separate. That would allow the FDA to push and pull the data to get the best results.
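One way to picture what keeping the data separate buys you: instead of each application owning its own files, modules talk to a shared data layer, so a newly built module can push and pull the same records without touching the older code. The sketch below is hypothetical — the class and record names are invented for illustration, not drawn from FDA systems.

```python
# Hypothetical sketch: modules share one data layer instead of each
# owning its data, so new modules can push and pull the same records.
class DataStore:
    """A stand-in for a shared database or data-access API."""

    def __init__(self):
        self._records = {}

    def push(self, key, value):
        self._records[key] = value

    def pull(self, key):
        return self._records.get(key)

store = DataStore()

# An existing module writes a record through the shared layer...
store.push("submission-001", {"status": "under review"})

# ...and a newly built module reads it without changing the old code.
print(store.pull("submission-001"))  # {'status': 'under review'}
```

Because every module goes through the same `push`/`pull` interface, adding a module means writing new logic, not untangling data that is "built in and kind of intertwined" with an older application.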