Before any agency can truly take advantage of cloud, artificial intelligence or data analytics tools, it first must create the underlying environment.
It comes down to the fact that the network still matters: it must be robust enough to handle increased demand across the entire department.
Gary Barlet, the chief information officer of the Postal Service’s inspector general’s office, said at the 930Gov conference that the office’s networks are primed and ready for the move to the cloud and for the use of more advanced data analytics tools.
“We have 106 field offices scattered across the United States and most of those field offices have small, traditional circuits connected to them, T1s in most cases and that wasn’t going to cut it when you talk about cloud technology and a lot of these new advancements in technology,” Barlet said on Ask the CIO. “The first thing we did is start from the ground up. At the circuit level we replaced all the circuits we had. We went to different internet providers in the local areas and said, ‘treat us like a small business and give us a nice internet pipe similar to what I might have at my house or a small business downtown might have.’ We replaced all those circuits with much bigger pipes, and then everything went up from there.”
He said the USPS IG’s office replaced its entire backbone infrastructure and migrated applications and data out of data centers.
Dorothy Aronson, the CIO of the National Science Foundation, said the agency has been on this modernization journey for the better part of the last seven years following an inside-out strategy.
“We were doing two things at once. For the first few years, we were building governance. We were building ways to hear from our customers. At a certain point, we had our infrastructure ready so that the customer-facing parts of the systems could be focused on them. We had a solid infrastructure to work from,” she said. “We still are working to modernize certain things that are very heavy-lift, for example, our database infrastructure. We still cannot yet move our database to the cloud, but we are working inch by inch to get there while modernizing customer facing utilities.”
Aronson, who also is NSF’s chief data officer, said over the last few months the agency has turned its focus on data governance further inward, assembling program and mission owners to figure out a new strategy.
“People really understand they are working with information every day and they have all kinds of diverse desires about how to make things better. We found everyone working in isolation were doing similar kinds of things, and bringing them together was very empowering,” she said.
Bringing the data together
Barlet said the USPS IG’s office runs on data for its auditors and investigators. One of the next focus areas for the organization’s IT modernization effort, he said, is building a business intelligence library (BIL) — a central data warehouse where data is documented, cataloged and available to individual users.
“Data can’t be just available for the IT people. AI is great and we are pressing hard in that area, but making the data available to the average worker who are doing the work and making the decisions is a big focus for us in the future,” he said. “We are tasking our analytics team to stop doing things in siloes. We are creating a data-mart where the data is accessible for them to start solving problems that they may not have thought about solving yet. We don’t want them to spend a lot of time prepping the data and find out what data is available.”
The USPS OIG is launching an AI pilot because the network is ready to handle the technology.
“Now it’s more about the business itself, the business needs and the business processes. And then figuring out what solutions you can provide and it’s not about the technology itself. It’s about the processes, the application of the technology, the ability to put power in the hands of your customers,” he said. “Back in the day, if you wanted something you came to IT and IT decided when and where and how it’s done. Now it’s just enabling your customers to decide when and where and how it’s done, and empowering them to do that. That’s a significant shift for a lot of IT people and relinquishing control back to your customer base.”
Barlet said he has set up boundaries to ensure data privacy and protections, but without limiting his customers’ ability to use the information they have access to in the first place.
Reskilling pilot underway
Aronson said getting NSF to the point where her customers know what data they want and what tools they can use comes back to driving data literacy across the workforce.
“We’re doing something now at the NSF called the Udacity experiment, which is funded by the CIO Council. Udacity is a vendor that provides online training around various technologies and we selected data analytics and data science. We are doing an experiment to see how quickly people can upskill,” she said. “My real challenge is to figure out how to help everyone understand how to be a data-driven decision maker.”
Through the reskilling effort, Aronson said volunteers from 10 agencies will use the tool to learn the skills necessary to become data analysts or data scientists. Based on volunteer feedback, NSF and the CIO Council will determine whether the tool is a promising one for further federal reskilling initiatives, she said.
“I think we are liberating the skills and the tools so in the future the IT staff will have very important roles that only IT staff can do, but they will be invisible roles. The things that people will see and manage will be in their own hands,” Aronson said.