“The official government acronym is the continuous authorization and verification engine, and really what it boils down to is a software factory or a container-based platform to streamline software development efforts, ATO preparation efforts and the ongoing maintenance and runtime that goes into building a system. All in the name of trying to make developers’ lives easier as they build software for CMS,” said Robert Wood, the CMS chief information security officer, in an interview with Federal News Network. “It’s a DevSecOps platform and a software factory, as I view them almost as one and the same in some ways. It’s the combination of the accumulation of technologies, processes and the culture that goes into building software and getting it out the door quickly and aligning to continuous deployment principles.”
Wood’s team leads the BatCave effort because building software faster requires the security group to reduce friction, ensure stability and resiliency and, most of all, make as much of the security process as automated and continuous as possible.
While CMS may be seen by some as stuck in the past with mainframes and COBOL, over the last few years the agency has been aggressively moving systems and data to the cloud.
Rajiv Uppal, the CMS chief information officer, said at the recent AFCEA Health IT day that the agency already has moved more than 90 of its 200 systems to the cloud.
“There are some things that will take longer to go to the cloud. For example, we have claims processing. It’s a 40-year-old system that runs on the mainframe. We are taking pieces and moving it to the cloud. That will take time and we have to be careful in how we do those things,” he said. “Eventually, I believe, almost everything will be in the cloud. CMS probably has the largest cloud footprint in the civilian sector. We are well on our way.”
Borrowed from Air Force, others
The BatCave isn’t necessarily a new concept. CMS worked closely with the Air Force and modeled BatCave after its Platform One effort.
CMS developers are not mandated to use the BatCave, so Wood knows it has to provide value and incentives to draw in users.
“We collaborate and have discussions with the Air Force because they did this also in a very federated environment, which is similar to how we have to operate in CMS. Everyone has their own money and is doing their own thing,” he said. “Service adoption happens not by mandate, but by choice. We have to have the right incentive levers and value proposition in place for somebody to choose to consume a centralized service. So there’s a lot of lessons learned from the Navy’s efforts and the Air Force’s efforts.”
To draw in those users, Wood said one big lesson he has learned from the Air Force is to focus on the community and users’ needs.
“I think it’s really easy to fall into the trap of building what you think your community needs, instead of actually listening to them or letting the data drive where you need to go. In our case, we did a lot of user research, a lot of user validation, a lot of digging into the data about what our systems look like, the ATO process and things like that. We had these efforts that preceded the BatCave that really informed how we were going to build and what we were going to build,” he said. “We’ve been doing human-centered design research throughout, thinking about it from a value-driven flywheel perspective. Doing something like this demands that level of user engagement and community engagement.”
Security control inheritance
Wood said CMS moved the DevSecOps platform from contract award to production in less than a year and currently has six teams using it. He said several other mission areas across CMS are evaluating how they can take advantage of the tools in the future.
“It’s not going to be a good fit for everyone. We recognize that. But those that are running containerized workloads, that are trying to move faster with software, that are running things in the cloud, that are running web services, application programming interfaces (APIs), they’re probably a pretty good fit,” he said. “They might benefit from not having to worry about ATO overhead anymore. They might benefit from being able to change their software and deploy really quickly without having to go through a costly and time-consuming security impact analysis process every time that they do a new release or want to introduce a new feature. All of that contributes to just faster mission release and faster time to market.”
Wood said one of the biggest benefits of the BatCave platform is that developers inherit almost 80% of the required security controls. This means they only have to address the remaining 20%, reducing the time from development to production.
“We didn’t start out trying to get to 80%. We basically built what we felt was an ideal minimum viable product (MVP), and we went through and started just doing the hard work of control mapping for all the different things that go into it in this very modular way. We expect that [number to grow] as we’re able to add more and more stuff into the pipelines, because this was just the MVP,” he said. “The rest of the things like security monitoring activities and stuff like that are things that we can also systematically start to build into the process. These include things like collecting and aggregating logs into our data lake and producing a software bill of materials (SBOM). That all falls on the shoulders of the development teams, but we can tee them up to be successful and ingest the artifacts in such a way that we can continuously monitor them, such that they get to the point where we feel comfortable placing them into a continuous state of authorization.”