For USPTO, security is part of the software code

John Owens, the chief information officer of the U.S. Patent and Trademark Office, said automated tools and human analysis ensure software code is secure.

The U.S. Patent and Trademark Office has been out in front of most agencies in modernizing its technology systems.

By implementing a dev/ops process, it can now turn around software bug fixes in 24 to 48 hours. USPTO also is using two- to three-week sprints to develop new IT services for patent and trademark examiners, with a goal of releasing new capabilities at least quarterly.

John Owens, the chief information officer of the U.S. Patent and Trademark Office, said the push to agile or dev/ops hasn’t shortchanged cybersecurity.

He said USPTO is doing even more to ensure the entire dev/ops process is secure.

“We have an entire automated test suite for security that we are continuously improving. We’ve also brought in security and our IT security officer right in the beginning of every project,” Owens said on Ask the CIO. “Architecturally, we’ve qualified a foundational layer of hardware and operating system, and looked at what is available not only on our internal cloud offering but external clouds to make sure we are completely compliant with the documentation, which certainly on the external clouds can be problematic at times. But as the providers become more familiar on what it takes to be federally compliant, which is most of the time just documentation, automating those security scans, bringing in people early, deciding what technologies we can and cannot use, and checking that box that says we can justify this and we can secure it, is really important.”

Owens said developers too often are focused on getting capabilities out the door and not so much on security, so having these automated tools is a key piece of the cyber puzzle.

He said USPTO changes out its security tools more quickly than any of its other software tools because the threats and vulnerabilities morph daily.

And Owens said he’s always looking for new automated tools that can be easily integrated into the build and deployment process.
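The kind of automated gate Owens describes — a scan that runs on every build and blocks insecure code from moving forward — can be sketched in miniature. The pattern list and file layout below are illustrative assumptions, not USPTO's actual tooling; a real pipeline would invoke a qualified commercial or open-source scanner, but the control flow is the same.

```python
"""Minimal sketch of an automated security gate in a build pipeline.

The deny-list and layout are hypothetical, not USPTO's tooling; the
point is the flow: scan every build, fail fast on any finding.
"""
import re
import sys
from pathlib import Path

# Hypothetical deny-list of patterns a static scan might flag.
RISKY_PATTERNS = {
    "eval-call": re.compile(r"\beval\s*\("),
    "hardcoded-secret": re.compile(r"(?i)password\s*=\s*['\"]"),
}

def scan_tree(root: Path) -> list[tuple[str, int, str]]:
    """Return (file, line number, rule) for every finding under root."""
    findings = []
    for path in sorted(root.rglob("*.py")):
        for lineno, line in enumerate(path.read_text().splitlines(), 1):
            for rule, pattern in RISKY_PATTERNS.items():
                if pattern.search(line):
                    findings.append((str(path), lineno, rule))
    return findings

if __name__ == "__main__":
    findings = scan_tree(Path(sys.argv[1] if len(sys.argv) > 1 else "src"))
    for file, lineno, rule in findings:
        print(f"{file}:{lineno}: {rule}")
    # A nonzero exit fails the build, so flagged code never reaches deploy.
    sys.exit(1 if findings else 0)
```

Because the gate is just another pipeline step, swapping in a newer scanner means replacing one command, which is what makes frequent tool turnover practical.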

USPTO also calls in teams of special experts to review code and test for security problems before the capability goes to production. Owens said adding the human analysis piece to the automated tools brings the entire process together.

“Your scripts will only tell you so much before a human goes, ‘hmmm, I wonder if I could do this?’ So you really do find another layer there,” he said. “Remembering to do it along with your performance scans early enough in the process where you are not ready to roll out when you find out you have a problem. We want to catch as many of those up front as possible so we don’t build a backlog of technical debt full of security vulnerabilities. Usually there is enough technical debt to fill up as much as you can deliver.”

The third part of this dev/ops security effort is broad-based training. Owens said ensuring customers and non-security IT workers understand what security does, and why, helps with the change management side of dev/ops.

Owens said the chief information security officer and her team are part of the planning team, and have created an approach to get ahead of both potential and real security problems.

He said the changes extended to contractors as well as contract clauses.

Owens said the use of automated security tools and other dev/ops processes has let his staff deploy new capabilities in 24 to 48 hours instead of two-week deployments. He said sometimes the automated testing of the new software is good enough, and then they go back and do a deeper security review.

“It no longer became the obstacle and the automation produces a rock-solid history for auditability when it comes to ‘did you do the scan?’ ‘What was the documentation?’ You take that result and check it into a system and there is source control and treat it like code because security is part of code,” he said. “Whether it’s at a core or foundational level, infrastructure, or platform-as-a-service or software-as-a-service or in any part of that stack, we have that traceability and auditability. Of course, we would always stop if there was some egregious issue. It really cuts down on the number of [security vulnerabilities] that you are deploying because that’s the big deal. Legacy systems, in my experience, we were deploying with hundreds of vulnerabilities over very large systems, multiple-year waterfall deployments. It was like, ‘OK, we have to deploy so we will deploy.’ I’ve seen a dramatic decrease in the number of deployments and an increase in our auditability to work with the IG or some other investigator to say ‘yes, we did it.’ I feel more prepared now because of the automation than ever before.”
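The "check the result into a system and treat it like code" idea can be illustrated with a short sketch. File names and record fields here are assumptions, not USPTO's system: each scan report is hashed and recorded in an append-only log, so an auditor or IG can later confirm both that the scan ran and that the archived report is the one the pipeline produced.

```python
"""Sketch of a scan-result audit trail: hash each report, log it,
and let a later audit verify the record. All names are illustrative
assumptions, not USPTO's actual system."""
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def record_scan(report_path: Path, log_path: Path, passed: bool) -> dict:
    """Append an audit record for one scan report and return it."""
    digest = hashlib.sha256(report_path.read_bytes()).hexdigest()
    record = {
        "report": report_path.name,
        "sha256": digest,  # proves the archived report is untampered
        "passed": passed,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    # One JSON record per line; in practice this log would itself be
    # committed to source control alongside the report.
    with log_path.open("a") as log:
        log.write(json.dumps(record) + "\n")
    return record

def verify_scan(report_path: Path, log_path: Path) -> bool:
    """Audit check: does the stored report still match its logged hash?"""
    digest = hashlib.sha256(report_path.read_bytes()).hexdigest()
    for line in log_path.read_text().splitlines():
        entry = json.loads(line)
        if entry["report"] == report_path.name and entry["sha256"] == digest:
            return True
    return False
```

Keeping the log under source control gives the "rock-solid history" Owens describes: the question "did you do the scan?" becomes a lookup rather than a records hunt.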

Copyright © 2024 Federal News Network. All rights reserved.