At the Pacific Northwest National Laboratory, automated testing is part of the design

Paul Reichlin, director of digital platforms at the Pacific Northwest National Laboratory, sees automated software testing as foundational.

The Pacific Northwest National Laboratory manages a broad range of applications that support operations from finance and HR to facilities and asset management, along with custom-built software. The PNNL portfolio consists of 180 business applications and a user population of 6,500.

PNNL conducts high-level research supporting national security, sustainable energy and power grid modernization, as well as technology and chemical materials research for safeguarding ports against nuclear materials smuggling. The organization is currently undergoing a major modernization effort that includes removing and replacing some legacy systems and introducing new technology.

Reichlin sees automated software testing as foundational to the digital transformation underway at PNNL. The organization relies on both the Agile scrum development methodology, which builds, tests and releases code in short cycles, and the DevSecOps approach, which leans on continuous learning and continuous improvement along the way.

The national labs are federally funded and operated by contractors. They employ commercial off-the-shelf products for functional testing to help them write automated tests and scripts. Testing work is distributed among teams that test independently and learn from one another.

They maintain a service catalog, use data metrics and track progress. Reichlin notes that they operate in two domains, best illustrated by financial software: they build their requirements to meet federal standards, but also the requirements of the organization that’s operating the lab.

“We have a number of efforts where we’re building out kind of brand new digital platforms from scratch, reimagining the business applications as they have a fair amount of disparate systems built over time. As we rethink those, we’re looking at our users. We’re studying how they do things. We bring them into the process and then reimagine what those products can be. So we’re really building from the ground up at this point,” Reichlin said on Federal Monthly Insights – Automated Software Testing.

With modernization comes a crucial need for automated software testing. PNNL’s strategy has been to build testing into the process at shorter, faster intervals throughout, moving testing earlier and integrating it directly into the development cycle.

“We’ve definitely taken the common approach of moving a lot of testing left in the process, so the more we can move it into the development process, the build processes, so we get that faster feedback,” Reichlin said. “When you’re starting new, you can do a lot with the code in terms of making it more modular and testable . . . You have those tests to let you know if you broke something.”
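As a rough illustration of the modular, testable code and fast-feedback tests Reichlin describes, consider this minimal Python sketch. The function and values are hypothetical, not PNNL code; the point is that a small, isolated unit of logic can be covered by automated tests that run on every build and immediately flag when a change breaks it.

# Hypothetical example of a small, modular function with fast-feedback tests.
# Not PNNL code -- a sketch of the "modular and testable" idea.

def prorate_charge(monthly_rate: float, days_used: int, days_in_month: int = 30) -> float:
    """Prorate a monthly charge for partial-month usage."""
    if days_in_month <= 0 or not 0 <= days_used <= days_in_month:
        raise ValueError("invalid proration inputs")
    return round(monthly_rate * days_used / days_in_month, 2)

# Unit tests that a build pipeline can run on every commit
# (discoverable by pytest or any assert-based test runner).
def test_full_month_is_full_charge():
    assert prorate_charge(300.0, 30) == 300.0

def test_partial_month_is_prorated():
    assert prorate_charge(300.0, 15) == 150.0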

Reichlin emphasizes standardizing the testing practice. For his team, that meant recognizing that legacy systems often require a great deal of attention if you plan to keep them in operation. They frequently lack documentation, background information, or staff who still write and understand the original code. In these situations, the test teams must do a great deal of analytical work to make sure they understand what the software is doing. PNNL also focuses on managing the data sets around its testing.

“It’s important to have good test data sets,” Reichlin said on The Federal Drive with Tom Temin. “You have to look at what you can test in terms of interfaces, and having either a mock or a stub to kind of mimic a downstream dependency . . . This allows you to automate and move quickly.”
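A minimal sketch of that mock-and-stub idea, assuming a hypothetical downstream finance service (the names and interface are invented for illustration, not PNNL’s actual systems): Python’s unittest.mock stands in for the dependency so the interface can be tested automatically without the real system being available.

# Hypothetical example of stubbing a downstream dependency for an interface test.
from unittest.mock import Mock

def post_invoice(invoice: dict, finance_client) -> str:
    """Send an invoice to a downstream finance system and report its status."""
    response = finance_client.submit(invoice)
    return "accepted" if response.get("code") == 200 else "rejected"

def test_post_invoice_with_stubbed_dependency():
    # The mock mimics the downstream service, keeping the test fast and automatable.
    stub_client = Mock()
    stub_client.submit.return_value = {"code": 200}

    assert post_invoice({"id": "INV-1", "amount": 42.0}, stub_client) == "accepted"
    stub_client.submit.assert_called_once()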

PNNL has made the move to the cloud and now has about 95% of its applications hosted there, including a combination of internally developed software, commercial products, software as a service, and no- and low-code platforms. Reichlin emphasized that the cloud comes with its own challenges, and noted that it’s important to stay up to date on software service offerings to limit concerns about the issues that come with patch updates. Through it all, he has kept his focus on the end users of the systems.

“These processes require a lot of people at the table in terms of making sure we’ve got the requirements right, but also that we’re designing these systems for the users and the way they like to work. So it’s really important to get that input early and have them work with you as you develop your plans and designs,” Reichlin said.

