Insight by Iron Bow Technologies, Dell, & Intel

How to rationalize the hybrid cloud

The cloud-first mandate, now in its second decade, has pushed nearly every federal agency to use commercial cloud computing to some degree. But adoption remains a slow, deliberate process because of the government's exacting requirements.

True, in theory any employee with a credit card can open a small cloud space to experiment with development. But systems of record require not just strong cybersecurity, but also portability to avoid vendor lock-in and compatibility with agencies' own – ideally optimized – data centers.

To ensure the resulting hybrid cloud infrastructures meet the requirements for security, uptime, disaster recovery, and workload portability, agencies are using a variety of technologies. These include virtualization not only of applications but also of associated data, storage and network assets. Another is containerization: encapsulating legacy applications and surrounding them with APIs so they, too, can migrate to the cloud.

Federal News Network and Iron Bow assembled a panel of federal experts to discuss these issues and gain a sense of how the government is approaching them.

Judge and Mendes represented the range of federal situations. The ITA operates 100 percent in the cloud, Mendes said, using both infrastructure and software as services. Judge said the Army is working to rationalize its base of applications, identifying those suitable for commercial cloud deployment. In some cases it's refactoring code in more up-to-date languages before migrating.

Lockhart pointed out the importance of fully understanding workloads and knowing what's going on behind the scenes in terms of microsegmentation, security and traffic. This understanding, he said, helps agencies better match a given workload with the cloud most suited to it, and thereby control costs.

Massey emphasized that agencies need to retain, or regain, control in hybrid, multi-commercial-cloud environments to ensure application performance before automating ongoing cloud operations.

Current Cloud Strategy

We’re effectively 100 percent cloud-based. Now we’re in the process of migrating the legacy applications from infrastructure-as-a-service into software-as-a-service as fast as we can…It has allowed us to dramatically cut our costs of operations.

Containerization

We have mainframe applications that still run COBOL today. So we’re doing the cost-benefit analysis. Should we just put these into a cloud environment wrapped inside containers to maintain current operations? Or is it a better return on investment to do the necessary engineering work to completely refactor those applications [for] a cloud-native environment?

Cost Considerations and Forecasting

One of the benefits of the cloud is that if you’re looking into something like machine learning, you can spin your data set into the cloud and then pay by the drink. Use the GPUs, be able to hammer away at the data, get your results out, and then turn the system off. You’re not having to expend all this cost on these licenses that you now have sitting in the corner.


Copyright © 2019 Federal News Network. All rights reserved.