Insight by ThunderCat Technology and Dell Technologies

Agencies grasping the cost-benefit analysis of moving to the cloud

As agencies shift from cloud first to cloud smart, they must rethink how and why they use these off-premise services.

The underlying principle of cloud smart is to give agencies the ability to evaluate their options based on their service and mission needs, technical requirements and existing policy limitations.

The Office of Management and Budget says computing and technology decisions should also consider customer impact balanced against cost and cybersecurity risk management criteria.

OMB also told agencies they should assess their requirements and seek the environments and solutions, cloud or otherwise, that best enable them to achieve their mission goals while being good stewards of taxpayer resources.

Seems simple, right?

But recent research from IHS Markit found that of 350 companies surveyed, 74 percent had moved applications to the cloud and then back on premises. The reasons companies move systems back on premises vary from poor cloud performance to mergers and acquisitions to security concerns.

Agencies may not have as much flexibility as the private sector, but there are some agencies that moved to the cloud five to seven years ago that are now having second thoughts or have already moved applications back on premises.

What this all means is agencies must understand what they are trying to accomplish, the cost models for the different cloud options (infrastructure, platform and software as a service), and how to balance data accessibility and security.

Agencies need to figure out what it takes to meet the goals of cloud smart and improve citizen services.

Mitchell Schwartz, a systems engineer at ThunderCat Technology, said agencies must ensure they control their data no matter where it lives.

“Now that agencies are talking about doing artificial intelligence and machine learning workloads, they may start accessing data that in the past they didn’t have access to, so there are new trends to their data that they hadn’t had before,” he said. “You want to know what is being accessed. Typically when we would do this on-premise, we wouldn’t put a lot of firewalls or lot of restrictions on systems that were next to each other. But we have to start doing that and we have to understand what commands are being accessed and run.”

Current Cloud Strategies

When we had cloud first as an initiative, there were a lot of agencies that just did the lift-and-shift. They moved everything into the cloud and had to containerize because a lot of them were legacy systems. Those may be the folks who have decided they need to move them back on-premise. We are taking an assessment approach to everything and really adopting a cloud smart approach.

Data Management

Containerization is the big trend right now, being able to package up your application. You can run it locally. You can run it in the cloud. You can also run it locally and have the cloud as a disaster recovery strategy, which makes it a more usable platform than just running it in the cloud or on-premise.

The Data/Cloud Balance

Before getting the data, you need to understand what the pain points are and what use cases require those data sets. We use a lot of human-centered design to talk to the workforce to understand what their pain points are and what are the things we can improve upon.

Listen to the full show:

Panel of experts

  • Cris Brown

    Deputy Chief Information Security Officer, Nuclear Regulatory Commission

  • Oki Mek

    Senior Advisor to the Chief Information Officer, Department of Health and Human Services

  • Ed Krejcik

    Manager, Isilon Systems Engineers, Dell Technologies

  • Mitchell Schwartz

    Systems Engineer, ThunderCat Technology

  • Jason Miller

    Executive Editor, Federal News Network
