Cloud is here to stay, and the next generation of technologies is already being built on its foundations.
This content is provided by Red Hat.
Every year or so, there’s a big pushback that labels the cloud as “just a fad.” One popular argument posits that it doesn’t even exist, since it’s really just someone else’s computer. While the average hands-on-keyboard types tend to be up to date on such technologies, it’s surprisingly common for executives to be less in touch, deciding to “skip” cloud technologies in favor of waiting to see what comes next. Within the historical context of IT advances, it becomes clear that this is a mistake; cloud is here to stay, and the next generation of technologies is already being built on its foundations.
“There have been a lot of very specific steps over the course of history that have taken us from the mainframe days to where we are now. These steps have shaped our direction into the future,” said Damien Eversmann, senior solutions architect for Red Hat Public Sector.
When enterprise IT began with the mainframe, systems administrators did everything, and the code base was monolithic. But as technology sped up, it diversified, which led to tiered server architectures. Data sat on one server, business logic on another, and sometimes there was even a front-end presentation layer in the form of a client application or, eventually, a web browser. And each of those layers was overseen by a different specialist; as the technology layers diversified, so did the technologists.
“Then the number of applications being created continued to grow,” Eversmann said. “And instead of having people specialize in one layer, people started to break things out into what we now call service-oriented architectures. This is the precursor to microservices, one of the main things that we’re looking at with the cloud now.”
Once things started getting broken down into individual services, people realized that those services could be reused. For example, everyone needs a login function. So why not write it once, and share it across all applications?
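To make the idea concrete, here is a minimal, hypothetical sketch of such a shared login service: one small HTTP endpoint that any number of applications can call, instead of each application carrying its own copy of the authentication logic. The route, payload shape, user store and token handling below are illustrative assumptions, not any particular agency’s implementation.

```python
# A minimal, illustrative "login" service shared by many applications.
# Assumes Flask is installed; the route, payload shape and token logic
# are placeholders for whatever a real identity service would provide.
import secrets
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical user store; a real service would sit in front of a
# directory or identity provider, not an in-memory dictionary.
USERS = {"alice": "correct-horse-battery-staple"}

@app.route("/login", methods=["POST"])
def login():
    body = request.get_json(silent=True) or {}
    if USERS.get(body.get("username")) != body.get("password"):
        return jsonify({"error": "invalid credentials"}), 401
    # Issue an opaque session token; every application that calls this
    # service gets the same behavior without re-implementing it.
    return jsonify({"token": secrets.token_urlsafe(32)}), 200

if __name__ == "__main__":
    app.run(port=8080)
```

Any application that needs authentication then simply posts credentials to that one endpoint, which is exactly the reuse that service-oriented architecture made possible.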
“Now we can scale a much smaller, more granular piece to keep our entire application performing at its best. And this is where making that transition to the cloud was important,” Eversmann said. “Because the way things were with service-oriented architecture, we had reached the limit of what could happen in your data center.”
But some people like to point to the pendulum effect around where compute resides to justify waiting until the cloud “fad” passes. Compute moves from the core to the edge and back to the core again: from mainframes to workstations to data centers to web browsers.
If the current embrace of cloud is just the pendulum swinging back to the core, albeit a core that no longer sits in your own data center, why not wait until the pendulum swings back and skip cloud altogether?
Because it turns out the pendulum metaphor is a little simplistic. Eversmann has a better one.
“You know in movies, the ninja that jumps up the alley by jumping back and forth between the buildings? Your pendulum is swinging back and forth, but at the same time you keep popping higher and higher,” Eversmann said. “And that’s what’s happening here: as we go back and forth between this data at the core, data at the edge, we’re also leapfrogging the technology of the last time.”
In fact, that hypothetical pendulum (or wall-jumping ninja, if you will) is already moving back in the other direction. Technological advancements like the Internet of Things and 5G are enabling so much data to be gathered at the edge that it’s stressing network bandwidth to send it back to a centralized location for processing. Accordingly, many federal agencies are already looking at and planning for the ability to push compute back out to the edge, so the data gets analyzed in the field where it’s collected, and the only thing that gets pushed back is the analysis itself.
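A rough sketch of that pattern might look like the following; the readings, the anomaly threshold and the summary fields are made up for illustration. The point is simply that the raw data stays in the field, and only a small analysis travels back over the network.

```python
# Illustrative edge-processing sketch: analyze sensor data where it is
# collected and push back only the (much smaller) analysis.
# The readings, threshold and summary shape are assumptions for
# illustration, not a specific agency workload.
import json
import statistics

def analyze_locally(readings: list[float]) -> dict:
    """Reduce raw field data to a compact summary on the edge node."""
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": max(readings),
        "anomalies": sum(1 for r in readings if r > 100.0),  # placeholder threshold
    }

if __name__ == "__main__":
    # In practice this would be thousands of readings gathered in the field.
    raw = [98.2, 99.1, 101.7, 97.4, 103.2]
    summary = analyze_locally(raw)
    # Only this small JSON document goes back over the network, for example
    # as an HTTP POST to a central ingest endpoint; the raw data never
    # leaves the edge node.
    print(json.dumps(summary))
```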
“And you’d think that that was a contrary argument to the cloud, right? But it’s not, because if you look at how you define the edge, it’s actually the cloud,” Eversmann said. “We now suddenly have the cloud on both sides of this pendulum swing. If we swung into far-off servers that are really powerful, that’s the cloud. As we need to bring computing closer to the masses, we’re not bringing it to their desktop anymore. We’re bringing it to an edge computing node on the cloud.”
That’s why you can’t just skip the cloud and see what comes next. Cloud isn’t the technology; it’s the platform.
“If you look at some of the functionality and the compute capabilities that live in these hyper-scalers, like AWS or Azure, it’s stuff that you can’t really get in your data center without becoming a hyper-scaler yourself,” Eversmann said. “There are things that we’re doing now, with function-based computing and serverless computing that can’t be matched without going to the cloud.”
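Function-based, or serverless, compute is a good illustration of that gap. A workload often amounts to nothing more than a short handler like the hypothetical sketch below, written in the style of an AWS Lambda function in Python; the provider invokes it on demand and scales it automatically, with no server for the agency to provision or patch. The event fields and response shape here are assumptions for illustration.

```python
# A minimal sketch of function-based (serverless) compute, in the style
# of an AWS Lambda handler written in Python. The event fields are
# assumed; the platform invokes the function on demand and handles
# scaling, so there is no server to manage.
import json

def handler(event, context):
    # Pull a name out of the triggering event, if one was provided.
    name = (event or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```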