Joe Kim is the senior vice president and global chief technology officer of SolarWinds.
As Congress continues to seek ways to push the Modernizing Government Technology (MGT) Act forward, there’s one bipartisan truth that everyone can agree on: Legacy technologies are keeping agencies’ IT operations from being efficient, reliable and secure.
But while the MGT Act focuses primarily on the need to upgrade individual IT systems, agencies should start their modernization initiatives right at the network level.
Federal IT managers of all political persuasions will readily attest that many of the network technologies government agencies have used for years no longer cut it in today’s cloud-driven world. Many of these systems are an astounding 50 years old. These dinosaurs are not only outdated; they are also insecure and unsuited to current and future network demands.
Let’s take a look at some ways legacy IT networks are holding us back, and how modern network architectures can help propel us forward.
Outdated, insecure and unsupported

Something built in the 1970s wasn’t designed to work in 2017, let alone withstand a modern cyber attack. Some of these networks haven’t been updated in recent memory, and may no longer be supported at all. Additionally, the pool of individuals with the skills to maintain and update them is shrinking, if it exists at all.
These systems must be modernized to improve efficiency and to detect, defend against and evolve with today’s cyber threats. Managers should explore building modern, automated networks that are software-defined; that is, the actions traditionally handled by hardware (firewalls, switches, routers) are driven entirely by software. This gives managers automated, easier-to-manage platforms that can detect and alert them to potential security risks, and a flexible infrastructure designed to adapt to future needs.
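To make the "software-defined" idea concrete, here is a minimal sketch of a firewall policy expressed as version-controlled code rather than per-device hardware settings. The rule names, networks and ports below are invented for illustration and do not come from any specific SDN product or agency policy.

```python
from dataclasses import dataclass
from ipaddress import ip_address, ip_network

@dataclass
class Rule:
    name: str
    src: str        # source CIDR the rule matches against
    dst_port: int
    action: str     # "allow" or "deny"

# In a software-defined network, policy like this lives in reviewable,
# version-controlled configuration instead of individual appliances.
# These example rules are hypothetical.
POLICY = [
    Rule("allow-internal-web", "10.0.0.0/8", 443, "allow"),
    Rule("deny-all-telnet", "0.0.0.0/0", 23, "deny"),
]

def evaluate(src_ip: str, dst_port: int) -> str:
    """Return the action of the first matching rule; default deny."""
    for rule in POLICY:
        if ip_address(src_ip) in ip_network(rule.src) and dst_port == rule.dst_port:
            return rule.action
    return "deny"
```

Because the policy is plain data evaluated in software, it can be audited, tested and rolled out centrally, which is the property the text attributes to software-defined architectures.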
Greater flexibility for now and the future
The key word here is “flexible.” Unlike legacy IT networks, modern, software-defined network architectures are built on open standards; thus, they are more flexible and can easily scale depending on the needs and demands of the agency.
In 2017, we’re seeing a number of trends begin to come into their own: Big Data continues to grow, use of mobile technologies has become pervasive, and the Internet of Things (IoT) has gone from promise to reality. All of these are already having a tremendous impact on the network, and it’s safe to say that they’ll continue to do so even as new technologies—which will most likely be just as demanding—are introduced in the years ahead.
Today’s networks cannot be rigid. Managers must be able to scale them up or down automatically as needed. Networks must be adaptable enough to accommodate usage spikes and changing bandwidth requirements, so that bottlenecks are eliminated and five nines (99.999 percent) of availability is maintained.
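The scale-up-or-down behavior described above can be sketched as a simple threshold policy. The thresholds, and the convention of measuring utilization as a 0.0 to 1.0 fraction, are assumptions made for this illustration, not any agency's real scaling rules.

```python
def scale_decision(utilization: float, capacity_units: int,
                   high: float = 0.80, low: float = 0.30) -> int:
    """Return a new capacity in units: grow on usage spikes, shrink
    during sustained lulls, and never drop below one unit."""
    if utilization >= high:            # spike: add headroom
        return capacity_units + 1
    if utilization <= low and capacity_units > 1:
        return capacity_units - 1      # lull: reclaim unused capacity
    return capacity_units              # within the comfortable band
```

A real software-defined platform would feed this kind of decision loop with live telemetry and apply it continuously, which is what makes automatic scaling possible in the first place.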
Open, software-defined network architectures allow for this, while also letting managers deploy different solutions to suit their needs and budgets. We’re long past the days when agencies relied solely on a single vendor. Agencies may use a combination of services and solutions, and it’s not uncommon for them to run a hybrid IT infrastructure that includes technologies from different vendors. With the advent of the cloud and open standards, vendor lock-in is, like legacy IT systems, a thing of the past.
Insight into hybrid infrastructures
Hybrid IT networks are becoming more commonplace in the public sector. In fact, a recent SolarWinds report indicates that a majority of government agencies are moving to the hybrid model.
This type of environment poses unique challenges. With part of the network and its applications living off-premises, it is difficult to gain complete visibility into issues that may be hurting network performance. Traditional network monitoring tools designed to manage on-premises activity are insufficient for these environments.
Managers must investigate and deploy solutions that provide insight into network operations and applications wherever they reside, both on- and off-premises. They need to see into the blind spots where data travels between hosting providers and on-site infrastructure. This complete, unimpeded perspective is critical to maintaining reliable, fast and well-protected networks.
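One way to picture end-to-end visibility is to measure latency per path segment, including the provider link between on-site and hosted infrastructure, and surface the worst one. The segment names and millisecond figures here are invented for illustration; no specific monitoring product is implied.

```python
def slowest_segment(samples: dict[str, list[float]]) -> tuple[str, float]:
    """Given latency samples in milliseconds per path segment, return
    the segment with the worst average latency -- the likely bottleneck."""
    averages = {seg: sum(ms) / len(ms) for seg, ms in samples.items()}
    worst = max(averages, key=averages.get)
    return worst, averages[worst]
```

If the on-premises tool only sampled the first segment, the slow hop across the provider link would be exactly the kind of blind spot the text warns about.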
These networks are the beating hearts of federal IT, and while there’s little doubt that individual hardware components must be updated, we cannot forget about the infrastructure that drives all of these pieces. While the MGT Act is a step in the right direction, none of us should lose sight of the fact that it is network modernization that will ultimately drive better security, reliability and efficiency.