Four 2016 federal IT predictions: It’s all about the data
Rob Stein, the vice president for NetApp U.S. Public Sector, highlights possible trends in how agencies buy and deploy technology over the next year.
In the past couple of decades, the seeds of future technology trends were often planted months, if not years, in advance. For most agencies, embracing new technologies was a process of watch, learn, adopt and adapt. Large, upfront investments in software and hardware required careful planning and risk analysis. As a result, it could take considerable time for agencies to gain the cost efficiencies and growth capabilities created by new technology.
Of course, that cycle has shortened in recent years. With technology now often delivered as a service, government agencies can move quickly to explore the efficiencies of a new technology without making a substantial upfront investment. If a particular technology doesn’t work, agencies have more flexibility to try something else. You might say the watch and learn phases have been compressed, and in some cases the adopt phase has moved to the front of the cycle.
However, while the adoption cycle for new technology may be shorter, the value of any technology is still measured by its impact on the mission — and that impact remains the most important consideration for agencies weighing an investment. Unfortunately, trying new things more quickly doesn’t guarantee better (or even faster) results. You might get them, but you might not. It can be like assembling a complex holiday gift without reading the instructions.
For this reason, we believe 2016 will be a year that requires true diligence and insight in deciding where and how to invest in technology. Cloud computing has changed the speed and efficiency with which data can be processed. But it has also changed the way data is moved, stored and managed. Make no mistake: Data is any agency’s most important asset, and managing that data effectively is vital.
Based on our work with government agencies over the past year, we have developed four federal IT predictions for 2016 and beyond.
Cloud computing will be fully realized through hybrid clouds
The hybrid cloud took off in 2015 across the public and commercial sectors. Sandler Research found that the market is growing at a compound annual rate of 29.22 percent over the 2014-19 period. Hybrid cloud infrastructure proved particularly appealing for government agencies because it offers the control and security of a private cloud while letting them leverage the elasticity and scalability of a public cloud.
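Compounded over that five-year window, a 29.22 percent annual rate implies a market roughly 3.6 times its 2014 size, since 1.2922^5 ≈ 3.6.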
What agencies have learned is that private cloud services can be costly to manage and require prohibitive upfront capital expenses, while public cloud services risk vendor lock-in, restricting agencies’ ability to benefit from evolving provider services and price competition. IT organizations must therefore learn to support a modern constituency of end users who are eager to create new, flexible and responsive IT resource environments, and who see the public cloud as a means to achieve this goal on their own.
Look for hyper-scale cloud providers such as Amazon, Microsoft and Google to continue to change the game by delivering data management services that enable agencies to store large amounts of data more efficiently. In 2016, agencies will remain focused on the key challenge of not only storing data but also unlocking its value. Mining and protecting data often means moving it, and the real challenge is managing data consistently across multiple environments, whether in a public cloud, a private cloud or on premises. For this reason, hybrid cloud will continue as the dominant model for managing and storing data in 2016.
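To make that challenge concrete, here is a minimal sketch of hybrid tiering, assuming an Amazon S3-compatible public tier reached through boto3 (the AWS Python SDK) and an on-premises staging directory; the bucket and path names are hypothetical placeholders, not any agency’s real configuration.

```python
# A minimal hybrid-tiering sketch: cold data is archived from an
# on-premises (private cloud) directory to a public cloud object
# store, and recalled when it needs to be processed locally.
import boto3
from pathlib import Path

s3 = boto3.client("s3")  # credentials are read from the environment

PRIVATE_TIER = Path("/mnt/agency-private")  # hypothetical on-prem mount
PUBLIC_BUCKET = "agency-archive-bucket"     # hypothetical bucket name

def archive_to_public_cloud(relative_path: str) -> None:
    """Move a cold dataset from the private tier to the public tier."""
    local_file = PRIVATE_TIER / relative_path
    s3.upload_file(str(local_file), PUBLIC_BUCKET, relative_path)
    local_file.unlink()  # reclaim private-tier capacity

def recall_from_public_cloud(relative_path: str) -> None:
    """Bring a dataset back on premises for local processing."""
    local_file = PRIVATE_TIER / relative_path
    local_file.parent.mkdir(parents=True, exist_ok=True)
    s3.download_file(PUBLIC_BUCKET, relative_path, str(local_file))
```

The point is not the particular SDK but the pattern: the same dataset can live in whichever environment best fits its current cost, security and performance needs.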
Rise of the data manager
The role of the storage administrator will continue to evolve in response to a transforming government IT landscape. As organizations move to a cloud delivery model to reduce costs and increase flexibility, they shift from building and operating their own data centers to brokering services that span both private and public cloud resources. In 2016, the classic storage administrator will either evolve into a data manager of the hybrid cloud with a seat at the executive table, or hole up in comfortable storage product minutiae and become increasingly irrelevant. Data managers will need to oversee a hybrid cloud architecture that delivers seamless data management across cloud resources, complementing a private cloud with a public cloud strategy that doesn’t introduce new risk, complicate policies or surrender control of valuable agency information.
The evolutionary path of the data manager is not unlike that of the government CIO, who has become a broker of cloud services, sorting the application portfolio into applications that must be controlled entirely (in on-premises private clouds), applications that can be controlled partially (in enterprise-grade public clouds), transient workloads (in public hyper-scale clouds) and applications best purchased as software-as-a-service.
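That sorting exercise can be captured in a few lines of Python. The four categories below come straight from the text; the workload attributes and classification rules are hypothetical simplifications, offered as a sketch rather than a real triage policy.

```python
# A toy classifier for the portfolio-sorting exercise described above.
from enum import Enum

class DeploymentModel(Enum):
    PRIVATE_CLOUD = "on-premises private cloud (full control)"
    ENTERPRISE_PUBLIC = "enterprise-grade public cloud (partial control)"
    HYPERSCALE_PUBLIC = "public hyper-scale cloud (transient workloads)"
    SAAS = "software-as-a-service"

def classify(sensitive: bool, transient: bool, commodity: bool) -> DeploymentModel:
    """Map coarse workload attributes onto a deployment model."""
    if sensitive:   # data the agency must control entirely
        return DeploymentModel.PRIVATE_CLOUD
    if commodity:   # undifferentiated functions, best bought as a service
        return DeploymentModel.SAAS
    if transient:   # bursty or short-lived workloads
        return DeploymentModel.HYPERSCALE_PUBLIC
    return DeploymentModel.ENTERPRISE_PUBLIC

# Example: a burst analytics job that handles no sensitive data.
print(classify(sensitive=False, transient=True, commodity=False).value)
```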
Software will create a modern ‘data fabric’ architecture
Twenty-four of the 26 federal agencies participating in the Office of Management and Budget’s (OMB) information technology reform initiatives reported achieving an estimated total of $3.6 billion in cost savings and avoidances between fiscal years 2011 and 2014. Most of the cost savings were attributable to shifting workloads to the cloud and data center consolidation.
We are witnessing a transformation of the data center that not only affects future cost savings in 2016 and beyond, but also brings operational efficiency to data mobility and security. With data no longer confined to buildings, mobile devices or even one cloud infrastructure, agile, smart software must be the brains that unlock an organization’s intellectual capital — its data.
What’s emerging is a modern “data fabric” woven together and managed by intelligent software that not only stores data but also protects, backs up and moves it wherever it needs to be, quickly and efficiently. This data fabric allows agencies to control, integrate, move and consistently manage their data across different IT environments and multiple cloud vendors.
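What that looks like in software can be suggested with a short sketch. The backend interface and in-memory endpoints below are hypothetical stand-ins (not NetApp’s actual Data Fabric APIs), but they show the core pattern: one write path, fanned out across environments by policy.

```python
# A toy "data fabric": one interface in front of several storage
# endpoints, with writes replicated for protection and backup.
from typing import Protocol

class Backend(Protocol):
    name: str
    def put(self, key: str, data: bytes) -> None: ...
    def get(self, key: str) -> bytes: ...

class InMemoryBackend:
    """Stand-in for a private-cloud or public-cloud endpoint."""
    def __init__(self, name: str) -> None:
        self.name = name
        self._objects: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]

class DataFabric:
    """Writes to a primary endpoint and replicates to all others."""
    def __init__(self, primary: Backend, replicas: list[Backend]) -> None:
        self.primary = primary
        self.replicas = replicas

    def put(self, key: str, data: bytes) -> None:
        self.primary.put(key, data)
        for replica in self.replicas:  # protection / backup copies
            replica.put(key, data)

    def get(self, key: str) -> bytes:
        return self.primary.get(key)

fabric = DataFabric(InMemoryBackend("private"), [InMemoryBackend("public")])
fabric.put("reports/fy2016.csv", b"...")
```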
Converged infrastructure will ignite agency innovation
IT modernization and innovation will be a prevailing public sector theme in 2016. Look no further than the Federal Information Technology Acquisition Reform Act (FITARA), a law designed to give agency chief information officers more authority to plan and execute their IT strategies. Former House staff member Rich Beutel, a primary author of FITARA, recently commented that with the law, “really what’s at stake here is the value of IT as an instrument to drive innovation and to deliver 21st-century government services.”
Converged infrastructure has a role to play in driving innovation in 2016, because it minimizes the drudgery of hardware integration and frees agencies to experiment with software innovation. Converged infrastructure — a shared environment offering non-disruptive upgrades, increased scalability and security control, and easy integration with numerous software and hardware platforms — is designed to increase IT responsiveness to business demands while reducing the overall cost of computing.
In addition to simplicity and speed, converged infrastructure also addresses a pressing agency challenge: the shortage of IT skills. In 2016, organizations will increase their investment in converged infrastructure as DevOps — which promises communication, collaboration, integration and automation to create efficiencies within organizations — emerges as a key use case driving growth. While DevOps requires more time for application programming, it automates and optimizes processes like hardware configuration, and it promises reduced time to deployment for new applications. This shift in workflows can help IT teams refocus their staff on tasks that advance business goals while driving innovation forward.
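As a small illustration of the automation DevOps brings, the sketch below turns a once-manual configuration step into a repeatable, versionable script; the node names and settings are hypothetical.

```python
# A minimal configuration-automation sketch: per-node config files
# are generated from a shared baseline instead of being hand-edited.
import json
from pathlib import Path

NODES = ["node-01", "node-02", "node-03"]  # hypothetical converged nodes
BASELINE = {"ntp_server": "time.agency.example", "log_level": "INFO"}

def render_config(node: str) -> dict:
    """Produce one node's configuration from the shared baseline."""
    return {**BASELINE, "hostname": node}

def write_configs(out_dir: Path) -> None:
    """Emit per-node config files for a provisioning tool to apply."""
    out_dir.mkdir(parents=True, exist_ok=True)
    for node in NODES:
        path = out_dir / f"{node}.json"
        path.write_text(json.dumps(render_config(node), indent=2))

write_configs(Path("build/configs"))
```

Because the script, not an administrator’s memory, carries the configuration, every node comes up identically and a change is a one-line edit rather than a round of manual work.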
Rob Stein is the vice president for NetApp U.S. Public Sector.