Data comes first
Having the right foundation in place will set data-driven projects up for success and put agencies in a position to solve highly complex problems and achieve the goals laid out for them.
A few months into 2020, the Federal Data Strategy Action Plan gave Federal agencies a clear set of goals. Agencies have put their data strategy at the top of the priority list – and they are hard at work.
Agencies are focused on developing and rolling out their enterprise data strategies, which sit first and foremost on the list of operational needs. That work begins with recognizing the primacy of data: a modern data experience built on a data-centric architecture provides the cornerstone agencies need to achieve that goal and vision.
Agencies, particularly those with a healthcare focus, are already making notable strides and were praised by then-Federal CIO Suzette Kent at the AFCEA Bethesda Health IT Summit in January. These agencies have made significant investments aimed at achieving the goals laid out in the strategy – and have demonstrated an ability to share data to improve services and increase efficiency. The massive amount of data generated through these efforts can, in turn, be used to solve highly complex problems.
A modern data experience offers a storage-as-a-service approach that enables agencies to extract more value from their data while reducing the complexity and expense of managing infrastructure. A modern data experience is simple: it should be API-defined, with easy, common management tools and proactive analytics that are actionable at scale. It should also be seamless, spanning any protocol, any service tier and multiple clouds in a single environment. Lastly, it should be sustainable: agencies should be able to buy only what they need, and the experience should upgrade itself.
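To make "API-defined" concrete, here is a minimal sketch of provisioning storage and pulling back its analytics through a REST management API. The endpoint, fields, volume name and token are hypothetical, for illustration only; they do not represent any specific vendor's interface.

```python
# Minimal sketch of API-defined storage provisioning.
# Endpoint, payload fields, and token below are hypothetical.
import requests

API = "https://storage.example.gov/api/v1"
HEADERS = {"Authorization": "Bearer <token>"}

# Provision a new volume declaratively, rather than through a console or CLI.
resp = requests.post(
    f"{API}/volumes",
    headers=HEADERS,
    json={"name": "claims-analytics", "size_gb": 2048, "tier": "performance"},
    timeout=30,
)
resp.raise_for_status()
volume = resp.json()

# Retrieve the proactive analytics the platform exposes for that volume,
# so capacity and performance trends can be acted on programmatically.
metrics = requests.get(
    f"{API}/volumes/{volume['id']}/metrics", headers=HEADERS, timeout=30
).json()
print(volume["id"], metrics.get("capacity_used_pct"))
```

The point of the sketch is simply that every management action is a call an automation pipeline can make, not a ticket or a manual console task.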
A modern data experience is key for agencies striving to build a data-driven government and make strategic use of data. Federal agencies are responsible for massive amounts of data – and they have been tasked with leveraging it for insight and innovation. The White House Office of Science and Technology Policy recently posted a request for public comment on draft characteristics of repositories for managing and sharing data resulting from federally funded research – further underscoring the importance of accessible data and confirming the pressure agencies will remain under to manage, share, and store data securely and efficiently. A modernized approach that places data at the center of everything is the best path forward as agencies work to prioritize data over the coming year.
Organizations, including government agencies, seek a new type of data hub – one that consolidates all applications on a single storage platform to unify and share data across the applications that need it for better insight. The hub must exist to share and deliver data for modern analytics and AI, not primarily to store it.
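As a rough sketch of that unification, the fragment below shows two consumers reading the same dataset from one platform rather than from separate silos: an analytics job over a shared file mount and an AI pipeline over an S3-compatible object interface. The mount path, endpoint URL, and bucket and key names are assumptions for illustration.

```python
# Sketch: one shared dataset, two access paths, no copies between silos.
# Mount point, endpoint URL, and bucket/key names are hypothetical.
import boto3
import pandas as pd

# Analytics workload: reads the dataset as a file from a shared mount.
df = pd.read_parquet("/mnt/datahub/claims/2020/claims.parquet")
print(df.describe())

# AI workload: reads the very same dataset as an object over S3.
s3 = boto3.client("s3", endpoint_url="https://datahub.example.gov")
obj = s3.get_object(Bucket="claims", Key="2020/claims.parquet")
payload = obj["Body"].read()  # bytes handed straight to a training pipeline
```

Because both consumers point at the same copy of the data, insights do not depend on moving or duplicating datasets between application-specific silos.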
Because agencies are looking to preserve existing infrastructure investments and reduce risk, the hub should allow them to share their data across teams and applications – taking the key strengths of each silo, and the unique features that make each one well suited to its own tasks, and integrating them into a single unified platform.
A data hub must have four qualities that are essential to unifying data: high throughput for both file and object access; native scale-out; multi-dimensional performance; and a massively parallel architecture – one that mirrors the structure of GPUs – to deliver performance to tens of thousands of cores accessing billions of objects. A data hub may have other features, such as snapshots and replication, but if any of these four are missing from a storage platform, it is not a data hub.
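The massively parallel quality is easiest to picture from the client side: many workers pulling many objects at once rather than streaming through a single queue. The sketch below fans object reads out across a thread pool against a hypothetical S3-compatible endpoint; the endpoint, bucket, and prefix are assumptions, and a real deployment would spread this work across many nodes and far higher concurrency than one process can provide.

```python
# Sketch: fanning object reads out in parallel against an S3-compatible store.
# Endpoint, bucket, and key prefix are hypothetical.
from concurrent.futures import ThreadPoolExecutor
import boto3

s3 = boto3.client("s3", endpoint_url="https://datahub.example.gov")
BUCKET = "imaging-archive"

def fetch(key: str) -> int:
    """Read one object and return its size in bytes."""
    body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
    return len(body)

# List a slice of the namespace, then read the objects concurrently.
listing = s3.list_objects_v2(Bucket=BUCKET, Prefix="scans/2020/")
keys = [o["Key"] for o in listing.get("Contents", [])]

with ThreadPoolExecutor(max_workers=64) as pool:
    total_bytes = sum(pool.map(fetch, keys))

print(f"read {len(keys)} objects, {total_bytes} bytes")
```

A platform with the four qualities above is what keeps aggregate throughput rising as this kind of concurrency grows, instead of bottlenecking on a single controller or protocol.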
The final Federal Data Strategy Action Plan gives agencies the guidance they need to leverage data to drive innovation that delivers value for the public – but it is an undertaking. Having the right foundation in place will set data-driven projects up for success and put agencies in a position to solve highly complex problems and achieve the goals laid out for them.
Nick Psaki is Federal CTO at Pure Storage.