It’s time to prime the AI engine

Gary Newgaard, the vice president of public sector for Pure Storage, details seven ways agencies should prepare to use artificial intelligence.

There’s a lot of discussion in Washington, D.C. right now about running the federal government more like a business. While government is not completely analogous to commercial enterprises, it’s a worthy conversation, as we all stand to benefit from a more effective and efficient approach to the people’s business.
Technology, without question, plays a critical role in this vision, and we’ve seen an uptick in the focus on modernization. Artificial intelligence (AI) and machine learning are emerging capabilities with tremendous potential to make government smarter and more agile, and simultaneously more efficient and effective.
A 2017 report by Deloitte, AI-augmented government, illustrates the dollars-and-cents potential. The firm projects that “simply automating tasks that computers already routinely do could free up 96.7 million federal government working hours annually, potentially saving $3.3 billion.”
At the high end, Deloitte estimates that “AI technology could free up as many as 1.2 billion working hours every year, saving $41.1 billion.” These valuable hours will ideally allow employees to focus on tasks that improve agency outcomes and lead to more satisfied citizens.
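Taken together, the two projections tell a consistent story. A quick back-of-the-envelope division of the quoted savings by the quoted hours (a rough sanity check of my own, not part of the Deloitte analysis) shows that both scenarios value a freed-up hour at roughly the same rate:

```python
# Back-of-the-envelope check of the implied hourly value behind the
# Deloitte projections quoted above (hours and dollars from the report).
low_hours, low_savings = 96.7e6, 3.3e9       # hours/year, dollars/year
high_hours, high_savings = 1.2e9, 41.1e9

print(f"Low-end implied value:  ${low_savings / low_hours:,.2f} per hour")
print(f"High-end implied value: ${high_savings / high_hours:,.2f} per hour")
# Both come out to roughly $34 per working hour, so the scenarios differ
# in how many hours AI frees up, not in how each hour is valued.
```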
As for the effectiveness part of the equation, AI in the private sector has already proven its ability to elevate the customer experience, identify fraudulent activity, streamline navigation, reduce energy use, improve public safety and much more.
There is every reason to expect that these outcomes can be replicated in the public sector. The soil is rich with possibilities, and the seeds for innovation are almost too abundant to count. AI has the potential to improve the effectiveness of some of our nation’s largest budget spends, including driving better outcomes for our Medicare/Medicaid investment as well as identifying (and even preventing) fraud, waste and abuse in health care and across numerous other government programs.
We’re seeing the Defense Department begin to leverage AI to improve preventive maintenance for mission-critical assets, which ultimately protects the safety of our military personnel and homeland. AI can also take a stand on the front lines of efforts to improve public health, with the ability to predict the emergence of the next pandemic and enable swift containment. And it has the potential to boost disaster preparedness, opening the door to better prediction of longer-term weather patterns so that FEMA can allocate and position resources well in advance of an event.
AI also holds the promise of elevating the customer experience that government agencies deliver to world-class status through smarter, more efficient interactions.
The journey to AI is not without its hurdles, including navigating ethical questions about black-box algorithms that prevent visibility into learning paths. These challenges and questions, however, can be resolved with the creation of best practices and ethical guidelines.
Even as we focus on resolving these issues and advancing other modernization efforts, it’s important to consider that federal agencies can take meaningful steps to prime the AI engine today. Big data is an important fuel. It’s the bedrock for real-time analytics, which continues to loom large in the minds of leaders in the federal sector.
While the volume of unstructured data has exploded, the legacy storage built to house that data has not fundamentally changed in decades. Deep learning (DL), graphics processing units (GPUs), and the ability to store and process very large datasets at high speed are fundamental to AI. DL, a form of machine learning that loosely mimics the way information is processed in the nervous system, and GPUs, processors built to rapidly render images, animations and videos, are massively parallel. Legacy storage technologies were not designed for these workloads; they were designed in an era with an entirely different set of expectations around speed, capacity and density.
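To make that concrete, here is a rough, illustrative estimate of the sustained read bandwidth a modest multi-GPU training job can demand from storage. Every figure is an assumption chosen for the example, not a measurement of any specific system:

```python
# Rough, illustrative estimate of the read bandwidth a multi-GPU deep
# learning job can demand from storage. Every figure here is an
# assumption chosen for the example, not a measurement.
gpus = 8                        # accelerators training in parallel
samples_per_sec_per_gpu = 2000  # e.g., small preprocessed images
sample_size_mb = 0.15           # ~150 KB per sample

required_gb_per_sec = gpus * samples_per_sec_per_gpu * sample_size_mb / 1000
print(f"Sustained read demand: ~{required_gb_per_sec:.1f} GB/s of small, concurrent reads")
# ~2.4 GB/s of highly concurrent small reads -- a pattern that spinning-disk
# arrays, built for large sequential transfers, were never designed to serve.
```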
Legacy storage has become a bottleneck for agencies that want to turn big data into a big advantage through real-time intelligence. Within the last two years, the amount of compute required to run bleeding-edge deep learning algorithms has jumped 15-fold. Compute delivered by GPUs has jumped 10-fold. By and large, legacy storage capabilities have stayed stagnant.
If data is the new currency of the 21st century and we are committed to running the federal government more like a business, we cannot design the systems we need on the technologies of the last century.
We need to begin building the new foundation today, with data platforms reimagined from the ground up for the modern era of intelligent analytics. Slow storage means slow machine learning performance. Imagine a marathon runner trying to rehydrate after the race through a wafer-thin straw: essentially, this is what happens to organizational data run on yesterday’s storage.
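The straw analogy can be put in rough numbers. The toy model below (illustrative assumptions only: a 10-terabyte epoch and half an hour of pure GPU computation) treats per-epoch training time as whichever is slower, reading the data or crunching it:

```python
# Toy model of the "thin straw" effect: per-epoch time is gated by the
# slower of the storage pipeline and the GPUs. Figures are illustrative
# assumptions, not benchmarks.
dataset_tb = 10.0            # training data read per epoch, in TB
gpu_hours_of_math = 0.5      # time the GPUs need for the computation itself

def epoch_hours(storage_gb_per_sec: float) -> float:
    read_hours = (dataset_tb * 1000) / storage_gb_per_sec / 3600
    return max(read_hours, gpu_hours_of_math)

for label, bw in [("legacy disk array", 0.5), ("modern all-flash", 10.0)]:
    print(f"{label:18s} ({bw:>4} GB/s): ~{epoch_hours(bw):.1f} h per epoch")
# At 0.5 GB/s the GPUs sit idle waiting on reads (~5.6 hours per epoch);
# at 10 GB/s storage stops being the limit and the epoch takes ~0.5 hours.
```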
Ultimately, many of the available insights remain locked in the data, limiting the intelligence that can be extracted from it.
Several key characteristics define data platform and storage requirements for the cloud era:
Silicon-optimized versus disk-optimized storage, to support gigabytes per second of bandwidth per application. The performance of SSD technology exceeds that of hard disk drive-based storage many times over (see the rough comparison after this list).
A highly parallel application architecture that can support thousands to tens of thousands of composite applications sharing petabytes of data, versus tens to hundreds of monolithic applications each consuming terabytes of data siloed to that application.
Elastic scale to petabytes that allows organizations to pay as they grow, with perpetual forward compatibility.
Full automation to minimize management resources required to maintain the platform.
The ability to support and span multiple cloud environments from core data centers to edge data centers, as well as across multi-cloud infrastructure-as-a-service (IaaS) and software-as-a-service (SaaS) providers.
An open development platform versus a closed ecosystem built on complex one-off storage software solutions.
A subscription-consumption model that supports constant innovation and eliminates the churn of an endless race to expand storage to meet growing needs and refresh it every three to five years.
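To put the first two characteristics above in rough numbers, the sketch below compares how long a full scan of a one-petabyte shared data pool would take on a disk-era array versus a parallel, silicon-optimized platform. The bandwidth figures are illustrative assumptions, not benchmarks of any particular product:

```python
# Rough sketch of why silicon-optimized, highly parallel storage matters
# at petabyte scale. Bandwidth figures are illustrative assumptions.
dataset_pb = 1.0                 # shared data pool, in petabytes
hdd_array_gb_s = 1.0             # aggregate throughput, disk-based array
flash_platform_gb_s = 50.0       # aggregate throughput, parallel all-flash

def full_scan_hours(bandwidth_gb_s: float) -> float:
    return dataset_pb * 1_000_000 / bandwidth_gb_s / 3600

print(f"Disk-based array: ~{full_scan_hours(hdd_array_gb_s):.0f} hours to scan 1 PB")
print(f"Parallel flash:   ~{full_scan_hours(flash_platform_gb_s):.1f} hours to scan 1 PB")
# ~278 hours versus ~5.6 hours -- and the gap widens further when thousands
# of applications issue small, concurrent reads rather than one large
# sequential scan.
```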
While there is little consensus in Washington today, one idea that we can all get behind is the inherent benefit of a more efficient and effective government. AI and other next-gen technologies offer an important path forward, and it’s vital to begin the journey today with a data foundation designed to accelerate adoption and impact.
Gary Newgaard is the vice president of public sector for Pure Storage.