Artificial intelligence: Incubator, accelerator for federal modernization

Gary Newgaard, vice president, public sector at Pure Storage, made the case for why agencies should develop a data hub as they move into using artificial intelligence.

Federal IT managers, focused on modernizing government in a climate where the long-standing mantra is “Do more with less,” are increasingly optimistic about the potential of artificial intelligence to transform the IT landscape. As with any technology, success is not a given.

Instead, AI requires a strong foundation — a data-centric architecture that optimizes compute power, storage and data to create both a powerful innovation incubator and a transformation accelerator.

IT managers within the federal government see the potential of AI. According to a recent study, 77 percent say this technology will change the way government thinks about and processes information, while 61 percent say the technology could solve one or more of the challenges their agency faces today.

The White House also jumped in with the creation earlier this year of an Interagency Select Committee on Artificial Intelligence, which aims to improve the coordination of federal efforts related to AI and to ensure continued U.S. leadership in this emerging, potential-rich field.

This committee continues to make progress by understanding the potential of this emerging technology and highlighting the need for a coordinated effort around the development of AI initiatives in the federal government. As the committee continues its work, it is important to consider the AI challenge holistically, starting with infrastructure.

Government and industry agree on the potential AI has to offer, but amid the enthusiasm, the key is overcoming adoption challenges. Primary barriers include cost, infrastructure readiness and sparse talent resources.

Here are our recommendations to the committee for how to overcome these challenges:

Explore use cases and benefits to weigh investment

Start by identifying actual use cases where the benefits of AI will be most pronounced. Whether AI is applied at a basic automation level or to data analytics processes, this will help you determine the return on investment you can expect and make the business case.

In the federal market, the National Institutes of Health is leveraging AI and machine learning to accelerate the pace of cancer and genomics research, and to gain a better understanding of how cancer evolves, how genetic activity affects quality of life and human development, and how to mitigate or prevent genetic disorders from afflicting people throughout their lives. Improvements in IT infrastructure are giving hours and days back to researchers, providing more time to search for answers, and reducing time spent waiting for queries to process.

Reducing the time required to diagnose cancer, genetic disorders and drug interactions has the potential to save many lives as well as billions of dollars in diagnostic costs, acute and ongoing care, and drug development timelines.

A varied approach to develop talent, acquire tools

To acquire the skills necessary for developing and deploying AI and deep learning techniques, it’s essential to engage in public-private partnerships and to provide research grants and fellowships to universities and commercial enterprises. The expertise is out there, and partnering with and bringing skilled people into government research will provide the fastest path to leveraging this technology in the government space.

When it comes to leveraging the technology, an enormous amount of time has already been spent building the tools necessary to start working with AI. Commercial industry has pioneered purpose-built, off-the-shelf hardware and software that contain everything required to create AI capabilities, and agencies can use these to get to work quickly.

Rather than drawing out the capabilities analysis and requirements determination process, the prototyping and exploration cycle, and ultimately the procurement and implementation phases, which can often take years, adopting these off-the-shelf tools enables the rapid acquisition of the capabilities necessary to get to work.

Put data at the center of infrastructure

As we think about AI, two new elements are forcing a change in data center architecture. First is the explosion of data creation. Second, software now allows us to mine that data for competitive advantage.

Data-centricity in infrastructure design, which keeps data and applications in place while the technology around them evolves, can help turn AI dreams into reality by fundamentally transforming data center architecture with data as its core design element.

While determining how best to incorporate AI initiatives, feds should already be focused on collecting and cleaning data in preparation for AI implementation, and on ensuring they are equipped to handle this explosive growth of data. As the size of data sets has increased exponentially, moving and replicating data has become a prohibitive expense and a bottleneck for innovation.
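
On the data preparation side, even a basic cleaning pass, deduplicating records and dropping incomplete rows before anything reaches an AI pipeline, pays dividends later. Here is a minimal sketch, assuming tabular data exported to CSV and the pandas library; the file name and column names are hypothetical:

    import pandas as pd

    # Load a raw agency export (hypothetical file and column names).
    raw = pd.read_csv("case_records_export.csv")

    # Basic cleaning before any AI or analytics work:
    # drop exact duplicates, remove rows missing required fields,
    # and normalize inconsistent text formatting.
    clean = (
        raw.drop_duplicates()
           .dropna(subset=["case_id", "opened_date", "status"])
           .assign(status=lambda df: df["status"].str.strip().str.lower())
    )

    clean.to_csv("case_records_clean.csv", index=False)
    print(f"Kept {len(clean)} of {len(raw)} records after cleaning")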

A new model is needed — enter the data-centric architecture.

The new architecture must be able to share and deliver data anywhere, any time: a data hub that can be the way forward for a modern government. A true data hub must include:

  • High throughput for file and object storage
  • True scale-out
  • Multi-dimensional performance that is built to respond to any data type with any access pattern, and
  • Massively parallel architecture.

These four features are essential to unifying data.
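
To make the unification idea concrete, here is a minimal sketch of what a single interface over file-style and object-style access could look like; the DataHubClient class, its methods and the endpoint name are hypothetical illustrations, not a real product API:

    from concurrent.futures import ThreadPoolExecutor

    class DataHubClient:
        """Illustrative sketch of a unified data hub interface (hypothetical)."""

        def __init__(self, endpoint: str):
            self.endpoint = endpoint
            # Stand-in for the shared, scale-out storage pool.
            self._store: dict[str, bytes] = {}

        def put(self, key: str, data: bytes) -> None:
            self._store[key] = data

        def read_file(self, path: str) -> bytes:
            # File-style access pattern (path-based), e.g. for legacy analytics tools.
            return self._store[path]

        def get_object(self, bucket: str, key: str) -> bytes:
            # Object-style access pattern (bucket/key) into the same underlying pool,
            # so data is not copied between a file silo and an object silo.
            return self._store[f"{bucket}/{key}"]

        def read_many(self, keys: list[str]) -> list[bytes]:
            # "Massively parallel": fan reads out concurrently rather than serially.
            with ThreadPoolExecutor(max_workers=8) as pool:
                return list(pool.map(self._store.__getitem__, keys))

    # Usage: both access patterns reach the same data, without replication.
    hub = DataHubClient("datahub.agency.example")
    hub.put("genomics/sample-001", b"...")
    assert hub.get_object("genomics", "sample-001") == hub.read_file("genomics/sample-001")

The point of the sketch is the design choice, not the code itself: one platform answers every access pattern against a single copy of the data, rather than forcing copies into separate file and object silos.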

Too much data remains stuck in a complex sprawl of silos. Each silo is useful for its original task, but in a data-first world, silos are counterproductive: data sitting in a silo cannot do work unless it is being actively managed.

For agencies that simply want to keep data stored, a data hub does not replace data warehouses or data lakes. For those looking to unify and share their data across teams and applications, however, a data hub identifies the key strengths of each silo, integrates their unique features and provides a single unified platform.

To truly benefit from a data-centric architecture, agencies need the system to work in real time, providing the performance necessary for the next-generation analytics that make AI so powerful. It also needs to be available on demand and be self-driving. Consolidating and simplifying the infrastructure on flash storage makes it far easier for teams to support the technology that is fueling tomorrow’s growth.

Storage has a unique opportunity to become much more than a siloed repository for the deluge of data constantly being generated: a platform that shares and delivers data to create value.

Computing power, networking and storage are the foundational elements that enable the incredible work of AI, but unless these elements move forward at the same rate, the process will be off balance and other technologies will be held back. A data-centric architecture that is developed with optimization in mind will create both a powerful innovation incubator and a transformation accelerator.

Gary Newgaard is the vice president, public sector at Pure Storage.
