Insight by Booz Allen Hamilton

Agencies look to use data to drive customer experience

Agencies are increasingly turning to design thinking, digital tools and their own data stores to drive forward their “customer experience” initiatives.

The Biden administration’s latest budget request signaled a whole-of-government focus on the concept of CX. The 2024 proposal seeks to add 120 “customer experience experts” across government to “conduct human-centered design and digital service delivery.”

Agencies are increasingly looking to make better use of customer feedback data to improve citizen-facing services, while also streamlining access to the data needed to verify eligibility for federal benefits programs.

“Data and experience are becoming the last two hard problems,” Dan Tucker, senior vice president for the civil sector at Booz Allen Hamilton, said on Federal News Network.

“It’s so much easier today to spin up a cloud infrastructure,” Tucker said. “Low code/no code platforms make it so much easier to build an application front end these days. But data has gravity. Data is dirty. Data is siloed. And that is really still a very difficult problem.”

Organizations are increasingly treating data as a product by establishing “data product managers” who are responsible for data quality, sharing and standards, Tucker continued. And chief data officers are using a combination of “carrot and stick” to incentivize sharing in some cases and require it in others.

“The challenge is really the policies, the patterns, the processes that allow that technology to enable the data sharing,” Tucker said. “That’s the friction that I’m seeing in a lot of places today.”

While agencies had previously explored pooling massive amounts of data into vast warehouses such as “data lakes,” that approach has become increasingly impractical for a variety of reasons, not least the high cost of moving data around and maintaining those stores.

One concept that’s increasingly gaining traction is the “data mesh,” a relatively new buzzword describing an architecture in which data stays with the organization that owns it, but is made available to others through application programming interfaces (APIs) and other mechanisms.
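The idea can be sketched in a few lines of code. In this hypothetical Python sketch (all class and method names are illustrative, not taken from any specific product), each domain keeps its own store and publishes it only through a narrow query interface, so consumers never copy the underlying records into a central lake:

```python
# Hypothetical data-mesh sketch: each domain owns its data and exposes
# it through a narrow, API-like query interface instead of shipping
# records to a central data lake. All names here are illustrative.

class DomainDataProduct:
    """A domain-owned dataset published behind a query interface."""

    def __init__(self, owner, records):
        self.owner = owner          # the organization that owns the data
        self._records = records    # data never leaves this object directly

    def query(self, **filters):
        """Return matching records; stands in for a REST/GraphQL endpoint."""
        return [
            r for r in self._records
            if all(r.get(k) == v for k, v in filters.items())
        ]


class DataMesh:
    """A registry of domain data products, not a copy of their data."""

    def __init__(self):
        self._products = {}

    def register(self, name, product):
        self._products[name] = product

    def query(self, name, **filters):
        # The mesh routes each query to the owning domain's interface.
        return self._products[name].query(**filters)


mesh = DataMesh()
mesh.register("benefits", DomainDataProduct(
    owner="Benefits Agency",
    records=[{"id": 1, "state": "VA"}, {"id": 2, "state": "MD"}],
))
print(mesh.query("benefits", state="VA"))  # [{'id': 1, 'state': 'VA'}]
```

The design point is that the mesh holds only references to domain-owned products; ownership and stewardship stay with the source organization.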

“That seems to get us to the step function change that we need, as opposed to pouring everything into a data lake, which takes a lot of time and is particularly expensive to operate,” Tucker argued.

The data mesh also operates on a set of governance policies to ensure quality, as well as access and security. The latter issue is particularly important as agencies move to adopt zero trust architectures that aim to “verify anything and everything attempting to establish access,” in the words of the White House’s zero trust strategy.
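In practice, that governance layer amounts to an explicit policy check on every access attempt, with denial as the default. A minimal Python sketch, where the datasets, roles and policy table are invented for illustration:

```python
# Minimal zero-trust-style access check: every request is verified
# against a policy before any data is returned, and unknown datasets
# or roles are denied by default. All names below are illustrative.

ACCESS_POLICY = {
    # dataset -> roles allowed to read it
    "feedback_surveys": {"cx_analyst", "data_steward"},
    "eligibility_records": {"caseworker"},
}

def fetch(dataset, user_role, store):
    """Verify the request explicitly; deny by default."""
    allowed = ACCESS_POLICY.get(dataset, set())
    if user_role not in allowed:
        raise PermissionError(f"{user_role!r} may not read {dataset!r}")
    return store[dataset]

store = {
    "feedback_surveys": ["survey-001"],
    "eligibility_records": ["case-17"],
}
print(fetch("feedback_surveys", "cx_analyst", store))  # ['survey-001']
```

A real deployment would verify identity, device and context on each request rather than a static role table, but the deny-by-default shape is the same.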

“The technologies are in place,” Tucker said. “It’s just how do you put the processes in place to say, ‘Okay, this is going to be our mechanism for ensuring trust, and ensuring the security and privacy of the data we have.’”

Training and education will also be a major factor in whether agencies can successfully manage and protect their data, and use it to introduce better public-facing services.

While concepts like the “data pipeline” and “data operations” may seem intimidating, Tucker said those represent the cutting edge of data skills. Meanwhile, the key standards around metadata, data quality and other basic data management processes are well established.

“The data pipeline concept and concepts around data ops have emerged over the past five years or so, but up to that point, the standards around good data management are mature and really well understood,” he said. “It wouldn’t take a long time for somebody to get up to speed on them.”
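Those mature data-management basics are simple to express in code. A short Python sketch of a record-level data-quality check of the kind that might sit at the front of a pipeline (the field names and rules are made up for this example):

```python
# Basic data-quality validation of incoming records before they enter
# a pipeline. Field names and rules are illustrative only.

REQUIRED_FIELDS = {"id", "agency", "submitted_at"}

def validate(record):
    """Return a list of data-quality problems (empty list = clean)."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if not str(record.get("id", "")).strip():
        problems.append("empty id")
    return problems

clean = {"id": "42", "agency": "GSA", "submitted_at": "2024-03-01"}
dirty = {"id": "", "agency": "GSA"}
print(validate(clean))  # []
print(validate(dirty))
```

Checks like these, plus consistent metadata describing each field, are the long-established fundamentals the newer pipeline and DataOps tooling builds on.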


Copyright © 2024 Federal News Network. All rights reserved.