Communication, training must drive agency AI adoption
Lucy Melvin, a principal at Deloitte Consulting, said employees must be part of the strategy to test and figure out use cases for artificial intelligence.
Back in April, President Joe Biden called for an artificial intelligence talent surge. The goal is to hire 500 AI experts by the end of fiscal 2025. That doesn’t include the Defense Department’s need to hire more than 2,500 AI experts this year alone and more than 9,000 over the next few years.
New data from July shows agencies have made more than 200 hires under this AI talent surge so far this year.
Additionally, data from the White House in April shows applications for AI and AI-enabling positions across the federal government more than doubled between January and March.
Across the government, agencies are taking steps to prepare their workforce for a future where AI is part of every job from IT to procurement to financial management.
It’s not just about hiring new workers; agencies also have to train existing employees in how to use these AI capabilities.
And, of course, the workforce is just one key piece to ensuring agencies are ready to adopt AI tools and capabilities.
Lucy Melvin, a principal at Deloitte Consulting, said even during the early days of agencies using AI, say between 2003 and 2016, departments saw a positive impact, with employees spending 1.6% less time on low-value tasks and 4.6% more time on high-value tasks.
“What that means is that we’re seeing a shift already, and the new technologies that we’re seeing, like generative AI, are going to continue that trend, and likely accelerate it,” Melvin said during the discussion AI and Human Capital: Redefining Collaboration in the Public Sector. “One thing that’s really important is to empower the workforce by giving workers access to tools, to democratize access to AI and Gen AI tools, and to allow them to experiment and understand these technologies. While they’re doing that, it’s also important that leaders put in place the guidelines and policies that are needed on appropriate use. This is something that the executive order and Office of Management and Budget mandates have laid out guidance on.”
Need to identify best, not-so-good use cases
OMB issued final guidance in May detailing a host of new requirements, including establishing AI governance boards led by deputy secretaries or an equivalent executive and ensuring training for current employees.
Melvin said it’s one thing to create policies and oversight, but it’s another for leaders to communicate expectations about those foundational actions to the workforce.
“It’s important to bring together teams to identify where the best use cases are and to have these safe spaces to get hands-on experience as a part of that. Something that we see is that it’s important for leaders to also identify what are not appropriate use cases for these technologies. Where is this something that is not going to be the right move for the organization? Where is it not an appropriate use in light of the mission?” she said. “Another thing that’s really important on that front is for leaders to be communicating about this. One of the most important things that we see in terms of appropriate adoption and really enabling the workforce is for leaders to communicate about their AI strategy.”
Melvin said sometimes leaders forget to open up the dialogue around new technologies or new processes. She said by inviting their teams to share ideas and ask questions, leaders can be more effective in implementing AI tools and capabilities.
A big piece of that communication is lowering the risk, real or perceived, of using new technology.
Melvin said this is why, just as important as communication, agencies need to democratize access to the technology by putting in place sandboxes where employees can experiment and then ask questions that help them identify issues they may not have anticipated.
Building AI fluency
“By creating that expectation that employees are part of the creation, the testing and, ultimately, if there are decisions to move forward, the implementation and adoption, employees are part of that decision process,” Melvin said. “The focus on enabling the workforce is where the literacy and fluency pieces are really important, because a big part of the communication is giving access to information about what these technologies are, how they can be used and how they’re impacting the way work is being done. That requires leadership communicating, but also that cascading across the organization of leaders bringing together their teams and having those strategic discussions about how they’re going to use AI and Gen AI, then looking at what that means for the organization and what they need to communicate to the broader workforce.”
That broader workforce strategy has to include investing in building AI fluency. Agencies need to upskill and reskill current employees in how they use new and existing capabilities. For example, robotic process automation (RPA) can accelerate and supplement back-office functions like finance, procurement and human resources.
“Some parts of the workforce, like the technology workforce, are going to need to really augment their skills, upskill and reskill. We’re starting to see some of those efforts happen, and it’s a really critical part of this change,” Melvin said. “This is really about amplifying humans’ ability to deliver work and augmenting human work. When I talk to leaders across government, their challenge is having enough workers to deliver on the mission. They’re looking at how they can identify these tools to allow them to deliver the mission they’re charged with, with the workforce that they have. That’s why adoption of these tools is so important right now, and really focusing on that as part of the strategy.”
For more in the series, Artificial to Advantage: Using AI to Advance Government Missions, click here.