Insight by EY

EY’s own experience with AI provides important lessons

Anja Allen, a principal at EY, said the internal rollout of Microsoft Copilot offered insights into what goes into successful AI implementations.

Over the last 25 years, the federal government has experienced several technology disruptions. The internet changed the way agencies could deliver services.

The cloud made computing cheaper, better and faster.

But the biggest disruptor, according to many experts, may have hit the scene in a big way only in the last three or so years. Artificial intelligence has affected agencies from the top down and the bottom up, and it will continue to do so.

The disruption AI is bringing to agency mission and back-office areas takes many forms. It’s changing how work gets done.

The use of AI requires new ways of thinking and managing, as well as new skillsets, to be successful. And, of course, it’s changing the risk factors that are important to mission areas.

Anja Allen, a principal at EY, said agencies need to get ahead of the disruption by understanding what it does or could mean to their mission areas and how to ensure they are getting value from the use of AI tools.

“There’s a lot of pressure from executives in many organizations to make sure that the organizations looking at AI are adopting it in the right way,” Allen said during the discussion Government Modernization Unleashed: AI, Change and Opportunity. “The good news is that it is not a black and white scenario. You can turn off features that you might be concerned about. There’s a lot of flexibility in how the technology can be deployed, and that’s going to be important to think about. It’s not a hard ‘no,’ where we take this black and white approach, which we tend to do when security is in play. We really need to think about how do we need to change our processes, our security posture or, as a whole, does the solution need to get changed in order to meet our security requirements?”

AI’s potential is coming into focus

AI is not new by any means. Allen said she took classes that covered certain aspects of AI in the early 1990s. But what’s different today is that AI capabilities are more accessible, and the rise of generative AI has made this disruption happen at a faster pace.

“I think one of the biggest issues we’re seeing is that you have to really take a step back and think about how disruptive can this be potentially for us, or how can we do things fundamentally different,” Allen said. “That is probably the biggest opportunity that everybody, whether it’s a commercial organization or a government organization, has right now to take a step back and to think more out of the box, not just look at it as a productivity enhancement, but really think about it. How can we use this to do things differently?”

Allen said EY took this proactive approach over the last year by implementing Microsoft Copilot internally.

She said EY designed the internal pilot to increase employee productivity by having Copilot crawl SharePoint sites and summarize or pull key information out of documents.

“When I want to create a document, it really taps into our knowledge base and I have a really easy way of doing that now by just prompting Copilot to say, ‘find me any information about this new technology. Get me data about AI adoption or give me a good example that might be more relevant to our topic.’ It will essentially crawl our environment and get me all the data,” Allen said. “What is more important is it will give me the references so I can actually read the origin documents and get into more detail, because there’s always going to be hallucinations. So having that human-assisted attitude to AI is very important.”
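The workflow Allen describes follows a common retrieval-with-citations pattern: fetch the relevant internal documents first, have the model answer only from those excerpts, and hand the source references back to the reviewer so the answer can be verified. The sketch below is a generic, minimal illustration of that pattern, not EY’s Copilot configuration; the in-memory document store, the keyword scorer and the call_model() stub are hypothetical stand-ins.

```python
"""Minimal sketch of retrieval with citations for human review.

Generic illustration only; the document store, scorer and call_model()
stub are hypothetical stand-ins, not any vendor's actual API.
"""

# Hypothetical stand-in for an internal knowledge base
# (e.g., documents crawled from SharePoint sites).
DOCUMENTS = {
    "doc-001": "Survey results on AI adoption rates across federal agencies.",
    "doc-002": "Case study: using generative AI to summarize policy documents.",
    "doc-003": "Guidance on data privacy reviews for new AI deployments.",
}


def retrieve(query: str, top_k: int = 2) -> list[tuple[str, str]]:
    """Return the top_k (doc_id, text) pairs sharing the most words with the query."""
    query_words = set(query.lower().split())
    scored = [
        (len(query_words & set(text.lower().split())), doc_id, text)
        for doc_id, text in DOCUMENTS.items()
    ]
    scored.sort(reverse=True)
    return [(doc_id, text) for _, doc_id, text in scored[:top_k]]


def call_model(prompt: str) -> str:
    """Placeholder for the actual model call (Copilot, an internal LLM, etc.)."""
    return f"[model response to: {prompt[:60]}...]"


def answer_with_sources(question: str) -> dict:
    """Answer from retrieved excerpts only, and return sources for human review."""
    sources = retrieve(question)
    context = "\n".join(f"[{doc_id}] {text}" for doc_id, text in sources)
    prompt = (
        "Answer using only the excerpts below and cite the [doc-id] you used.\n"
        f"Excerpts:\n{context}\n\nQuestion: {question}"
    )
    return {
        "answer": call_model(prompt),
        # The reviewer reads the original documents before relying on the answer.
        "sources": [doc_id for doc_id, _ in sources],
    }


if __name__ == "__main__":
    result = answer_with_sources("Get me data about AI adoption")
    print(result["answer"])
    print("Verify against these source documents:", result["sources"])
```

The point of returning the source identifiers alongside the answer is the "human-assisted" step Allen emphasizes: the person, not the model, decides whether the cited documents actually support the output.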

Key lessons for adoption

As with most new technologies, there was initial excitement across EY to use Copilot, but usage dropped significantly after about 30 days.

Allen said EY employees stopped using the tool because, without additional training, they had exhausted what they could figure out on their own.

“People didn’t engage with the tool anymore, and that was not just a handful of people. That was a pretty prevalent occurrence across the data set we had globally. It wasn’t just an issue with a certain set of people by age or otherwise,” she said. “We took that lesson learned and really made sure now in the current rollout, which will probably make us one of the largest Copilot rollouts globally as we are targeting at least 100,000 employees right now, and that obviously can expand over time, that we are very much focused on teaching people and communicating to them about what is the art of the possible with Microsoft Copilot. With our other internal large language model, it’s really proactively pushing sometimes small tidbits of prompt engineering to people to say, ‘Were you wondering how you can analyze your spreadsheet? Here are the three steps,’ and then how you can prompt Microsoft Copilot to take care of this.”

The successful implementation of Copilot at EY isn’t any different than it would be at any other large organization. Allen said the key factors of training and executive support drive change management across the organization.

She said agencies, like EY, need to consider how best to maximize this investment in AI tools and what roles technology, chief human capital and other executives play in the rollout.

Allen added that agencies need to balance user needs against security, privacy and risk management in any AI implementation.

“I do think that needs to be a very deliberate effort. There’s areas where it probably is black and white, and you say, ‘hey, for right now, we’re not going to be able to deploy the technology because of security concerns, limitations around data.’ However, I do think there’s areas where you should be proactively looking at the solution, but I also always keep an eye on not just data security, but also data privacy,” she said. “I do think when you’re starting to consider AI solutions, let’s bring in security right in the beginning when we are testing the solution, when we are developing solutions. I think the same kind of approach needs to be taken here by looking at what are the potential risk factors. What rules and regulations do we have to follow from a data privacy or from an intellectual property perspective, and how can we be proactive about it?”

