As cybercriminals and nation-state actors continue their daily assaults on the U.S. government’s most sensitive data and networks, agency IT leaders are fast-tracking the adoption of zero trust architecture – especially in a post-pandemic world filled with more remote workers and cloud environments.
The zero trust model is not a new concept for many private sector businesses. It is a strategic approach that prevents data breaches by removing the assumption of trust from an agency’s network architecture – so effective that the Biden Administration mandated its adoption in May 2021 as part of the Executive Order on Improving the Nation’s Cybersecurity. Federal IT and cybersecurity professionals must therefore shift from relying on perimeter-based protection to adopting a framework that relies on mutual authentication and eliminates implicit trust.
Regarding data and analytics, the executive order mandates that federal agencies start “prioritizing the identification of unclassified data that is considered to be the most sensitive and under the greatest threat and begin working to address how that data will be processed and stored within 90 days.” This requirement highlights the strategic importance of adopting a zero trust approach in government, especially for agencies that plan to use self-service analytics – analytics that empower users to access data on their own and gain valuable insights without prior experience.
However, zero trust has its tradeoffs. While it remains a successful strategy for combatting cyberattacks and the criminals who commit them, it presents various challenges for government agencies looking to modernize their data analytics platforms and frameworks.
Government agencies looking to comply with the new executive order and achieve their analytics goals must first address the issue of data silos.
Data silos create barriers to decision-making, reduce transparency and hinder data sharing within an agency environment. But as government leaders work to comply with the executive order’s requirements, they are faced with lingering questions about how to empower users to become data-savvy while also implementing zero trust.
Traditionally, agencies have focused on protecting application programming interfaces (APIs), segmenting networks, and building new data silos to protect sensitive government data. However, enforcing policy decisions at the data level has received minimal attention when it comes time to roll out a comprehensive strategy.
In the past, this may have worked. But as government agencies plan to launch data analytics in a zero trust environment, not enforcing policy decisions at the data level will create a significant backlog, not only in operations but also on the technical side.
When data silos are created, whether through restricted access, copying data to enforce policy, or challenges in analyzing data, government agencies are consequently limiting employees’ abilities to gain actionable data insights. Data sharing then becomes nearly impossible, and the data cannot sync with the authoritative data source.
Achieving data analytics in a zero trust environment
Policies surrounding data access must be enforced across the many modern and legacy systems that exist in the federal government. These systems require a team of software developers and database administrators (DBAs) to not only create and maintain policies but also to synchronize them. The result is operational overload, while improvements in data access remain stagnant.
To deploy data analytics in a zero trust environment, government agencies can take various steps to break down data silos, empower their workers to become data savvy, and secure data in alignment with the executive order’s zero trust strategy requirements. These steps include:
Deploy a single policy approach that can be enforced across networks and technologies. This provides data protection without hindering performance.
Focus on enforcing and automating data use agreements between agencies and organizations. The executive order makes this an operational requirement when sharing data with the Cybersecurity and Infrastructure Security Agency, and it is critical for data sharing more broadly.
Control access through subscription policies. This allows for data ownership by restricting access to everyone unless they are manually added to the data set. This also gives data stewards the power to build policies based on user roles and groups.
Use an attribute-based access control model that integrates user and data attributes to enforce data access, governance and privacy policies. This model aligns with the National Institute of Standards and Technology guidelines required to build a zero trust architecture for data analytics.
Ensure users accept a data use agreement upon accessing any data. This ensures anyone who analyzes datasets remains compliant with their agreement.
Log all activity and interactions with data for auditing purposes, including when adding or removing metadata tags from a dataset, granting user access, creating user queries and other data-related actions.
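The steps above can be sketched in code. The following is a minimal, hypothetical illustration – the class names (`User`, `Dataset`, `PolicyEngine`) and attribute keys are assumptions for the sketch, not a real product API – showing how deny-by-default subscription policies, data use agreement (DUA) acceptance, attribute-based checks and audit logging can compose into a single access decision:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class User:
    name: str
    attributes: dict                              # e.g. {"clearance": "secret"}
    accepted_duas: set = field(default_factory=set)

@dataclass
class Dataset:
    name: str
    tags: dict                                    # e.g. {"sensitivity": "secret"}
    subscribers: set = field(default_factory=set) # deny-by-default membership

class PolicyEngine:
    """Hypothetical single policy layer enforced at the data level."""

    def __init__(self):
        self.audit_log = []

    def _log(self, event, user, dataset):
        # Every decision and interaction is recorded for auditing.
        self.audit_log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "event": event, "user": user.name, "dataset": dataset.name,
        })

    def subscribe(self, steward, user, dataset):
        # Subscription policy: access is denied to everyone until a data
        # steward explicitly adds the user to the dataset.
        dataset.subscribers.add(user.name)
        self._log(f"subscribed by {steward}", user, dataset)

    def can_access(self, user, dataset):
        # 1. Subscription check (deny by default).
        if user.name not in dataset.subscribers:
            self._log("denied: not subscribed", user, dataset)
            return False
        # 2. The user must accept a data use agreement before any access.
        if dataset.name not in user.accepted_duas:
            self._log("denied: DUA not accepted", user, dataset)
            return False
        # 3. Attribute-based check: user attributes vs. dataset tags.
        if user.attributes.get("clearance") != dataset.tags.get("sensitivity"):
            self._log("denied: attribute mismatch", user, dataset)
            return False
        self._log("access granted", user, dataset)
        return True

# Usage
engine = PolicyEngine()
alice = User("alice", {"clearance": "secret"})
threats = Dataset("threat-intel", {"sensitivity": "secret"})

print(engine.can_access(alice, threats))   # False: not yet subscribed
engine.subscribe("steward-bob", alice, threats)
alice.accepted_duas.add("threat-intel")
print(engine.can_access(alice, threats))   # True: all three checks pass
print(len(engine.audit_log))               # 3: every decision was logged
```

The design point is that all of these checks live in one policy layer rather than being re-implemented per database or network segment, which is what allows a single policy to be enforced consistently across systems.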
As government agencies shift to the zero trust model to combat evolving cyber threats, a primary focus should be on removing data silos. When data silos are broken down, government agencies can increase collaboration, improve constituent relationships, better understand what data is available and empower their employees to make informed decisions using secure data analytics programs.
Brian Shealey is vice president of public sector sales at Immuta and Chris Brown is director of solutions architecture for the public sector at Immuta.