Insight by Microsoft Federal

For M-21-31 compliance to work, agencies must shift to a modern logging mindset

While agencies may understand the transformative power of modern log management and data-driven strategies, implementing them is another story.

This content is provided by Microsoft Federal.

OMB’s August 2021 release of M-21-31 signaled a sweeping change in the way federal agencies collect and retain event logs across different systems and networks.

At its core, M-21-31 is intended to increase government visibility before, during, and after a cybersecurity incident. The memorandum establishes different logging maturity levels and sets retention requirements for active and long-term data storage. However, while M-21-31 requires agencies to expand the types of data collected and store it for longer periods of time, it offers limited guidance on what agencies should do with that information once it has been collected. This is because agencies have differing levels of resources and mission requirements, so operationalizing log data cannot be done with a one-size-fits-all solution.

It’s also important to realize that M-21-31 does not exist in isolation. The public sector has seen a wave of recent cybersecurity regulations, including EO 14028, EO 14110, M-22-09, and the National Cybersecurity Strategy. These underscore the administration’s focus on making cybersecurity a strategic priority.

While necessary for security, all of these executive actions strain agencies' collective resources and make it difficult to prioritize competing guidance. Smaller agencies often have limited funds and few dedicated security analysts, so absorbing these new directives can present significant challenges. While agencies may understand the transformative power of modern log management and data-driven strategies, implementing them is another story.

What does modern log management look like in practice?

There are two main types of data to consider when talking about modern log management: active data and long-term data. Oftentimes, long-term data is still important to the organization, typically for business intelligence or regulatory compliance, and should be archived on lower-cost storage tiers once it ages out of active use. Agencies cannot delete this data altogether, but they also don't need to pay for advanced analytics capabilities and premium storage to keep it. Long-term data can also be marked as read-only to protect against modification and encrypted to help guard against exfiltration in the event of a breach.
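As a rough illustration of that tiering logic, the sketch below classifies a log record into an active, archive, or expired tier based on its age. The 12-month active and 18-month cold-storage windows reflect the retention targets commonly associated with M-21-31, but the tier names, thresholds, and function shape here are illustrative assumptions, not prescribed values.

```python
from datetime import datetime, timedelta, timezone

# Retention windows (assumed): 12 months in active storage,
# then cold storage until roughly 30 months of total retention.
ACTIVE_DAYS = 365
TOTAL_DAYS = 365 + 548  # ~30 months total

def classify_log(event_time: datetime, now: datetime) -> str:
    """Return the storage tier a log record belongs in, by age."""
    age = now - event_time
    if age <= timedelta(days=ACTIVE_DAYS):
        return "active"    # hot SIEM storage, full analytics access
    if age <= timedelta(days=TOTAL_DAYS):
        return "archive"   # low-cost, read-only, encrypted at rest
    return "expired"       # eligible for deletion under the policy

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
print(classify_log(datetime(2024, 1, 1, tzinfo=timezone.utc), now))  # active
print(classify_log(datetime(2022, 1, 1, tzinfo=timezone.utc), now))  # archive
```

In practice this decision is usually expressed as a storage lifecycle policy rather than application code, but the logic is the same: age determines tier, and tier determines cost and access.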

By contrast, active data is the logging and alert data that comes from security tools, endpoints, networks, and identities. Unlike long-term data, active data must be immediately accessible to SOC analysts, forensic teams, and even AI and ML models in order for them to function. As more active data flows in, the cost of storing that information in a SIEM increases. This is particularly relevant for multicloud environments, which tend to produce a significant volume of active data.

Traditionally, SIEMs and the data hosted in them have been restricted to the SOC team. However, as multi-cloud environments grow more complex and security teams look for better ways to handle the growing volume of alerts, other roles within an agency increasingly need access to active data. Alert fatigue from this increased number of security signals can cause analysts to miss true positives that need to be investigated and remediated. Additionally, SIEMs can struggle to correlate signals quickly enough for SOC analysts to respond to a breach and prevent threat actors from establishing a foothold.

The case for using SIEMs and cyber data lakes collectively

SIEMs still have a role to play in security. However, agencies should augment them with modern solutions that are more accessible to the broader agency and more cost-efficient at storing large amounts of active data. For many organizations, that solution is a cyber data lake.

When combined with SIEMs, cyber data lakes offer a more holistic, comprehensive data approach that accounts for different categories of data. Data lakes act as a centralized or decentralized repository that ingests and stores large volumes of structured and unstructured raw data for processing and analysis. This provides core data consistency across a variety of applications, powering big data analytics, machine learning, predictive analytics, and other forms of intelligent action. Many private sector organizations rely on data lakes to keep raw data consolidated, integrated, secure, and accessible.
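As a minimal sketch of what raw ingestion into a data lake can look like, the example below appends JSON events to date-partitioned files under a source-specific path. The partitioning convention, field names, and file layout are illustrative assumptions; production lakes typically sit on object storage and use columnar formats, but the principle of landing raw, schema-flexible data in a consistent layout is the same.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def ingest_raw_event(lake_root: Path, source: str, event: dict) -> Path:
    """Append one raw event to a date-partitioned JSONL file in the lake.

    Partitioning by source and date keeps raw data organized and
    queryable by downstream analytics without imposing a fixed schema.
    """
    ts = datetime.now(timezone.utc)
    partition = (lake_root / source
                 / f"year={ts.year}" / f"month={ts.month:02d}" / f"day={ts.day:02d}")
    partition.mkdir(parents=True, exist_ok=True)
    out = partition / "events.jsonl"
    with out.open("a") as f:
        f.write(json.dumps(event) + "\n")
    return out
```

Because the raw record is stored as-is, the same event can later feed SIEM correlation, machine learning pipelines, or compliance reporting without re-collection.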

Using SIEMs and data lakes in tandem can also simplify the eDiscovery process, especially in hybrid and multi-cloud environments. SIEMs act as a familiar environment where SOC analysts can analyze key log data and act on meaningful security alerts. And because data lakes are a cost-effective way to store more types of data for longer periods, they can hold the contextual information and data query interface needed for threat hunters to properly investigate incidents. This is crucial for M-21-31 compliance, especially when it comes to storing high-volume data producers like endpoints and providing visibility with efficient long-term data storage and retrieval.
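To illustrate the kind of query interface threat hunters might use over that contextual data, the sketch below scans raw JSONL files in a lake for events matching an arbitrary predicate. The file layout, field names, and indicator value are hypothetical; real deployments would use an indexed query engine rather than a linear scan, but the hunt pattern, filtering broad historical data down to relevant events, is the same.

```python
import json
from pathlib import Path

def hunt(lake_root: Path, predicate) -> list:
    """Scan raw JSONL log files under lake_root for matching events."""
    hits = []
    for path in lake_root.rglob("*.jsonl"):
        with path.open() as f:
            for line in f:
                event = json.loads(line)
                if predicate(event):
                    hits.append(event)
    return hits

# Example: find events from a suspicious IP (hypothetical field name).
suspicious = hunt(Path("/data/lake"), lambda e: e.get("src_ip") == "203.0.113.7")
```

Because the lake retains long-horizon context cheaply, a hunter can run the same predicate across months of history, which is exactly the kind of retrospective visibility M-21-31's retention requirements are meant to enable.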

When it comes to cybersecurity, change is our only constant

Cybersecurity is a continuous process of learning and adapting to the limitations of your environment, as well as adversaries’ tactics, techniques, and procedures (TTPs). Mandates like M-21-31 are a way for the federal government to encourage agencies to evolve alongside our current threat landscape and implement modern security best practices.

For example, it takes nine months on average for businesses to identify and report a data breach. If agencies rely on SIEMs as their primary source of threat intelligence, they must often juggle cost and performance constraints. This leaves security teams with limited visibility into how their systems were breached or what threat actors did once inside the network. SOC analysts need sufficient breadth and depth of information to reduce dwell times and quickly contain lateral movement.

Modern log management also ties into the Zero Trust mindset of assuming breach. All agencies, regardless of how advanced their cybersecurity posture is, have to assume that they eventually will be breached. M-21-31 pushes the federal government to increase logging so that agencies have extended access to the data in the event of an incident. Robust log management programs can also act as a deterrent for attackers as security teams have better visibility and can implement controls rapidly, driving up the cost and effort needed to compromise systems by forcing threat groups to continuously change their TTPs.

Recent advances in large language models (LLMs), combined with the increased log data agencies now collect, give organizations a valuable pathway to move away from static defenses and instead prioritize dynamic, data-driven strategies. As agencies introduce new tools and technology into their ecosystems, they gain new ways to increase the value of the data collected under M-21-31. These investments also help modernize security controls and concepts, paving the way to use SIEMs and cyber data lakes for AI-driven operations.

To learn more about modern logging and how your agency can increase compliance with regulations like M-21-31, visit Microsoft Federal Cybersecurity.

Copyright © 2024 Federal News Network. All rights reserved.