New SEC rules: The murkiness of materiality
Sam is a data security engineer at a Fortune 500 company. For the past three weeks, she and her team have been managing the fallout of a major data breach, which, according to her boss, is expected to cost upwards of $200 million in lost business. As a shareholder herself, Sam is worried, and she's now thinking of selling her stock in case the price tanks at the earnings call.
Meanwhile, 4,000 miles away in a furtive St. Petersburg office, a member of a Russian hacking crew, now satisfied their attack was successful, just shorted the stock. All he has to do now is wait.
This story is fabricated and grossly oversimplified, but it highlights a risk the Securities and Exchange Commission believes goes unaddressed by current cyber reporting rules. Consequently, beginning in September, new rules will require public companies to report any material cyber breach to the SEC within four days.
“Delayed reporting of cybersecurity incidents can result in mispricing of securities,” the final rule notes, and “such mispricing can be exploited by threat actors.”
Before their finalization, the proposed rules received over 200 comments from industry groups, universities and legal firms. Many of them highlight the complexities of cyber incident reporting and the many potential conflicts it has with the transparency the SEC seeks to provide investors.
When the clock starts ticking
One contentious issue in the lead-up to the ruling – and one of the most reported since – is the required notification window of just four days. While perhaps not immediately obvious, the four-day timer doesn’t start when an incident occurs or even when it is detected (with the industry’s current mean time to detection exceeding 200 days, four days would be unrealistic to say the least). No, the clock starts ticking when the affected organization has determined the materiality of an incident: “[reports] must be filed within four business days of determining an incident was material,” the final rule states.
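To make the timing concrete, here is a minimal sketch of how a team might compute that deadline, assuming weekends are the only non-business days (real filing calendars also exclude federal holidays; the function name and dates are illustrative, not from the rule):

```python
from datetime import date, timedelta

def filing_deadline(determination_date: date, business_days: int = 4) -> date:
    """Return the disclosure deadline: `business_days` business days after
    the date the incident was determined to be material.
    Simplified sketch: skips weekends only, not federal holidays."""
    deadline = determination_date
    remaining = business_days
    while remaining > 0:
        deadline += timedelta(days=1)
        if deadline.weekday() < 5:  # Monday=0 .. Friday=4
            remaining -= 1
    return deadline

# Example: materiality determined on a Thursday
print(filing_deadline(date(2023, 12, 14)))  # -> 2023-12-20, the following Wednesday
```

Note that nothing in this calculation depends on when the breach happened or was detected; the clock is anchored entirely to the materiality determination.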
…of material interest to investors
When millions can be lost in minutes, there is understandable urgency in assessing whether a cyber incident would be material to investors. So even before the four-day timer starts, the SEC initially insisted that such determinations be made “as soon as practicable.” Concerned commenters pushed back, citing fears that assessments might be “rushed prematurely.”
I don’t think it’s unreasonable for investors to expect a certain level of detail surrounding reports of a “material incident.” How company operations, income forecasts or its prospects for growth are affected should no doubt be factored into any investor’s analysis.
Investors might also expect at least some supporting evidence of how such determinations are made. If a company reports that quarterly earnings are expected to tank due to a “material cyber incident” but provides no further details, I expect many investors would be left scratching their heads.
The breadth, severity and highly varied nature of cyberattacks would surely warrant further interrogation from investors. Was a large ransom paid to the perpetrators of a ransomware attack? Was there a data breach and the company now fears reputational damage? How did the attack happen in the first place and how can investors be sure that improved security practices will be adopted moving forward?
While these are completely reasonable questions, the answers could prove dangerous if made public.
The delicate art of “just the right amount of information”
At its core, cybercrime is about information. Technical details of corporate systems, potential vulnerabilities and information about a company’s operations are crucial to would-be attackers. Disclosing any information about how an attack was pulled off could open the door to similar attacks on similarly situated companies.
Thankfully, following pushback from public commenters, the SEC removed from the final rule the requirement to disclose details of how a cyber incident took place and whether or not it had been remediated.
Even so, simply knowing whether or not an attack was detected by the victim is useful information to an attacker. A criminal gang might test offensive tactics on one company, wait to see if an incident is reported and, if after several months it hasn’t been, deploy the attack on more victims with increased confidence and vigor.
The murkiness of materiality
The murkiness of materiality reveals additional challenges for the SEC: reporting accuracy and the granularity with which incidents are reported. In modern IT environments it can be difficult to reliably audit systems and data access. In the typical legacy systems of the Fortune 500, doing so across all systems and for every staff member, vendor or customer is nigh-on impossible. So when compromised credentials or unauthorized access are detected, how might the effects of that discovery be reported?
Unauthorized access is probably material. But how material? What is the threshold for materiality from an investor’s perspective? Setting the threshold too low could result in a sea of noise, while setting it too high could undermine the SEC’s goals entirely.
A few hundred email addresses stolen from a marketing list is vastly different from an attacker dumping an entire customer database before anyone noticed. In many data breaches, far too little information is available to reliably differentiate between the two, leaving victims no choice but to either report the reasonable worst-case outcome or downplay the material impacts altogether.
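Differentiating the two cases comes down to whether the victim can answer a very specific question: which records did the compromised credential actually touch, and when? As a purely hypothetical illustration (the event data, credential and record IDs below are invented), the scoping step looks roughly like this, and it is only possible if per-record access events were being captured before the incident:

```python
from datetime import datetime

# Hypothetical audit events: (timestamp, credential, record_id).
# In practice these would come from a per-record access log.
events = [
    (datetime(2022, 9, 20, 3, 14), "svc-api-7", "cust-001"),
    (datetime(2022, 9, 20, 3, 15), "svc-api-7", "cust-002"),
    (datetime(2022, 9, 20, 9, 30), "alice", "cust-003"),
]

def records_touched(events, credential, start, end):
    """Distinct record IDs accessed by `credential` within the window."""
    return {rid for ts, cred, rid in events
            if cred == credential and start <= ts <= end}

scope = records_touched(events, "svc-api-7",
                        datetime(2022, 9, 20), datetime(2022, 9, 21))
print(f"{len(scope)} records confirmed accessed")  # -> 2, not the whole table
```

Without that log, the honest answer collapses to “everything in the table,” which is exactly the over-reporting dynamic described above.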
Take the attack on Australian telecommunications company Optus last year. It was initially reported that 10 million records were breached, either because that is how many records were stored in the compromised system or because that is what the attacker claimed; it’s very likely the company did not know exactly which records were accessed. Some insiders now claim that a three-month investigation found only a fraction of the 10 million records had actually been accessed. By that point the damage had been done, with customers leaving in droves for the company’s biggest rival.
Is corporate America ready?
In my view, U.S. corporations are ill-equipped to meet the new SEC rules. But their problems aren’t limited to the SEC. Multiple agencies, including the Federal Communications Commission, the FBI and the Cybersecurity and Infrastructure Security Agency, already have strict cyber incident and data breach reporting requirements, and these are set to become even more stringent. With the proliferation of data, slow adoption of new technologies such as encryption-in-use, and alarming levels of over-access, companies face an uphill battle.
According to a survey conducted by the Ponemon Institute in 2021, 70% of employees have access to data they should not see, and 62% of IT security professionals say their organizations have suffered a data breach due to employee access.
Not only are appropriate access controls not being implemented, but the controls that are implemented are often ineffective. While there are many products available for access control management, even the best “SecOps” teams will tell you that ensuring every system has exactly the right controls applied is practically impossible, a task made even more difficult when data is constantly on the move.
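To see why, consider that verifying even one system means reconciling the permissions actually granted against what policy intends, and both sets change constantly. A minimal sketch of that reconciliation (every principal and permission name here is invented):

```python
# Intended policy vs. permissions actually granted on a single system.
intended = {("alice", "crm:customers:read"),
            ("bob",   "crm:customers:read")}
granted  = {("alice", "crm:customers:read"),
            ("bob",   "crm:customers:read"),
            ("bob",   "crm:customers:export"),  # drift: excess access
            ("carol", "crm:customers:read")}    # drift: unknown grantee

excess  = granted - intended   # over-access nobody signed off on
missing = intended - granted   # access the policy expects but isn't there

print("excess:", sorted(excess))
print("missing:", sorted(missing))
```

Multiply this check across every system, vendor and data store, with grants changing daily, and “exactly the right controls everywhere” stops being a realistic target.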
Worse still, access auditing is often an afterthought rather than built in by design. When defensive measures fail, being able to reliably identify what was accessed not only aids remediation but will increasingly prove crucial in determining materiality.
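Building auditing in from the start can be as simple as ensuring every data-access path emits a structured event before returning anything. A minimal sketch, with hypothetical names throughout, of what such an audit-by-design wrapper might look like:

```python
import json
import logging
from datetime import datetime, timezone
from functools import wraps

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("audit")

def audited(resource: str):
    """Decorator that emits a structured audit event on every access,
    so "what exactly was touched?" is answerable after an incident."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(principal, record_id, *args, **kwargs):
            audit_log.info(json.dumps({
                "ts": datetime.now(timezone.utc).isoformat(),
                "principal": principal,
                "resource": resource,
                "record_id": record_id,
            }))
            return fn(principal, record_id, *args, **kwargs)
        return wrapper
    return decorator

@audited("crm:customers")
def get_customer(principal, record_id):
    return {"id": record_id}  # stand-in for a real database lookup

get_customer("svc-api-7", "cust-001")  # emits one structured audit event
```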
Obstacle or opportunity?
Major changes to corporate law often attract controversy, particularly when such changes threaten to hit the bottom line. However, some of the most contentious legislative changes of the past few decades have proved lucrative for opportunistic CEOs and founders. The General Data Protection Regulation (GDPR), the Sarbanes-Oxley Act and even the U.S. Consumer Protection Act have each opened the door to new products, service offerings and entirely new markets.
At the same time, increased regulatory pressure often reduces productivity and increases costs (at least in the short-term). In this case, while the delivery isn’t perfect, I think bringing cyber incidents into sharper focus is a step in the right direction.
Dan Draper is Founder and CEO of CipherStash.