The SEC’s recent enforcement action against RR Donnelley & Sons is the latest in a series of proceedings in which the agency has broadly interpreted the scope of the Exchange Act’s internal controls provisions. That approach has been sharply criticized by dissenting commissioners and by outside commenters, but in a recent “Radical Compliance” blog, Matt Kelly entertains the possibility that the SEC’s view of the world may be right.
Matt points out that Section 13(b)(2)(B) of the Exchange Act requires companies to maintain internal accounting controls “sufficient to provide reasonable assurances” that, among other things, access to assets is permitted only according to management authorization. He notes that in this enforcement proceeding, the SEC is taking a provision intended to apply to accounting fraud and applying it to cybersecurity – but as he explains in this excerpt, this isn’t necessarily an unreasonable position:
Is it really proper for the SEC to use its books-and-records provision in that manner? Honestly, I dunno. On one hand, we should remember that no actual fraud happened at Donnelley. No transactions were improperly recorded. The company didn’t even suffer a loss of data, since the data was only copied.
On the other hand, Donnelley was locked out of important IT systems. For example, some customers couldn’t receive documentation vital to vendor payments and disbursement checks. If this cyber attack happened in the real world, it would be akin to hooligans strolling into your building, changing the locks to the accounting department, and demanding millions if you want to get the set of new keys. A company that let something like that happen would certainly seem inept to most reasonable investors.
Critics of the SEC (and lord knows there are plenty around) would say the Donnelley case is a novel interpretation of anti-fraud rules, with the SEC basically nosing its way into cybersecurity regulation. That seems outside the SEC’s swim lane.
Then again, suppose those hackers had exploited sloppy cybersecurity controls to steal money from Donnelley rather than copying data, and then covered their tracks by altering the finance department’s banking records. (A frighteningly easy thing to do, by the way.) Few people would fault the SEC for raking Donnelley over the coals then. So why does this case feel a bit weird now, when money wasn’t stolen?
Matt suggests that we step back and look at the big picture – as technology has advanced, the controls required for strong financial reporting and those required for strong cybersecurity are converging into a single system focusing on access control. In this new reality, it’s essential to have strong controls to prevent unauthorized access to IT systems, rather than the historical norm of controls governing access to the accounting department and its physical books and records.
You might think that with all the negative attention from regulators about audit quality issues over the past few years, shareholders might be a little more hesitant to vote in favor of proposals to ratify auditors. However, according to a recent Ideagen/Audit Analytics report, if you thought that, you’d be wrong:
Throughout the last six years, our analysis of shareholder votes reveals that, on average, nearly 98% of total votes are cast in favor of auditor ratification. Shareholder votes filed between January 1, 2021 and December 31, 2023, continued that trend for a sixth consecutive year. Votes against auditor ratification comprised 1.7% of the total votes; abstentions accounted for the remaining 0.4% of total shareholder votes cast.
In fairness, the percentage of proposals in which more than 5% of the outstanding shares voted against ratification of the auditors ticked up last year from 7% to 8%, but that still leaves 92% of proposals in which fewer than 5% of the outstanding shares were voted against ratification. Maybe shareholders ought to be a little more reluctant to toe the party line here, because a 2023 study found that higher than expected shareholder dissatisfaction with external auditors is associated with improved audit quality.
Yesterday, Corp Fin Director Erik Gerding issued a statement addressing concerns expressed by some registrants that the SEC’s rules requiring disclosure of material cybersecurity incidents in an Item 1.05 Form 8-K preclude registrants from sharing information beyond that disclosed in the 8-K with others, including contractual counterparties. Director Gerding’s statement clarifies that this is not the case, and that Regulation FD offers various alternatives for sharing this information without raising selective disclosure concerns:
There are several ways that a public company can privately share information regarding a material cybersecurity incident beyond what was disclosed in its Item 1.05 Form 8-K without implicating Regulation FD. For example, the information that is being privately shared about the incident may be immaterial, or the parties with whom the information is being shared may not be one of the types of persons covered by Regulation FD.
Further, even if the information being shared is material nonpublic information and the parties with whom the information is being shared are the types of persons covered by Regulation FD, an exclusion from the application of Regulation FD may apply. For example, if the information is being shared with a person who owes a duty of trust or confidence to the issuer (such as an attorney, investment banker, or accountant) or if the person with whom the information is being shared expressly agrees to maintain the disclosed information in confidence (e.g., if they enter into a confidentiality agreement with the issuer), then public disclosure of that privately-shared information will not be required under Regulation FD.
The statement notes that while companies may be reluctant to share additional information about cybersecurity incidents with third parties, companies that follow the scope and requirements of the selective disclosure rules in Reg FD should not face undue impediments to mutually beneficial sharing of information regarding material cybersecurity incidents with third parties.
I recently saw a report quoting an OpenAI insider who estimates that there’s a 70% chance that artificial intelligence will destroy humanity. I guess that would worry me more if I didn’t put the odds of us doing that to ourselves without AI’s help at around 75% – and if the current iterations of AI didn’t have more in common with ’80s icon Max Headroom than with the HAL 9000 from “2001: A Space Odyssey.”
That being said, I’ve recently learned about one emerging use for AI that really does terrify me. Apparently, people are starting to use generative AI tools to prepare board minutes. A recent article in “The Boardroom Insider” flags this emerging practice, and this excerpt lays out some of the things that could go very wrong with relying on AI tools in this setting:
Potential downsides of this trend are apparent (and some are still to be realized). Recordings of board meetings are always a legal bomb waiting to go off. The more recording becomes a standard practice, the more likely someone will neglect to wipe all copies once minutes are finalized. While AI minuting apps note that their draft is only that — a draft for further human processing — what they retain and ignore can prove worrisome.
Further, once you get comfortable with letting AI do the minuting, you’re more likely to just send its digital take out for quick approval. AI “hallucinations” sneaking into the draft could be hard to spot. Finally, what if everyone on the board uses a recording to create their own AI summaries? This Tower of Babel approach could be a nightmare.
If you’re still willing to take the plunge, the article goes on to identify some AI tools that you might use to help generate board minutes. If you’re up for that, well, Godspeed! As for me, when it comes to the use of generative AI for board minutes, I’m firmly in Colonel Kurtz’s camp.
There’s some big Wu-Tang Clan-related crypto news that I’d like to share to close out the week. A few years ago, I blogged about how a “decentralized autonomous organization” or “DAO” named PleasrDAO had acquired the sole copy of the group’s legendary “Once Upon a Time in Shaolin” album that the feds grabbed from its original owner, fraudster Martin Shkreli.
PleasrDAO had big plans for the album, but those plans depended on its ability to persuade The RZA and Cilvaringz to sign off on them. That sign-off was necessary because RZA opted to impose a unique restriction on any owner of Once Upon a Time in Shaolin when the album was announced in 2015 – whoever bought it would not be able to release it until 2103, 88 years following its release.
Well, it looks like PleasrDAO was successful, because according to this Bloomberg BusinessWeek article, it recently sponsored swanky listening sessions in NYC where attendees could hear selections from the album while “sipping artisanal cocktails.” If you missed the New York sessions, don’t despair – all you have to do is travel to Australia to catch the sessions being held at a Tasmanian museum through June 24th (in case you’re on the fence, they’re getting rave reviews).
So why is PleasrDAO holding these events? Well, this is where the tenuous connection to the federal securities laws that allows me to periodically blog about The Wu-Tang Clan comes into play. Here’s an excerpt from the article:
Perhaps not surprisingly, these exclusive sessions have coincided with the start of a campaign to wring more money from the album. On June 13, PleasrDAO started selling digital ownership stakes in Once Upon a Time in Shaolin for $1, entitling buyers to a short sampler from the album along with an encrypted file of the music that will remain locked—but maybe not until 2103, as originally promised. The collective says each sale will reduce the time it takes to make the entire album publicly available by 88 seconds. In short, the decades-long restriction is more fungible than most people might have assumed.
The article says that PleasrDAO raised $250K selling these NFTs in just four days, and cites a NY Times report as indicating that it would need to raise a total of $28 million to release the album to the public. It looks like PleasrDAO has attempted to structure this NFT to avoid having it classified as a security. Of course, that’s what all the NFT folks have said – and the SEC hasn’t always agreed. So, PleasrDAO would be wise to take some advice from the Wu-Tang Clan and “watch your step, kid, watch your step, kid, protect ya neck, kid!”
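The $28 million figure is roughly consistent with the 88-seconds-per-dollar math. Here’s a back-of-the-envelope sketch (the mid-2024 starting point and average year length are my assumptions, not figures from the article):

```python
# Rough check of the ~$28 million unlock figure: the restriction runs
# until 2103, and each $1 NFT sale reportedly shaves 88 seconds off
# the wait. Assumes the campaign starts in 2024 (hypothetical).
SECONDS_PER_YEAR = 365.25 * 24 * 60 * 60  # average Gregorian year

years_remaining = 2103 - 2024             # ~79 years left on the lock
seconds_remaining = years_remaining * SECONDS_PER_YEAR
sales_needed = seconds_remaining / 88     # one $1 sale per 88 seconds

print(f"~${sales_needed / 1e6:.0f} million")  # prints "~$28 million"
```

At $1 per NFT, buying out the remaining lock works out to roughly 28 million sales – which lines up with the total the NY Times report cited.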
On Tuesday, the SEC announced an enforcement action against RR Donnelley & Sons arising out of alleged disclosure and internal controls violations associated with a series of cyber incidents occurring in November and December 2021 that resulted in a hacker obtaining information belonging to 29 of the company’s clients. This excerpt from the SEC’s press release explains the basis for the action:
According to the SEC’s order, data integrity and confidentiality were critically important to RRD’s business. Because client data was stored on RRD’s network, its information security personnel and the third-party service provider RRD hired were responsible for monitoring the network’s security. However, according to the order, RRD failed to design effective disclosure controls and procedures to report relevant cybersecurity information to management with the responsibility for making disclosure decisions, and failed to carefully assess and respond to alerts of unusual activity in a timely manner.
The order further finds that RRD failed to devise and maintain a system of cybersecurity-related internal accounting controls sufficient to provide reasonable assurances that access to RRD’s assets – its information technology systems and networks – was permitted only with management’s authorization.
Under the terms of the SEC’s order in the case, the company consented, on a neither-admit-nor-deny basis, to the entry of a C&D enjoining future violations of Exchange Act Section 13(b)(2)(B) and Rule 13a-15(a). In addition, the company agreed to pay a civil monetary penalty of $2.125 million.
In a dissenting statement, Commissioners Peirce and Uyeda again challenged the SEC’s use of Section 13(b)(2)(B) in a setting not involving accounting controls:
The Commission’s order faulting RRD’s internal accounting controls breaks new ground with its expansive interpretation of what constitutes an asset under Section 13(b)(2)(B)(iii). By treating RRD’s computer systems as an asset subject to the internal accounting controls provision, the Commission’s Order ignores the distinction between internal accounting controls and broader administrative controls. This distinction, however, is essential to understanding and upholding the proper limits of Section 13(b)(2)(B)’s requirements.
If this objection to an expansive interpretation of Section 13(b)(2)(B) sounds familiar, that’s because it’s one that these same two commissioners raised in response to two prior enforcement actions – the SEC’s 2020 enforcement action against Andeavor and its 2024 enforcement action against Charter Communications.
Here’s the final installment in our series of guest blogs on AI Related Disclosures by Orrick’s J.T. Ho, Bobby Bee and Hayden Goudy:
AI-related Business and MD&A Disclosure. Many companies in the S&P 500 mentioned AI in the Business or MD&A sections of their most recent 10-K, tying AI to their main products and services or to key business updates. While less common than an AI-related risk factor, 40% of the S&P 500 had an AI-related disclosure in the Business or MD&A sections of their most recent 10-K, an increase from 30% in the previous period.
AI-related disclosure in the Business or MD&A sections of the 10-K varied significantly by industry. For instance, 85% of companies in the information technology sector made an AI-related disclosure in the Business or MD&A sections, compared to 56% of companies in the financial sector and 38% of companies in health care.
As more companies adopt AI in their operations, products and services, we expect more references to AI in the Business and MD&A sections of 10-Ks across the S&P 500.
Limited Disclosure in the Proxy Statement. AI-related disclosure in the proxy statement across the S&P 500 was limited. While more than 39% of companies in the S&P 500 mentioned AI in their most recent proxy statement, a significant proportion of references were to new AI-related products or the role that AI was playing as part of a business transformation. Additionally, 24% of the S&P 500 disclosed director-level AI-related expertise or experience in their most recent proxy statement.
However, a much smaller percentage of companies in the S&P 500, approximately 9%, disclosed the role of the board or its committees in overseeing AI-related risks.
For companies that disclosed board or committee oversight, the allocation of that responsibility varied.
The most common approach was at the full board level – 18 companies in the S&P 500 disclosed a clear role for the full board in overseeing AI-related risks. The second most common approach was for the Audit Committee to oversee AI-related risk.
We’re really looking forward to returning to an in-person format for our upcoming Proxy Disclosure and Executive Compensation Conferences to be held in San Francisco on October 14th and 15th. Our agenda is always topical, and this year is no exception. Here’s a taste of what we have in store for you:
– If you’ve been following this week’s blogs on AI-related disclosure issues, you won’t want to miss our “Governing and Disclosing AI” panel.
– Our “Cyber Incidents: Handling Real Time Reporting” panel will offer insights to keep you out of the Division of Enforcement’s cross-hairs when it comes to cybersecurity issues.
– Our “Living with Clawbacks – What Have We Learned?” panel will bring you up to speed on how companies are adjusting to the clawback listing standards and the emerging issues they are encountering.
Of course, we’ll also have panels addressing the latest developments in shareholder activism, climate disclosure, key 10-K and proxy disclosures, perks, and navigating ISS & Glass Lewis. You’ll hear insights on proxy disclosure and executive comp hot topics from our “SEC All-Stars” and have the opportunity to listen to Dave interview Corp Fin Director Erik Gerding. As always, we’ll also have a little fun – this year, it’s in the form of a “Family Feud”-style “lightning round” game show that we think you’ll really enjoy.
We hope many of you will join us in San Francisco! Register by July 26th to lock in our “early bird” deal for individual in-person registrations ($1,750, discounted from the regular $2,195 rate). If traveling isn’t in the cards, we also offer a virtual option so you won’t miss out on the practical takeaways our speaker lineup will share. (Also check out our discounted rate options for groups of virtual attendees!) You can register now by visiting our online store or by calling us at 800-737-1271.
Yesterday, the SCOTUS granted a cert petition filed by NVIDIA seeking review of the 9th Circuit’s decision in E. Ohman J:Or Fonder AB v. NVIDIA Corp., (9th Cir.; 8/23), concerning the PSLRA’s heightened pleading requirements for allegations of falsity and scienter. In its cert petition, NVIDIA pointed out that plaintiffs often try to meet the PSLRA’s heightened pleading requirements for falsity & scienter by alleging that internal documents contradict a company’s public statements, and that the 9th Circuit’s ruling presented two questions that have divided the circuits concerning how the PSLRA’s pleading requirements apply in this “common and recurring context”:
1. Whether plaintiffs seeking to allege scienter under the PSLRA based on allegations about internal company documents must plead with particularity the contents of those documents.
2. Whether plaintiffs can satisfy the PSLRA’s falsity requirement by relying on an expert opinion to substitute for particularized allegations of fact.
NVIDIA went on to note that, with respect to the pleading requirement for alleging scienter based on internal documents that contradict public statements, five circuits have held that the statute requires plaintiffs to allege the contents of those documents with particularity, while two (now including the 9th) have held that plaintiffs may allege scienter “merely by hypothesizing about what those documents ‘would have’ said.” As to the falsity requirement, NVIDIA pointed out that two circuits have held that plaintiffs can’t satisfy the PSLRA’s pleading standards by substituting an expert opinion for particularized allegations of fact, so the 9th Circuit’s decision permitting plaintiffs to do that creates a split.
By the way, the case caption isn’t a typo, “E. Ohman J:Or Fonder AB” is the correct name of the lead plaintiff. For some odd reason, today is my day for blogs involving parties with names that look like typos to American eyes. Over on DealLawyers.com, I blogged about an EC investigation of a deal under the EU’s Foreign Subsidy Rule in which for some reason the regulators decided to abbreviate the name of Emirates Telecommunications Group Company PJSC as “(e&)”.
Here’s the second installment in our series of three guest blogs on AI Related Disclosures by Orrick’s J.T. Ho, Bobby Bee and Hayden Goudy:
Corporate Disclosure Trends. We identified AI as one of the fastest growing disclosure topics in SEC filings across the S&P 500, with a rapidly growing number of companies disclosing AI-related risk factors in the 10-K. However, disclosure of AI-related oversight at the board and management level in the proxy statement significantly lagged disclosure of AI-related risks in the 10-K.
Companies Disclosed AI-Related Risks More Often Than AI Oversight. We found a gap between the prevalence with which companies in the S&P 500 disclosed significant or material AI-related risks and the prevalence with which they disclosed board and committee oversight of those risks in the proxy statement. Together with growing investor and activist interest, we expect increasing pressure from a range of stakeholders on public companies to address this gap, including pressure to develop and disclose an approach to AI oversight at the board or committee level.
AI-related Risk Factors. The most common type of AI-related disclosure in SEC filings across the S&P 500 was an AI-related risk factor. Nearly 60% of the S&P 500 disclosed an AI-related risk factor in their most recent 10-K. This was a major increase from the previous reporting period, where only 16% of the S&P 500 disclosed an AI-related risk factor.
Most relevant risk factors in the S&P 500 were not focused solely on AI. Instead, we found that references to AI were generally integrated into existing risk factors. Companies included AI-related references into risk factors addressing:
– Cybersecurity risks, such as higher levels of exposure due to threat actors using AI, or a higher likelihood of a data breach due to the use of AI tools.
– Operational and business risks, such as higher costs from adopting AI technology or potential loss of market share from AI-driven disruption.
– Potential harm to the company brand and reputation from intellectual property disputes involving AI.
– Costs or risks associated with AI regulations.
The final installment of this series will address AI-related Business and MD&A disclosure, as well as practices regarding AI-related disclosures in proxy materials.