TheCorporateCounsel.net

November 3, 2023

Artificial Intelligence: Consider Your Third-Party Risks

AI has been especially prevalent in the news this week, following the Executive Order that President Biden issued on Monday (here’s the fact sheet). Among other things, the order gives broad leeway to federal agencies to set standards for the use of AI (e.g., the NIST framework) and for the protection of individual privacy. It’s not a stretch to think that this developing issue is on the SEC’s radar.

With that, here’s a good recap of the recent Securities Enforcement Forum from Holly Carr, who spent a decade in the SEC’s Enforcement Division and is now at BDO. On top of Dave’s recent reminder about cyber risks, this jumped out at me on the topic of AI:

On AI, companies should be assessing not just how their own use of AI, but how the use of AI by others, may expose their business to new or increased risks. For example, how are customers or vendors using AI in ways that may impact your organization’s risk profile?

As John noted a few weeks ago, we’re continuing to post practical governance & disclosure resources in our “Artificial Intelligence” Practice Area. And on the topic of SEC Enforcement, make sure to mark your calendars for our webcast – “SEC Enforcement: Priorities and Trends” – which is less than two weeks away, on November 15th at 2pm Eastern. We’ll hear from Hunton Andrews Kurth’s Scott Kimpel, Locke Lord’s Allison O’Neil, and Quinn Emanuel’s Kurt Wolfe about the Division’s priorities, the latest developments on “gatekeeper” scrutiny, the pros & cons of voluntary reporting & cooperation, and more. CLE credit is available!

Liz Dunshee