How U.S. Prosecutors Will Weigh Companies’ AI Use

The U.S. Department of Justice has updated its guidance for prosecutors, with significant emphasis on probing companies' AI use
Var Shankar
30 Oct 2024

On September 23, 2024, the U.S. Department of Justice (DOJ) updated its guidance for prosecutors on how to evaluate corporate compliance programs. Nicole M. Argentieri, a senior DOJ official, unveiled the updated guidance during remarks at the Society of Corporate Compliance and Ethics 23rd Annual Compliance & Ethics Institute in Texas. The guidance now includes considerations of how companies use AI and manage its risks.

What is the purpose of the guidance?

The DOJ publishes its Evaluation of Corporate Compliance Programs (ECCP) guidance to help prosecutors assess whether a company’s compliance program is a) well designed, b) being applied earnestly and in good faith, and c) working in practice.

Though the ECCP is written for DOJ prosecutors and its considerations are not legally binding, it has been publicly available for several years and has served as useful guidance for executives and compliance teams at companies.

While the revised ECCP touches on many new topics, the most significant revisions concern AI, emerging technologies, the role of data analytics in compliance programs, and whistleblower protections.

What does the revised guidance say about AI use?

The ECCP guides prosecutors to evaluate the following factors related to companies’ AI use, among others (a short illustrative sketch of how a company might track these factors follows the list): 

·  Does the company have in place a risk management program for AI use?

·  Are risk levels assigned to AI systems?

·  Does the company effectively address the risks it identifies from AI use?

·  Does the company monitor its AI systems to ensure that they are functioning as intended?

·  Are high-risk AI systems subject to appropriate human oversight?

·  Are employees appropriately trained to use AI systems?

·  Does the company’s compliance program appropriately leverage data analytics and technology to ensure that it is functioning effectively?

·  Does the company effectively address vendor risk?

·  Has the company dedicated appropriate resources to its compliance program?

·  Is the AI compliance program appropriately documented?
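
The ECCP does not prescribe any particular tooling, but many of the questions above map naturally onto an internal AI system inventory. The Python sketch below is purely illustrative and is not drawn from the DOJ guidance: the field names, risk tiers, and the example "InvoiceApprovalModel" entry are hypothetical, and real programs would capture far more detail.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum
from typing import Optional


class RiskTier(Enum):
    """Hypothetical risk levels a company might assign to its AI systems."""
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"


@dataclass
class AISystemRecord:
    """One entry in an illustrative AI system inventory, loosely mirroring
    the factors the revised ECCP asks prosecutors to consider."""
    name: str
    owner: str                        # accountable business owner
    risk_tier: RiskTier               # "Are risk levels assigned to AI systems?"
    intended_use: str                 # baseline for monitoring against intended function
    human_oversight: bool             # "Are high-risk systems subject to human oversight?"
    vendor: Optional[str] = None      # vendor risk, if the system is externally sourced
    last_monitoring_review: Optional[date] = None
    training_completed: bool = False  # employee training on the system
    identified_risks: list[str] = field(default_factory=list)
    mitigations: list[str] = field(default_factory=list)


# Hypothetical example entry: a generative-AI tool used in an approvals workflow.
example = AISystemRecord(
    name="InvoiceApprovalModel",
    owner="Finance Compliance",
    risk_tier=RiskTier.HIGH,
    intended_use="Flag anomalous invoices for human review",
    human_oversight=True,
    vendor="Acme AI Inc.",  # hypothetical vendor
    last_monitoring_review=date(2024, 9, 1),
    training_completed=True,
    identified_risks=["AI-generated false approvals or documentation"],
    mitigations=["Dual human sign-off on approvals above a set threshold"],
)


def needs_attention(record: AISystemRecord) -> bool:
    """Simple check a compliance team might run: high-risk systems must have
    human oversight and a monitoring review on record."""
    return record.risk_tier is RiskTier.HIGH and (
        not record.human_oversight or record.last_monitoring_review is None
    )


print(needs_attention(example))  # False for the example above
```

Even a lightweight register like this gives a compliance team something concrete to point to when asked whether risk levels are assigned, monitoring is occurring, and the program is documented.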

What is the broader context?

The revised guidance signals continued DOJ interest in company use of AI. In a March speech in San Francisco, Deputy Attorney General Lisa Monaco stressed that there is no AI exemption to existing laws, announced an intention to pursue “stiffer sentences” for AI abuse and previewed ECCP revisions.

Earlier this year, the DOJ also launched its Justice AI Initiative, a series of convenings of “stakeholders across civil society, industry, academia, and law enforcement” to inform how AI will intersect with the DOJ’s efforts.

Argentieri’s remarks on the revised ECCP added further color to the DOJ’s revisions and demonstrated a nuanced understanding of the potential impacts of generative AI on companies. As an example, she noted that “prosecutors will consider whether the company is vulnerable to criminal schemes enabled by new technology, such as false approvals and documentation generated by AI.”

Enzai is here to help

Enzai’s AI GRC platform can help your company deploy AI in accordance with best practices and emerging regulations, standards and frameworks, such as the EU AI Act, the Colorado AI Act, the NIST AI RMF and ISO/IEC 42001. To learn more, get in touch here.
