Ten Considerations for Developing an Effective Generative AI Use Policy
Here's the intro to our Client Update, which lays out 10 considerations for developing a generative AI use policy:
"This year's news has been full of stories about "generative" artificial intelligence (AI) applications. Generative AI tools create code, text, images, and other content in response to text prompts, queries, and other inputs. These tools have the potential to make research, writing, coding, graphic design, and other forms of content creation and manipulation much faster and easier. But as with other emerging technologies, the rapid, widespread adoption of these tools in a developing legal and regulatory environment can give rise to potential risks.
To manage these risks, many companies are adopting an acceptable use policy (AUP) governing their use of third-party generative AI tools, educating employees on that use, and monitoring initial use cases and the quality, legality, and accuracy of the outputs, particularly when generated content will be published or otherwise used publicly.
Crafting an appropriate AUP for generative AI is a process that requires careful consideration and collaboration across multiple departments. Each policy will be different, reflecting the company's business needs and culture, the nature of the intended uses of such tools, and the company's level of risk tolerance in light of its industry and the applicable evolving legal and regulatory landscape. Policies should be flexible because generative AI tools—and the laws and regulations governing them—are developing."
Public Chatter
Public Chatter provides practical guidance—and the latest developments—to those grappling with public company securities law and corporate governance issues, through content developed from an in-house perspective.