Own What You Create: Copyright Protection for AI-Assisted Works (Part III)
Part III – Contracting and Licensing
For AI-assisted works, authorship and registration are only the beginning. The real leverage (and exposure) often lies in the contracts that govern how those works are created, what rights are acquired, and what obligations are imposed.
Case in point: just last week, Hachette Book Group pulled a novel from distribution after allegations circulated widely online that large portions of the novel were AI-generated.[1] The author disputes having personally used AI, but states that an editor she hired may have.[2] This event is a useful illustration of the risks faced by creators and licensees alike when commercializing an AI-assisted asset. This article details best practices at each stage of the licensing process to mitigate these risks. It also explains how the principles addressed in Parts I and II translate into the agreements companies use to license creative assets, engage outside vendors, and manage employee IP, and how these agreements can significantly affect ownership, control, and risk.
AI Platform Agreements
Before licensing or enforcing an AI-assisted work, content owners and creators should understand what rights and obligations they have under the governing platform terms, which may vary significantly from one platform to another, and between consumer versus enterprise subscriptions.
For example, OpenAI’s consumer terms provide that users own the output, reinforced by an assignment of any residual rights: “As between you and OpenAI, and to the extent permitted by applicable law, you (a) retain your ownership rights in Input and (b) own the Output. We hereby assign to you all our right, title, and interest, if any, in and to Output.”[3] Midjourney’s terms are similar, but with notable exceptions. Specifically, companies, or employees of companies, with more than $1 million in annual revenue must be subscribed to a “Pro” or “Mega” plan to own the output, and outputs are publicly shared absent paid privacy features.[4] A content owner seeking to enforce exclusive rights in AI-assisted creative work should therefore confirm the rights granted under the applicable terms before engaging in any licensing discussions.
Platform agreements also govern indemnity rights and obligations for creators that use the platform, which may vary significantly by provider and tier. Many platforms offer some level of indemnity rights for enterprise customers. For example, Adobe’s Generative AI terms offer limited indemnification for Firefly outputs, but only for certain enterprise-based plans.[5] Likewise, OpenAI’s and Anthropic’s service terms for commercial customers offer indemnification against third-party claims, subject to certain limitations and exclusions.[6] By contrast, consumer-tier and standard paid plans typically provide no indemnity and instead require users to indemnify the AI platform.[7]
The documentation and registration guidelines addressed in Parts I and II of this series concern what content creators can protect under the law. The platform agreement establishes what creators actually own under a contract. Both questions need answers before any licensing conversation can meaningfully begin.
Licensing AI-Assisted Works
Understanding what a creator owns under the platform agreement is only the beginning of the review process for transactions involving AI-assisted works. As Part I discussed, copyright protection for AI-assisted works turns on the degree of human creative control over the expressive elements of the work.[8] A work in which AI generated the core expressive elements—images, text, audio—may be protectable only as a compilation, and only to the extent that human creativity was expressed through the selection and arrangement of those AI-generated elements. This is a more limited scope than what some licensors might believe and presents different legal and business questions than a work created entirely by human authors.
Before any negotiations take place, content creators should conduct a chain of title review for the asset in question to understand: (1) what was created by humans, (2) what was generated by AI, (3) whether the work has been registered, and (4) whether the registration accurately reflects the scope of the human-authored content.
On the contract side, licensors should review their agreements to ensure that any representation about ownership or validity is consistent with the applicable platform terms (as discussed above) and how copyright law applies to AI-generated and AI-assisted works. Contractual terms defining IP are often drafted broadly enough to cover all IP rights the licensor holds in the property without specific representations about the protectability of individual creative elements. But where agreements do contain representations about the validity or enforceability of that IP, a licensor would benefit from reviewing those terms with counsel and ensuring that they accurately reflect how these works were created and registered. This includes understanding and documenting any employee or third-party contributions to the work that incorporate AI-generated content (as further discussed below).
Licensees, on the other hand, should approach licensing AI-assisted properties with a few questions in mind. First, what is the value of the property given the scope of the copyright? If significant creative elements were AI-generated, the licensor’s copyright may be narrower than the overall asset suggests, even if the asset has substantial market value. Second, has the work been registered, and does the registration accurately reflect the human-authored content? An inaccurate or incomplete registration creates vulnerability that could affect the licensee’s ability to enforce against third-party infringers. Third, how does the agreement allocate AI-related risk more broadly? In addition to questions about the scope of copyright protection, a licensee may also want the licensor to address whether the work was created using tools or source material that could create separate infringement disputes. Finally, in higher-value transactions, a licensee may want to request documentation of how the licensed works were created as part of diligence before closing. The Hachette example is instructive. Although Hachette’s contracts required authors to disclose AI use, the purported use of AI by a third-party editor hired by the author was apparently not disclosed.[9] To mitigate this risk, licensee requests for disclosure of AI use in the creative process should expressly cover all parties that contributed to the work.
Employment Agreements
Work created by an employee within the scope of employment is typically owned by the employer under the work-for-hire doctrine. But ownership itself does not resolve the scope of what is protected. If the employee’s AI use involves limited human expressive contributions, the copyright in the resulting work may be thin or nonexistent, regardless of who owns it. Without visibility into how employees are using AI in the creative process, a company may be unable to assess the copyright status of its own assets and make accurate representations in licensing transactions.
On the contract side, employers should review their employment agreements with counsel to confirm that IP assignment provisions capture AI-assisted work product and do not contain carveouts or limitations that could exclude portions of AI-assisted work product.
More practically, companies should ensure that they have an AI use policy in place that is tailored to the particular platform(s) approved by the employer and reflects the employer's knowledge of the rights and obligations under the governing platform terms. Employers should then require employees to acknowledge and comply with the company’s AI use policy as part of their employment agreement or onboarding.
Works for Hire
Companies that rely on vendors and freelancers for creative work should understand the use of AI in the third-party workflow and the scope of any rights they are acquiring.
When a company commissions original creative work from a third party, standard work-for-hire language assigns to the company whatever copyright the third party holds in the work product.[10] That language works well for fully human-authored work. For AI-assisted work, however, the more important question is whether there is sufficient human authorship in the work product to support a copyright claim at all, and if so, whether the third party’s human contribution is adequately documented. Otherwise, a company that receives AI-generated creative assets under a work-for-hire agreement may find out too late that the copyright it assumed it acquired is thin or nonexistent.
Work-for-hire agreements should therefore include a requirement that the third party disclose whether and how AI tools were used in producing the work product. This can be done by requiring the vendor to disclose, prior to the execution of any agreement, whether its workflow will involve AI, and if so, which platform, and how. The goal is to ensure the company has the information it needs to assess what it is acquiring. Understanding the degree of AI involvement in the work product also informs the company’s negotiating position on representations and warranties. A company that knows how the work was created is better positioned to assess what representations it can reasonably require, and what risk it may need to allocate through other means.
Service Provider Agreements
When a service provider (whether a vendor or freelancer) is engaged not to create original work but to further develop, refine, or process existing assets—post-production work, format conversion, audio mastering—the question is not about ownership of the underlying work, but whether the service provider’s AI use introduces anything that could affect the copyright analysis for the resulting asset. As Part I discussed, AI contributions that are “de minimis” (those that would not be independently copyrightable if performed by a human) do not affect the copyright in the underlying work. As a best practice, however, companies managing core creative assets should maintain a record of whether any part of the service provider’s contribution was generated using AI. This can be addressed by requiring disclosure of AI use during contracting, requiring pre-approval of AI-related tools and workflows, and preserving version history or change logs where feasible. Again, the Hachette example illustrates the importance of having visibility into the full creative chain. It is unclear whether the third-party editor in that case used AI only for “de minimis” purposes (e.g., flagging grammatical and punctuation errors) as opposed to more creative purposes (e.g., writing entire passages of the manuscript), though the cancellation of the book suggests the latter. These differing uses present different legal implications. But whatever the facts are, the importance of documenting the use of AI at all stages of the creative process—whether by the creator, employees, or third parties—cannot be overstated.
Confidentiality
The confidentiality risk associated with AI tools is distinct from the copyright questions addressed elsewhere in this article, but it cuts across all the relationships discussed above. When any of these parties use a third-party AI tool, they may be inputting proprietary content, such as unreleased creative work, brand assets, source code, and business information, into systems operated by external providers. Depending on the provider’s terms, that content may be retained or used to train the model, placing confidential information outside the company’s control without its knowledge.
Companies can mitigate this risk through an AI use policy that restricts use of AI to approved platforms, requires appropriate privacy settings (including disabling training where available), and prohibits submission of confidential materials without prior authorization. For employees, these restrictions can be implemented through employment agreements or onboarding as a condition of employment. For vendors and freelancers, they can be incorporated into the applicable agreement.
Conclusion
AI-assisted works can be marketed, licensed, and monetized like human-created works. But as discussed throughout this series, the legal foundation is often different. The value of these assets depends not just on what is created, but on how it is created, protected, and contracted.
Doing that work at the outset—documenting human authorship, timely registering works, and aligning contracts with the realities of AI-assisted creation—will determine whether an asset can be confidently commercialized. The law governing AI-assisted works will continue to evolve, but content owners who build that foundation now will be best positioned to enforce, license, and build on what they create.
Contracting and Licensing: Key Action Items
AI Platform Agreements
☐ Identify the platform(s) used to create commercially significant AI-assisted assets and locate the governing terms in effect at the time of creation.
☐ Confirm ownership structure, e.g., whether rights were assigned or licensed, and whether any revenue thresholds, plan tiers, or default visibility settings affect the scope of ownership.
☐ Review indemnification provisions to understand what protections, if any, the platform provides and what obligations run from the content owner to the platform.
Licensing
☐ Conduct a chain of title review before licensing: identify what was human-authored, what was AI-generated, and whether the registration accurately reflects the human-authored scope.
☐ During licensing, review representations about ownership or validity and confirm they are consistent with how the work was created and registered.
☐ Request or provide documentation of the creative workflow as part of pre-closing diligence.
Employees
☐ Review IP assignment provisions in employment agreements to confirm they capture AI-assisted work product without unintended carveouts.
☐ Establish an AI use policy and require employees to acknowledge and comply with it as a condition of employment or onboarding.
Work for Hire/Service Providers
☐ Require disclosure of AI tool use, including which platform and how, and compliance with AI use policies, prior to execution of any agreement for creative work.
☐ For processing or post-production service providers, require disclosure of AI use and maintenance of version history/change logs for core creative assets.
☐ Confirm that work-for-hire language in service provider agreements captures AI-assisted work product and that any representations about ownership are consistent with the scope of human authorship.
Confidentiality
☐ Ensure AI use policy addresses confidentiality issues by limiting use to appropriate platforms, requiring appropriate privacy settings, and prohibiting submission of confidential materials without authorization.
☐ Incorporate AI confidentiality requirements into vendor agreements and employment agreements.
Chieh Tung is a litigator representing companies in copyright, trademark, and business disputes. She writes about developments at the intersection of AI and intellectual property.
***
[1] https://www.nytimes.com/2026/03/19/books/shy-girl-book-ai.html
[2] Id.
[3] https://openai.com/policies/row-terms-of-use/ (effective Jan. 1, 2026) (“As between you and OpenAI, and to the extent permitted by applicable law, you (a) retain your ownership rights in Input and (b) own the Output. We hereby assign to you all our right, title, and interest, if any, in and to Output.”).
[4] https://docs.midjourney.com/hc/en-us/articles/32083055291277-Terms-of-Service (effective Feb. 12, 2026), § 4.
[5] Adobe Generative AI Product Specific Terms (effective June 17, 2025), §§ 8.1–8.4.
[6] https://openai.com/policies/service-terms/ (effective Jan. 9, 2026), §§ 1, 3; https://www.anthropic.com/legal/commercial-terms (effective June 16, 2025), § K.
[7] Supra note 3 (OpenAI Terms) (“If you are a business or organization, to the extent permitted by law, you will indemnify and hold harmless us. . .”); supra note 4 (Midjourney Terms), § 10 (“To the extent permitted by law, you will indemnify and hold us harmless. . . from third party claims arising out of or relating to your use of the Services and Assets or any violation of these Terms”); https://www.anthropic.com/terms, § 11 (“YOU AGREE TO INDEMNIFY AND HOLD HARMLESS THE ANTHROPIC PARTIES FROM AND AGAINST ANY AND ALL LIABILITIES . . . ARISING OUT OF OR RELATED TO . . . YOUR ACCESS TO, USE OF, OR ALLEGED USE OF THE SERVICES”).
[8] See Parts I and II of this series for a full discussion of the human authorship requirement and registration mechanics.
[9] See supra note 1.
[10] 17 U.S.C. § 101 (defining “work made for hire” to include works prepared by an employee within the scope of employment, and certain categories of commissioned works where the parties expressly agree in a written instrument signed by them that the work shall be considered a work made for hire).