In an age where digital manipulation has become increasingly sophisticated, the problem of non-consensual deepfakes has reached alarming proportions. This issue came into sharp focus earlier this year when explicit AI-generated images of popular musician Taylor Swift surfaced on X (formerly Twitter). The incident not only sparked widespread concern but also led to urgent calls for the US Senate to enact new legislation addressing the growing menace of deepfake abuse. In response, senators have introduced the COPIED Act (the Content Origin Protection and Integrity from Edited and Deepfaked Media Act), which aims to safeguard the rights of content creators and enhance transparency in the use of artificial intelligence.

Understanding the Act and Its Implications
Presently, AI developers rely on the fair use doctrine to argue that journalistic content may be used to train AI models without permission or compensation, a position the courts have not definitively resolved. The proposed COPIED Act, which has garnered support from major media organizations such as the News/Media Alliance and the National Newspaper Association, seeks to change this by expressly acknowledging the rights of content creators. This legislation represents a significant step forward in protecting the intellectual property of those who produce original content.
The Act emphasizes several critical areas where Congress believes improvements are needed:
- A lack of visibility into how AI systems operate.
- Insufficient transparency regarding the data used to train these systems.
- The absence of consensus-based standards and practices for developing and deploying such systems.
Moreover, the act acknowledges that current deficiencies in identifying the nature, origins, and authenticity of digital content have negatively impacted the public, as well as journalists, publishers, broadcasters, and artists. These groups’ content is often used to train AI systems, which can then be manipulated to create synthetic or synthetically-modified content. Important highlights of the bill include:
Key Definitions
- Artificial Intelligence
This term has the same meaning as given in section 5002 of the National Artificial Intelligence Initiative Act of 2020.
- Covered Content
The term “covered content” means a digital representation, such as text, an image, or audio or video content, of any work of authorship described in section 102 of title 17, United States Code.
- Synthetic Content
This term refers to information, such as images, videos, audio clips, and text, that has been wholly generated by algorithms, including artificial intelligence.
- Synthetically-Modified Content
This term defines information, including works of human authorship such as images, videos, audio clips, and text, that has been significantly modified by algorithms, including artificial intelligence.
- Deepfake
This term refers to synthetic content or synthetically-modified content that:
- Appears authentic to a reasonable person; and
- Creates a false understanding or impression.
- Covered Platform
The term “covered platform” refers to a website, internet application, or mobile application available to users in the United States, including a social networking site, video sharing service, search engine, or content aggregation service, that either:
- Generates at least $50,000,000 in annual revenue; or
- Had at least 25,000,000 monthly active users for not fewer than 3 of the 12 months immediately preceding any conduct by the covered platform in violation of this legislation.
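To make the two-pronged definition concrete, the statutory test can be sketched as a simple check. This is purely illustrative (not legal advice); the function name and input shape are hypothetical, but the thresholds are taken from the bill as described above.

```python
# Illustrative sketch of the COPIED Act's "covered platform" test.
# Thresholds come from the bill; everything else is hypothetical.

REVENUE_THRESHOLD = 50_000_000   # at least $50M in annual revenue
MAU_THRESHOLD = 25_000_000       # at least 25M monthly active users
MONTHS_REQUIRED = 3              # in not fewer than 3 of the preceding 12 months

def is_covered_platform(annual_revenue: int, monthly_active_users: list[int]) -> bool:
    """Return True if either statutory prong is satisfied.

    monthly_active_users: MAU figures for the 12 months immediately
    preceding the conduct in question (hypothetical input shape).
    """
    revenue_prong = annual_revenue >= REVENUE_THRESHOLD
    qualifying_months = sum(
        1 for mau in monthly_active_users[-12:] if mau >= MAU_THRESHOLD
    )
    mau_prong = qualifying_months >= MONTHS_REQUIRED
    return revenue_prong or mau_prong

# A platform with only $10M in revenue but 30M MAU in 4 recent months
# would still qualify under the user prong.
print(is_covered_platform(10_000_000, [30_000_000] * 4 + [20_000_000] * 8))  # True
```

Note that the prongs are disjunctive: crossing either the revenue bar or the user bar brings a platform within the Act's scope.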
Duties of the National Institute of Standards and Technology (NIST)
Under the COPIED Act, the National Institute of Standards and Technology (NIST) is tasked with a crucial role in establishing a public-private partnership, in collaboration with the U.S. Patent and Trademark Office (USPTO) and the U.S. Copyright Office. This partnership aims to develop standards for content provenance information technologies and methods for detecting synthetic and synthetically-modified content. Key initiatives outlined in the Act include:
Public Education Campaign: Within one year of the Act’s enactment, NIST will launch a public education campaign, in consultation with the Register of Copyrights and the Director of the U.S. Patent and Trademark Office. This campaign will focus on raising awareness about synthetic and synthetically-modified content (including deepfakes), watermarking, and content provenance information.
Development of Guidelines and Standards: NIST will work on creating guidelines, voluntary consensus-based standards, and best practices for watermarking and content provenance information. This includes addressing the detection of synthetic and synthetically-modified content across various media types, such as images, audio, video, and text. The initiative also covers the use of data to train AI systems and other aspects related to the transparency of synthetic media.
Evaluation and Assessment Tools: NIST will develop guidelines, metrics, and practices for evaluating and assessing tools designed to detect and label synthetic, synthetically-modified, and non-synthetic content. This includes activities like AI red-teaming and blue-teaming to ensure the robustness and reliability of detection tools.
Grand Challenges and Prizes: In coordination with the Defense Advanced Research Projects Agency (DARPA) and the National Science Foundation (NSF), NIST will establish grand challenges and prize competitions. These initiatives aim to enhance the detection and labeling of synthetic and synthetically-modified content and to develop cybersecurity measures to protect detection tools, watermarks, and content provenance information from tampering.
Research Initiatives: NIST is also responsible for conducting research to advance measurement science, standards, and testing related to the effectiveness and resilience of technologies used for detecting synthetic content, watermarking, and content provenance information. This research will also focus on cybersecurity protections and other countermeasures to prevent tampering with these technologies.
Requirements for Content Provenance Information and Prohibited Acts
- Content Provenance Information:
• Synthetic and Synthetically-Modified Content: Two years after the enactment of the COPIED Act, any commercial entity offering tools primarily for creating synthetic or synthetically-modified content in interstate commerce must:
A) Provide users with the ability to include content provenance information, indicating the content is synthetic or synthetically-modified, following the standards established under section 4 of the Act.
B) Implement reasonable security measures to ensure that this information is machine-readable and not easily removable, alterable, or separable from the content, where technically feasible.
• Covered Content: Similarly, commercial entities offering tools for creating or substantially modifying covered content must:
A) Offer users the option to include content provenance information for such content.
B) Ensure that this information is machine-readable and protected against removal, alteration, or separation, where technically feasible.
- Removal of Content Provenance Information:
• General Prohibition: It is unlawful for any person to knowingly remove, alter, tamper with, or disable content provenance information to engage in an unfair or deceptive act or practice in commerce.
• Exception for Security Research: A covered platform is exempt from liability if it removes or alters this information solely for necessary, proportionate, and limited security research purposes.
- Prohibition on Non-Consensual Use of Covered Content:
The Act makes it unlawful to use covered content that carries attached or associated content provenance information, or from which such information has been removed, to train AI systems or to generate synthetic content without the express, informed consent of the content owner. Users must also comply with the owner’s terms of use, including any compensation requirements set by the copyright owner.
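The requirements above turn on provenance information being machine-readable and not easily removable or alterable without detection. As a minimal sketch of how engineers might approximate that property, the example below bundles content with a JSON manifest and an HMAC signature so that stripping or editing the provenance record becomes detectable. The manifest fields and key handling are hypothetical; the Act does not mandate any particular format, and real systems would use standards-based manifests and managed signing keys.

```python
# Illustrative sketch only: machine-readable, tamper-evident provenance.
# Fields and key handling are hypothetical, not drawn from any mandated standard.
import hashlib
import hmac
import json

SIGNING_KEY = b"demo-key"  # hypothetical; production systems would use managed keys

def attach_provenance(content: bytes, creator: str, synthetic: bool) -> dict:
    """Build a machine-readable provenance manifest for the content.

    An HMAC over the content hash and manifest fields makes removal or
    alteration of the provenance record detectable on verification.
    """
    manifest = {
        "creator": creator,
        "synthetic": synthetic,
        "content_sha256": hashlib.sha256(content).hexdigest(),
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return manifest

def verify_provenance(content: bytes, manifest: dict) -> bool:
    """Return True only if the manifest is intact and matches the content."""
    claimed = dict(manifest)
    signature = claimed.pop("signature", "")
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(signature, expected)
            and claimed.get("content_sha256") == hashlib.sha256(content).hexdigest())
```

With this scheme, flipping the `synthetic` flag or swapping in different content invalidates the signature, which is the kind of tamper resistance the Act's "not easily removable or alterable" language appears to contemplate.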
Enforcement and Support
The Federal Trade Commission (FTC) is empowered to enforce the Act, treating any violation of the Act or its implementing regulations as a violation of a rule defining an unfair or deceptive act or practice under the Federal Trade Commission Act. In addition, a state Attorney General may bring a civil action on behalf of the state’s residents, as parens patriae, upon a belief that the residents’ interests have been or are threatened or adversely affected by a practice that violates the legislation.
The Act has received widespread support from industry leaders across different arenas. “The COPIED Act will put creators, including local journalists, artists, and musicians, back in control of their content with a provenance and watermark process that is very much needed,” said US Senator Maria Cantwell. An official press release emphasized that these measures will allow content owners to protect their work and set terms for its use, including seeking compensation. The bill has garnered broad support from labor unions and industry associations, including SAG-AFTRA, the National Music Publishers’ Association, the Songwriters Guild of America, and the National Association of Broadcasters.
Our Takeaway
The COPIED Act represents a landmark effort to address the complex challenges posed by AI and deepfake technologies. By requiring transparency and giving content creators control over their work, the Act aims to protect the integrity of digital content and the rights of its creators. As AI technology continues to evolve, the need for robust legislation to safeguard against misuse becomes increasingly crucial. The COPIED Act not only aims to fill these gaps but also sets a precedent for future regulatory frameworks.
This legislation challenges us to consider the ethical implications of AI and the importance of maintaining accountability in an increasingly digital world. The Act’s provisions for content provenance and watermarking are designed to empower creators and protect consumers, ensuring that digital interactions remain authentic and trustworthy.
Looking for guidance on your AI implementation journey?
Connect with Ajay Mago or any member of EM3’s Artificial Intelligence practice for professional support.

Ajay Mago, Managing Partner at Maxson Mago & Macaulay, LLP (EM3 Law LLP).
Disclaimer: This publication is for information purposes only and should not be construed as legal advice or a substitute for legal counsel. This information is not intended to create an attorney-client relationship. Do not send us any unsolicited confidential information unless and until a formal attorney-client relationship has been established. EM3 Law is under no duty of confidentiality to persons sending unsolicited messages, e-mails, mail, facsimiles and/or any other information by any other means to our firm or attorneys prior to the formal establishment of such relationship. The views and opinions expressed herein are those of the author(s) and do not necessarily reflect the views of the firm.
