New Senate Bill Aims to Safeguard Content Creators from AI Misuse
A bipartisan group of senators has introduced a new bill to protect artists, songwriters, and journalists from unauthorized use of their content by AI. These creators are concerned about their work being used to train AI models or to generate AI content without their consent.
The Content Origin Protection and Integrity from Edited and Deepfaked Media Act, or COPIED Act, is at the heart of this legislative effort. It aims to make it harder for AI to misuse human-generated content and to make harmful deepfakes easier to detect. Key figures in the Senate emphasize the need for transparency and creator control over AI-generated content.
COPIED Act: Ensuring Content Protection
The COPIED Act aims to protect content created by artists, songwriters, and journalists. The bill would require companies to attach content provenance information to digital content within two years. That provenance information acts like a watermark, helping ensure that creators' rights are respected and that their work is not used to train AI models without consent.
Combating Deepfakes and AI Manipulation
One of the COPIED Act's key provisions tackles harmful deepfakes. The bill would direct the National Institute of Standards and Technology (NIST) to develop guidelines for detecting and watermarking synthetic content, making it easier to identify whether content has been altered or generated by AI. The rise of deepfakes has posed significant challenges, from spreading misinformation to violating personal rights.
A Bipartisan Effort for Content Rights
The bill is spearheaded by Senate Commerce Committee Chair Maria Cantwell (D-WA) and supported by Senators Martin Heinrich (D-NM) and Marsha Blackburn (R-TN). The bipartisan nature of the effort underscores the importance and urgency of the issue across political lines. Cantwell has emphasized the need for transparency in AI-generated content and for giving creators control over their work.
Support from Artists’ Groups
The COPIED Act has garnered support from several prominent artists’ groups, including SAG-AFTRA, the National Music Publishers’ Association, and the Songwriters Guild of America. These groups recognize the necessity of protecting creative content in the rapidly evolving digital landscape. Artists and journalists alike back the legislation as a vital measure to guard against unauthorized use and manipulation.
Broader Legislative Context
The introduction of the COPIED Act is part of a broader legislative movement to regulate AI technology. Just last month, Senator Ted Cruz proposed the Take It Down Act, aimed at holding social media platforms accountable for policing AI-generated deepfake porn. Meanwhile, Senate Majority Leader Chuck Schumer introduced a roadmap to address AI, which includes measures for national security and election integrity.
State-Level AI Regulations on the Rise
State legislatures are increasingly active in introducing AI-related bills. Axios reported earlier this year that such bills were being introduced at an average rate of 50 per week, with 407 bills across more than 40 states as of February. This surge in legislative activity reflects growing concerns about AI's impact and the need for structured guidelines.
Executive Orders on AI Safety
In addition to legislative efforts, executive actions have also been taken to address AI safety. President Joe Biden issued an executive order last October setting standards for AI safety and security. These standards require AI developers to share safety test results with the government before deploying their systems. However, former President Donald Trump has promised to repeal this order if re-elected.
NIST’s Role in Implementing Standards
The National Institute of Standards and Technology (NIST) will play a crucial role in implementing the COPIED Act. NIST is tasked with developing guidelines for content provenance information, watermarking, and synthetic content detection. These guidelines will help determine the origins of digital content and whether it has been altered by AI, providing a framework for content integrity.
Platform Accountability and Legal Recourse
The COPIED Act empowers content owners with the right to sue platforms that use their content without permission or tamper with content provenance information. This provision aims to hold platforms accountable and ensure that creators are compensated for their work. Such measures are seen as crucial steps in safeguarding intellectual property in the digital age.
A Step Toward Transparent AI Use
The COPIED Act represents a significant step toward transparency in AI use. By mandating content provenance and watermarking, the bill aims to put creators back in control of their digital content. This effort not only protects rights but also establishes norms for ethical AI practices, addressing the challenges posed by rapid technological advancements.
In conclusion, the proposed COPIED Act marks a meaningful stride in safeguarding the rights of content creators. By tackling AI misuse head-on, the bill aims to give creators transparency and control over their digital works.
This legislative effort also highlights the urgent need for clear guidelines and protections in the evolving AI landscape, with content provenance and watermarking serving as crucial steps toward that goal.