USA: A new law to protect digital identity from deepfakes | Turtles AI
The United States Congress recently reintroduced the bipartisan NO FAKES Act, aimed at combating the unauthorized use of AI-generated digital replicas, such as audio and video deepfakes. Backed by technology companies such as Google, OpenAI, Amazon, and YouTube, the bill seeks to give citizens control over their digital image and voice, addressing the challenges posed by the growing spread of synthetic content.
Key Points:
- The NO FAKES Act gives citizens the exclusive right to authorize the use of their voice and image in AI-generated digital content.
- The law includes exemptions for biographical, critical, and parodic works, to protect freedom of expression.
- YouTube has announced its support for the bill, implementing technologies to manage digital replicas.
- Blockchain-based technologies are being explored to authenticate and verify the origin of digital content, combating the spread of deepfakes.
The “NO FAKES Act” (Nurture Originals, Foster Art, and Keep Entertainment Safe Act), introduced by Senators Chris Coons (D-DE) and Marsha Blackburn (R-TN), seeks to establish a federal right of publicity, giving individuals control over the use of their digital identities. Those rights would pass to their heirs upon death for up to 70 years. The bill, supported by organizations including the Recording Academy and the Motion Picture Association, has been amended from its 2023 version to include specific exemptions that protect biographical, critical, and parodic works, addressing free speech concerns. YouTube has expressed its support for the measure, deploying technologies to manage digital replicas and working with agencies like the Creative Artists Agency to help creators detect and request removal of unauthorized content.
Alongside legislative efforts, blockchain technology is emerging as a tool to authenticate and verify the origin of digital content. Solutions like Amber Authenticate use public blockchains to record cryptographic hashes of videos, so that any subsequent alteration can be detected. Startups like OpenOrigins provide blockchain-based platforms for newsrooms to authenticate content in real time. Large companies like Fox Corp are also adopting similar technologies to track the provenance of their online content. These initiatives aim to combat the spread of deepfakes by providing tools to verify the authenticity of digital content and protect the integrity of information.
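To illustrate the underlying idea, here is a minimal Python sketch of hash-based provenance checking in the spirit of such systems. It is not Amber Authenticate's or OpenOrigins' actual API: the file name, function names, and the assumption that the original digest has already been recorded somewhere tamper-evident (for example, on a public blockchain) are all hypothetical; the sketch only models the hashing and comparison steps.

```python
import hashlib

def fingerprint(video_path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a video file, read in chunks."""
    digest = hashlib.sha256()
    with open(video_path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify(video_path: str, recorded_digest: str) -> bool:
    """Check whether a file still matches the digest recorded at publication."""
    return fingerprint(video_path) == recorded_digest

if __name__ == "__main__":
    # Hypothetical file name; in practice the digest would be anchored on a
    # public blockchain or another append-only record at publication time.
    recorded = fingerprint("press_briefing.mp4")
    print("Recorded digest:", recorded)
    # Any re-encoding, trimming, or deepfake manipulation changes the digest,
    # so verification against the recorded value fails.
    print("Unaltered?", verify("press_briefing.mp4", recorded))
```

The design choice here is that the blockchain is only used as a tamper-evident ledger for the digest; the video itself never needs to be published on-chain for the integrity check to work.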
As the technology advances rapidly, combining legislative measures with technological solutions offers a proactive approach to addressing the challenges posed by deepfakes and protecting individual rights in the digital age.