A new bill aimed at AI abuse in music (and beyond) has been introduced
It’s called the NO AI FRAUD Act. That’s actually an acronym standing for “No AI Fake Replicas And Unauthorized Duplications.” It was introduced in the US House of Representatives yesterday (January 10) by a bipartisan group of politicians (three Republicans and two Democrats).
It targets “abusive AI deepfakes, voice clones, and exploitive digital human impersonations” and promises to put safeguards and guardrails in place to protect both artists and public figures (and, presumably, regular people, too).
Heard this before? Yes, we have. Back in the fall, there was the NO FAKES Act (the “Nurture Originals, Foster Art, and Keep Entertainment Safe” Act). Both bills have received praise from the Recording Industry Association of America, Universal Music Group CEO Sir Lucian Grainge, and an artist lobby group called the Human Artistry Campaign.
Meanwhile, Tennessee is prepping its own brand new legislation focused on protecting musicians. It’s called the ELVIS Act (the Ensuring Likeness, Voice, and Image Security Act. Seriously). And over in the UK, there’s also legislation in the pipeline.
This is just the start of a new layer of standard protections that should benefit everyone involved, except people looking to use AI for illegal, immoral, and unethical purposes. But we’ll see, won’t we?
How do we keep AI from deleting tracks it doesn’t like from streaming services?
It starts with Oasis… and slowly moves on to your license & registration.