Wednesday, June 12, 2024

Warner Music Group boss Robert Kyncl backs US Senate bill to crack down on AI deepfakes

Warner Music Group (WMG) CEO Robert Kyncl has come out in favor of a US Senate bill that would crack down on unauthorized deepfakes, arguing that the technology could ruin lives, reputations and businesses if left unchecked.

Although many in the music business are embracing AI, it’s also the case that “generative AI is appropriating artists’ identities and producing deepfakes that depict people doing, saying, or singing things that never happened,” Kyncl will tell a subcommittee of the Senate Judiciary Committee on Tuesday (April 30).

“Through AI, it is very easy for someone to impersonate me and cause all manner of havoc,” Kyncl said, according to prepared remarks shared with MBW.

“They could speak to an artist in a way that could destroy our relationship. They could say untrue things about our publicly traded company to the media that would damage our business.”


Kyncl will appear before the committee on Tuesday to give his and WMG’s backing for the Nurture Originals, Foster Art and Keep Entertainment Safe (NO FAKES) Act, a proposed piece of legislation that would protect the voice and visual likeness of all individuals against unauthorized use in AI-generated deepfakes.

Though the wording of the bill hasn’t yet been finalized, a “discussion draft” of it was circulated in the Senate last October by Democratic Sens. Christopher Coons of Delaware and Amy Klobuchar of Minnesota, and Republican Sens. Marsha Blackburn of Tennessee and Thom Tillis of North Carolina.

The proposed bill aims to “hold individuals or companies liable if they produce an unauthorized digital replica of an individual in a performance” and “hold platforms liable for hosting an unauthorized digital replica, if the platform has knowledge of the fact that the replica was not authorized by the individual depicted.”

It also aims to carve out exceptions “based on recognized First Amendment [freedom of speech] protections.”

A bill with similar aims was introduced in the US House of Representatives this past January. The No AI FRAUD Act establishes “an intellectual property right that every individual holds over their own likeness and voice,” allows individuals to seek monetary damages for harmful, unauthorized uses of their likeness or voice, and “guards against sexually exploitative deepfakes and child sexual abuse material.”

That bill – also a bipartisan effort, like the NO FAKES Act – quickly garnered the support of major players in the music industry, including the Recording Industry Association of America (RIAA) and Universal Music Group (UMG). It also has the support of the Human Artistry Campaign, which seeks to protect creators’ works and livelihoods while advocating for the responsible development of AI.

Separately, the state of Tennessee has passed into law the ELVIS Act, which updates the state’s right of publicity to protect the voices of songwriters, performers, and other music industry professionals from the misuse of artificial intelligence (AI).

These legislative efforts have come in the wake of a growing number of controversies involving unauthorized deepfakes, within the music industry, in the broader entertainment industry, and in society as a whole.

“The truth is everyone is vulnerable – families defrauded by voice clones pretending to be relatives; people placed in pornography without their consent; school children having their faces inserted into humiliating scenes.”

Robert Kyncl, Warner Music Group

Last year the music business was rocked by the appearance of an unauthorized AI-generated track “performed” by Drake and The Weeknd, which quickly became the subject of takedown notices from UMG, the label to which both artists are signed.

Months later, veteran actor Tom Hanks warned that his likeness was being used without permission in an ad selling a dental plan.

And outrage has arisen over incidents of people’s likenesses being used without permission in AI-generated pornographic images, which have sometimes had underage victims.

Recently, X (formerly Twitter) was forced to take action when sexually explicit AI-generated deepfake images of Taylor Swift circulated on the platform.

“Unfettered deepfake technology has the potential to impact everyone – even all of you,” Kyncl said in his statement on Tuesday.

“Your identities could be appropriated and used to mislead your constituents. The truth is everyone is vulnerable – families defrauded by voice clones pretending to be relatives; people placed in pornography without their consent; school children having their faces inserted into humiliating scenes.”

Kyncl said the NO FAKES Act should include three elements “to be effective”: an “enforceable” intellectual property right for likeness and voice; “effective deterrence” that would include “meaningful consequences for AI model builders and platforms that knowingly violate a person’s property rights”; and “respect for important First Amendment principles” – though without the creation of “loopholes that create more victims.”


In a column published in The Hill on Tuesday, Kyncl argued that, with each successive change in technology, music has been “the canary in the coalmine,” pointing to the dangers of that technological change even as it led the way in showcasing its benefits.

“Music has already given us glimpses of what this incredible technology [AI] can do when artists are on board — whether it’s musicians permitting fans to create new songs using replicas of their voices, artists’ estates empowering posthumous biopics featuring perfect replicas of late stars, or singers who have lost their voice to illness miraculously recording again,” Kyncl wrote.

“At the same time, we’ve witnessed the specter of AI’s possible downside for artists: vast scraping and copying of creative works, and the rise of deep fakes where artists’ voices, faces and identities are appropriated without their consent…

“The starkly different positive and negative uses of AI that music is grappling with today represent two alternate versions of the future — with artists at the tip of the spear.”

Kyncl said that the responsible development of AI will depend on the adoption of four principles: Consent for the use of people’s likeness and voice; monetization, meaning likeness and voice should be available for licensing; attribution, meaning AI-generated content must be labeled as such; and provenance, i.e., AI developers should keep publicly available records of the materials they used to train their models.


“What is not acceptable is when my art and my identity can simply be taken by a third party and exploited falsely for their own gain without my consent due to the absence of appropriate legislative control.”

FKA Twigs

Also scheduled to testify before the Senate subcommittee on Tuesday is British artist FKA Twigs, who is signed to WMG’s Atlantic Records.

“I am here because my music, my dancing, my acting, the way that my body moves in front of a camera and the way that my voice resonates through a microphone… are essential reflections of who I am,” Twigs said in her prepared statement.

“Yet this is under threat. AI cannot replicate the depth of my life journey, yet those who control it hold the power to mimic the likeness of my art, to replicate it and falsely claim my identity and intellectual property. This prospect threatens to rewrite and unravel the fabric of my very existence.”

Twigs noted she isn’t opposed to AI technology, and has in fact developed her own AI-generated doppelganger, “AI Twigs,” who will debut on her social channels later this year.

While tools like these are “highly valuable… what is not acceptable is when my art and my identity can simply be taken by a third party and exploited falsely for their own gain without my consent due to the absence of appropriate legislative control,” Twigs added.

“We must get this right – you must get this right, now, before it is too late,” Twigs said.

Music Business Worldwide
