Are AI Music Covers Ethical? Here’s What People in the Industry Are Saying

Views on AI Music Covers Vary in the Industry

Robot hands type on a piano.

Main image courtesy of Variety

With the rapid advancement of artificial intelligence in recent years, machine-generated art has become more and more commonplace, and the music industry is no exception. Companies like Suno and Udio put AI tools in the hands of people with little to no experience making music, allowing them to create new songs modeled on the work of their favorite artists.

One subset of music that AI has particularly influenced is the world of music covers. A cover, a re-recording of a previously released song by a new artist, can breathe new life into an old classic or re-imagine a track that was lacking in some way. Sometimes it's just nice to hear a different artist's take on a song, and the debates over cover versus original can be rather interesting. Look no further than the recent controversy around Beyoncé's cover of "Jolene," originally written by Dolly Parton.

With AI in the mix, you don't have to wait for an artist to decide to cover a song. You can make it yourself with an AI model tuned to their voice. This opens up a wave of possibilities, but it also makes some in the industry antsy. Today, we'll go over AI music covers and the reactions they've elicited from artists and other players in the music industry.

AI Music Covers in a Nutshell

The Jammable homepage.
AI covers are popping up all over social media. Image courtesy of Jammable.

By now, you probably already know the basics of how generative AI works. AI models are fed data from existing work (songs, paintings, books, etc.), they process that data via machine learning, and they then produce output based on that initial input. This is how you end up with songs that imitate the work of popular artists like Drake and Michael Jackson, and it also explains why some big players in the industry are pushing back against AI.

In the case of AI covers, models are trained on specific voices and can then be prompted to speak, or in this case sing, whatever the user desires in that voice. This opens up a huge range of possible voices. If you've heard it before in a song, TV show, movie, interview, or anything else, you can replicate it with AI. It doesn't even have to be a real person.

You could try to train a model yourself if you're tech savvy or want to replicate your own voice, but if it's a famous voice you're after, you'll likely be able to find it on a site like Jammable.
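For the technically curious, the overall shape of these systems is fairly consistent: a model learns the timbre of a target voice from isolated vocal recordings, then re-renders a new vocal performance in that voice and mixes it back over the instrumental. Below is a heavily simplified sketch of that pipeline in Python. Every function, class, and value here is a hypothetical placeholder rather than the code behind any real service; actual tools swap these stubs for trained neural networks that handle source separation and voice conversion.

```python
import numpy as np

# Conceptual sketch of how an AI cover pipeline is typically structured.
# All function bodies are placeholders; a real system would replace them
# with trained models for source separation and voice conversion.

def separate_vocals(song: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split a mixed track into (vocals, instrumental). Placeholder only."""
    return song * 0.5, song * 0.5

def extract_features(vocals: np.ndarray) -> np.ndarray:
    """Reduce raw vocal audio to pitch/phoneme-style features a conversion
    model can work with (placeholder: identity)."""
    return vocals

class VoiceModel:
    """Stand-in for a voice-conversion model trained on one target singer."""
    def __init__(self, name: str):
        self.name = name

    def fit(self, target_vocals: list[np.ndarray]) -> "VoiceModel":
        # Training step: learn the target voice's timbre from isolated
        # vocal recordings (placeholder does nothing).
        return self

    def convert(self, features: np.ndarray) -> np.ndarray:
        # Inference step: re-render the performance in the target timbre.
        return features

def make_cover(source_song: np.ndarray, model: VoiceModel) -> np.ndarray:
    """End-to-end: isolate vocals, convert them, remix with the instrumental."""
    vocals, instrumental = separate_vocals(source_song)
    converted = model.convert(extract_features(vocals))
    return converted + instrumental

if __name__ == "__main__":
    # Pretend we have 30 seconds of 44.1 kHz audio for both artists.
    target_clips = [np.random.randn(44_100 * 30) for _ in range(3)]
    song_to_cover = np.random.randn(44_100 * 30)

    model = VoiceModel("target-singer").fit(target_clips)
    cover = make_cover(song_to_cover, model)
    print(f"Generated cover: {cover.shape[0] / 44_100:.0f} seconds of audio")
```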

The range of possibilities available with AI covers can often have rather comedic results. Want to hear Elmo cover "Billie Jean"? That's totally doable with AI. Ever wondered what it would sound like to have Homer and Marge Simpson perform "Somebody That I Used To Know"? Now you can. You'll find plenty of wacky combinations across YouTube and TikTok.

The Problem With AI Music Covers

Ghostwriter, covered in a white sheet and wearing sunglasses, stands in a dark room.
The mysterious Ghostwriter made a name for himself after his AI Drake song took off. Image courtesy of Billboard.

While wacky mash-ups are mostly harmless fun, the more pressing concern is when an AI cover mimics the voice of an established artist. These covers, depending on how successful they become, have the potential to damage an artist's brand. You might think that AI music couldn't be THAT good at replicating popular artists, but you might change your tune when you learn that an AI-generated Drake song was submitted to the Grammys.

Situations like this raise some interesting questions. Should people making AI covers be able to profit from them? Should they be able to use an artist's voice without permission? Regulations around these questions are still in the works and will no doubt be a hot-button issue going forward.

For now, we can take a look at reactions to the emergence of AI covers, both from artists themselves and record labels.

The Industry’s Perspective

A man in a sequin jacket speaks into a microphone.
Musical artists have rather polarized opinions on AI covers.

In the past few years, artists and music companies have made their stances on AI covers known on social media and in interviews.

Bad Bunny

Last year, Puerto Rican rapper and singer Bad Bunny had some not-so-nice things to say when an AI cover featuring his and Justin Bieber's vocals was released on social media. He called it "a shit of a song" in Spanish and warned his fans not to listen to it. It's safe to say he's not a fan of AI covers.

The Beatles

On the flip side, for their final single, "Now and Then," The Beatles used AI to isolate John Lennon's voice from an old demo recording, bringing the legendary artist back for one last hurrah in 2023. Paul McCartney gave a candid statement about the matter in interviews: "There it was, John's voice, crystal clear. It's quite emotional. And we all play on it, it's a genuine Beatles recording. In 2023 to still be working on Beatles music, and about to release a new song the public haven't heard, I think it's an exciting thing."

While The Beatles seem to have had a positive experience with the technology, this experiment opens up a whole new can of worms. Just as AI can replicate living artists, it can do the same for those who have passed on. Can such replications be considered a tribute to an artist? Or is it disrespectful to use someone’s voice after their death? The Beatles got permission from Lennon’s family, but one wonders how similar attempts might be handled in the future.

Drake

One of the biggest stars in hip-hop today, Drake was at his wits' end with an AI cover last year. He didn't take kindly to an AI rendition of his voice singing Ice Spice's "Munch" and had this to say on Instagram: "This is the last straw." It's unclear how much of his disdain was due to the AI itself versus the particular choice of song.

Universal Music Group

Mirroring Drake's sentiments, Universal Music Group, one of the most powerful record labels in the world, has taken steps to prevent AI from being trained on its artists' copyrighted music. UMG lodged complaints with Spotify and Apple Music about AI companies accessing and training on copyrighted songs hosted on those platforms. One of UMG's statements reads:

“We have a moral and commercial responsibility to our artists to work to prevent the unauthorized use of their music and to stop platforms from ingesting content that violates the rights of artists and other creators. We expect our platform partners will want to prevent their services from being used in ways that harm artists.”

Grimes

The musician Grimes splits off from the herd with her take. Not only does she have a positive view of AI, citing the idea of "open-sourcing all art and killing copyright," she has also given her fans permission to use her voice, stating on Twitter: "I'll split 50 per cent royalties on any successful AI-generated song that uses my voice. Same deal as I would with any artist I collab with. Feel free to use my voice without penalty. I have no label and no legal bindings."

Her only caveat was that she may have to take action against covers with “rly rly toxic lyrics,” an understandable concern to be sure. 

Billie Eilish

Billie Eilish shared her worries about the future of AI music covers during an episode of The Late Late Show with James Corden last year. "I feel like my approach is not as optimistic, I'm a little bit scared of it," she admitted. "I'm a little scared of what someone could create of me doing something with it."

The Future of AI Music Covers

Who knows what the final verdict on AI-generated music covers will be in the years to come. The music industry certainly has misgivings about the technology, though, and many of its concerns are valid. Time will tell whether the technology is allowed to thrive or whether it gets snuffed out.

Nathan Eke

Nathan Eke is a professional writer based in Pittsburgh.