From courtroom to conference room

The speed of the music industry's about-face on AI is dizzying. Just last year, Universal Music Group, Sony Music, and Warner
Music Group sued the AI music startups Suno and Udio, accusing them of training their models on copyrighted music without permission. Now Universal has settled with Udio, agreeing to launch a subscription service next year where fans can create remixes and customized tracks using licensed songs.

The settlement terms remain undisclosed, but the structure hints at the industry's strategy. Artists must opt in to have their music included, and all AI-generated content must stay within Udio's platform. Similar deals are reportedly weeks away. According to the Financial Times, Universal and Warner are in talks with Google, Spotify, and various AI startups including Klay Vision, ElevenLabs, and Stability AI. The labels are pushing for a streaming-like payment model in which each use of their music in AI training or generation triggers a micropayment.

The urgency is understandable. Besides Monet, Billboard reported that at least one new AI artist has appeared on the charts in each of the last five weeks, meaning ever more chances for chart-topping confusion. Spotify revealed that it removed 75 million tracks last year to maintain quality, though the company won't specify how many were AI-generated. Deezer, another streaming platform, reports that up to 70% of streams of AI-generated music on its platform are fraudulent, suggesting the technology is already being weaponized for streaming
fraud at scale.

The human cost

For independent artists and smaller acts, the implications are stark. Unlike Taylor Swift or Billie Eilish, who have leverage through their labels and massive fanbases, emerging musicians face an ecosystem where they compete against infinite variations of
themselves. The lack of transparency about what music AI models are trained on means independent artists could be losing compensation without even knowing their work was used.

Industry groups are calling for mandatory labeling of AI-generated content, warning that without safeguards, artificial intelligence risks repeating streaming's pattern of tech platforms profiting while creators struggle. Currently, streaming platforms have no legal obligation to identify AI-generated music. Deezer uses detection software to tag AI tracks, but Spotify doesn't label them at all, leaving consumers in the dark about what they're hearing.

The industry's challenge goes beyond detection or regulation. Music has always been more than sound waves arranged in pleasing patterns. It's been about human connection, shared experience, and
the stories we tell ourselves about the songs we love. As AI-generated artists climb the charts and secure record deals, the question isn't whether machines can make music that sounds real. They already can. The question is whether listeners will still care about the difference once they know the truth.

—Jackie Snow, Contributing Editor