It’s a thrill, isn’t it? You’re scrolling through your music app, maybe Spotify, and there it is – a brand new track from your favorite artist. You hit play, ready to be swept away. But what if that song, that fresh melody, isn’t actually from the artist you love? What if it’s an AI, a digital imposter, singing their tune?
This isn’t some far-off sci-fi scenario; it’s happening now, and NPR’s technology correspondent Bobby Allyn has been digging into this increasingly common and, frankly, unsettling trend. He recently reported on how artificial intelligence is starting to flood music platforms, sometimes impersonating real musicians, both living and, surprisingly, deceased.
Allyn’s reporting brought to light the experience of Los Angeles musician Luke Temple, formerly of the band Here We Go Magic. Temple woke up one morning to a barrage of messages from fans asking about a new song released under his band’s name. The kicker? He and his band hadn't released new music in years, and the song itself bore no resemblance to their actual sound. As Temple put it, it was “predatory and so terrible.”
And he’s not alone. Allyn spoke with industry insiders who confirmed this is a growing problem: AI-generated songs have been popping up on the Spotify pages of acts like Wilco’s former band, and even of country singer Blaze Foley, who died in 1989.
So how is this even possible? You might assume that megastars like Taylor Swift or Beyoncé have ironclad protections, and you’d be right; platforms maintain special safeguards for top-tier artists. But for independent bands that are no longer active, or for artists who have died, impersonation has become disturbingly easy. Using AI generators, someone can create a song, slap a familiar artist’s name on it, and upload it through third-party distribution tools. AI music has grown sophisticated enough to fool automatic detection systems, letting these tracks slip through the cracks.
Why would someone do this? The prevailing theory, according to Allyn’s sources, is money. By flooding platforms with AI-generated tracks at massive scale, spammers can earn tiny fractions of a cent per stream. Each payout seems insignificant, but multiplied across millions of streams it adds up. It’s a digital hustle that preys on the established presence and fan base of real artists.
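The arithmetic behind that hustle is easy to sketch. Here is a minimal back-of-envelope calculation in Python; the per-stream payout figure is purely a hypothetical assumption for illustration (real rates vary by platform, region, and rights deal), not a number from Allyn’s reporting:

```python
# Hypothetical per-stream payout, in dollars -- an assumption for
# illustration only; actual streaming rates vary widely.
PER_STREAM_PAYOUT = 0.003

def spam_revenue(num_tracks: int, streams_per_track: int) -> float:
    """Total payout for a batch of uploaded tracks."""
    return num_tracks * streams_per_track * PER_STREAM_PAYOUT

# One fake track earning 1,000 streams is pocket change...
print(spam_revenue(1, 1_000))       # 3.0 dollars
# ...but 10,000 such tracks at that rate is a real payday.
print(spam_revenue(10_000, 1_000))  # 30000.0 dollars
```

The point the sketch makes is about scale: no single track needs to succeed, because the upload cost of an AI-generated song is near zero and the payouts, however tiny, accumulate across the whole batch.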
Spotify, when contacted, acknowledged the issue. The company has been building defenses and reported taking down a staggering 75 million “spammy” songs last year, a significant portion of them AI-generated. But as Allyn’s reporting shows, it’s a constant cat-and-mouse game: AI technology is evolving rapidly, making it ever harder to distinguish the genuine from the generated. It leaves us, the listeners, with a new layer of caution to apply every time we discover what seems like a surprise new release.
