Can You Spot AI-Generated Music? 97% of People Can’t


According to TechRepublic, a Deezer-Ipsos survey found that 97% of respondents couldn’t identify AI-generated music, highlighting how convincingly machines now mimic human creativity. The streaming service Deezer reports that roughly a third of the content uploaded to its platform—about 50,000 tracks daily—is fully AI-generated, and it has launched detection tools that flagged the viral band The Velvet Sundown as “100% AI-generated.” Spotify has removed over 75 million low-quality AI tracks in the past year and plans new metadata tags for AI disclosure. Meanwhile, artists from The Beatles to Imogen Heap are using AI creatively, with Heap developing her own AI voice model, called ai.Mogen, that appears as a co-contributor on tracks.


The telltale signs

Here’s the thing about AI music—it’s getting scarily good, but there are still clues if you know what to listen for. Musician LJ Rich points out that AI tends to produce formulaic verse-chorus structures that feel catchy but emotionally thin. The vocals often sound breathless, endings feel abrupt, and lyrics are grammatically perfect but emotionally flat. Basically, AI struggles with the weird, imperfect poetry that makes human music memorable—think Alicia Keys’ “concrete jungle where dreams are made of” or The Rolling Stones’ intentional double negatives.

Then there’s the productivity red flag. When an unknown artist suddenly drops multiple albums in one go, it’s worth asking whether a machine is doing the heavy lifting. University of Cambridge professor Gina Neff described one case where tracks sounded like “really classic rock hits that had been put in a blender.” And according to music industry adviser Tony Rigg, AI vocals lack those micro-imperfections—the small strains, breaths, and emotional breaks—that make human singing feel authentic.

The ethics of synthetic sound

Now we’re getting into some seriously tricky territory. Hundreds of major artists, including Dua Lipa and Elton John, have protested the unlicensed use of their songs in AI training datasets. But what happens when artists willingly embrace AI, like Imogen Heap with her ai.Mogen model? She’s transparent about it being a co-contributor, but admits the voice “does sound different if you really know my voice.”

The real question is: if a song gives you chills, does it matter whether a human or algorithm created it? For many listeners, emotional connection is the only metric that counts. But informed choice matters too, especially when it comes to supporting actual human artists versus synthetic creations. We’re facing what LJ Rich calls “weird and beautiful ethical questions” that society is just beginning to grapple with.

Where this is headed

Look, the cat’s out of the bag—AI music is here to stay. The technology has evolved from requiring hours of computing for seconds of audio to generating complete tracks instantly from a single prompt. Streaming platforms are scrambling to respond, with Deezer already labeling AI content and Spotify preparing spam filters and disclosure systems.

But here’s what keeps me up at night: as detection gets harder and AI gets better, we might reach a point where even experts can’t reliably tell the difference. The BBC reports that current signs are “hints not proof” as tools grow increasingly sophisticated. We’re heading toward a musical uncanny valley where the only thing separating human from machine might be a metadata tag—if the creators choose to disclose at all.

So what happens when your favorite new artist turns out to be an algorithm? The answer is still being written—by humans and machines alike.
