“Music producers are rejecting AI”: what the new studies tell us and what it means for music’s future
The headlines have been loud this week: a new survey says over 80% of music producers are actively against AI-generated songs, while separate listener-focused research shows most people can’t tell AI-made tracks from human ones. That collision — creators rejecting the tech at the same time listeners can’t reliably detect it — is a perfect storm for arguments about ethics, jobs, creativity, and regulation. Below, I’ll walk you through the findings, explain why they matter, and give a blunt take on what could happen next.
The headline findings:
• A global survey of music producers, commissioned by Tracklib, reports that a very large majority of producers are sceptical or hostile to generative-AI use in music: adoption is low and rejection high — the press release frames it as “music producers are rejecting AI,” with only a small percentage actively using generative tools.
• At the same time, Deezer and Ipsos ran listening tests and found that ~97% of people couldn’t reliably tell AI-generated tracks from human-made ones in blind tests — a striking demonstration that AI music quality has reached near-parity for the ear of the average listener. That same study shows big public support for transparency (label AI-made tracks) and concerns about ethics and copyright.
• Major outlets reporting on these themes also point to real-world effects: AI tracks are already rising on streaming platforms and even some charts, which is intensifying industry debate about promotion, royalties, and verification.
What the producers’ rejection actually means
When a study says “80%+ of producers are against AI,” don’t read that as a single emotion — it’s a mix of reasons:
- Artistic integrity and craft. Many producers see production choices as part of artistic identity. Machines that spit out passable arrangements or vocal lines feel like a shortcut that removes the human voice from the process. Tracklib’s survey captures that sentiment: a lot of rejection is because producers feel AI undermines the craft.
- Quality concerns — and nuance. Some producers reject full-song generation but still use AI for small tasks (like reference stems, mastering presets, or chord suggestions). The “rejecting AI” headline often conceals that many are open to assistive AI, just not to AI that replaces core creative decisions.
- Copyright and fairness. Producers worry about training data (AI trained on copyrighted music without consent), displacement, and how royalties should be split if models borrow from identifiable artist outputs. The Deezer/Ipsos work also shows public support for ethical training and labeling — so this is both a creator and consumer issue.
- Economic risk. If streaming platforms allow AI tracks to surge without clear rules for attribution and payment, session work, ghost production, and publishing could lose income. That’s why producers are defensive — it’s their livelihood, not just ideology.
Why listeners’ inability to tell matters
The Deezer/Ipsos finding that ~97% of listeners failed to detect AI tracks is a game-changer for three reasons:
- Market-level indistinguishability. If audiences can’t tell, then AI tracks can compete on the same platforms, playlists, and algorithms. That puts human artists in direct competition with low-cost synthetic output unless platforms treat them differently.
- Transparency demand. The public’s unease in that study shows people want to know. If platforms don’t label AI content, they risk trust erosion and regulators stepping in. Deezer has already started tagging AI submissions — an early institutional response.
- Policy & chart implications. Charts and awards rely on attribution; if synthetic works crowd charts, the industry must decide whether to treat them equally. We’re already seeing cases of AI songs making chart noise and being removed or disputed. That bubbling conflict is the reason labels, distributors, and streaming services are scrambling.
The contradictions: why both things can be true at once
It seems paradoxical: producers reject AI while listeners don’t care or can’t detect it. But both can be true because they’re answering different questions. Producers are speaking about process, ethics, and livelihoods; listeners are responding to end-product enjoyment. The industry is now forced to reconcile process (how music was made and who benefits) with product (what people enjoy or stream).
The pressure points: where conflict will be decided
Here are the battlegrounds we should watch:
- Streaming platform policies. Will platforms require AI labels? Will they remove AI tracks from editorial playlists? Deezer’s tagging is a blueprint; other platforms may follow or resist.
- Copyright litigation & licensing. Lawsuits are already happening and will accelerate. Courts will decide whether training models on copyrighted works is infringement, and that ruling will determine cost structures for AI music firms.
- Royalties & metadata standards. If AI tracks use samples or stylistic mimicry, who gets paid? The industry needs metadata standards and traceability to assign rights and royalties correctly — otherwise, payments and charts will be gamed. (A rough sketch of what such a provenance record might capture follows this list.)
- Consumer-facing transparency. Mandatory labelling is likely to be politically popular and technically feasible. Expect regulators to push for clarity so consumers know whether they’re listening to AI.
- Artist and union pressure. Creators’ associations and unions will push for protections, training data consent, and revenue-sharing terms. If producers maintain unified resistance, they can shape licensing frameworks more than tech companies expect.
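To make the metadata point concrete, here is a minimal sketch of what a per-track provenance record could capture. Everything in it (the field names, the AIInvolvement categories, the labelling rule) is a hypothetical illustration rather than any existing industry schema; the point is simply that who did what, with which tools, and under what licence has to be recorded at release time if payouts, labels, and chart decisions are to hold up later.

```python
# Hypothetical sketch of a per-track provenance record (Python 3.10+).
# Field names and categories are illustrative assumptions, not a real standard.
from dataclasses import dataclass, field
from enum import Enum


class AIInvolvement(Enum):
    NONE = "none"              # fully human-made
    ASSISTIVE = "assistive"    # AI used for mixing, mastering, stem separation, etc.
    GENERATIVE = "generative"  # AI generated core content (melody, vocals, arrangement)


@dataclass
class Contributor:
    name: str
    role: str                  # e.g. "producer", "vocalist", "mixing engineer"
    rights_share: float        # fraction of royalties attributed to this contributor


@dataclass
class TrackProvenance:
    track_id: str
    title: str
    ai_involvement: AIInvolvement
    ai_tools_used: list[str] = field(default_factory=list)
    training_data_licensed: bool | None = None  # None = unknown / undisclosed
    contributors: list[Contributor] = field(default_factory=list)

    def requires_ai_label(self) -> bool:
        """Whether a consumer-facing 'AI-generated' tag would apply under a
        policy that labels generative (but not merely assistive) use."""
        return self.ai_involvement is AIInvolvement.GENERATIVE


# Example: an assistive-AI track that would not trigger a generative label.
track = TrackProvenance(
    track_id="TRK-0001",
    title="Example Track",
    ai_involvement=AIInvolvement.ASSISTIVE,
    ai_tools_used=["stem separation", "mastering assistant"],
    contributors=[Contributor("A. Producer", "producer", 1.0)],
)
print(track.requires_ai_label())  # False
```

However the fields end up being named, the design question is the same: the record has to be captured at submission time, travel with the track through distributors and platforms, and be machine-readable so labels, royalty splits, and chart eligibility can be applied automatically rather than litigated after the fact.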
The likely near-term scenarios (what I actually expect)
- Hybrid adoption increases. Most producers will continue using AI as assistants (mixing tools, stems, mastering suggestions) rather than full-song generators. The “pure AI” artist will mostly be an outlier or novelty that occasionally breaks through virally.
- Platform responses: tags + restrictions. Platforms will adopt transparent tagging and may restrict AI-only tracks from editorial playlists and algorithmic boosts until legal/ethical frameworks are set — Deezer is already an early mover here. Expect other major services to follow to avoid reputational and legal risk.
- Regulatory and legal clarifications. Courts and regulators will clarify what “training on copyrighted works” means. If rulings favor copyright holders, AI music firms will need licensing deals, much as streaming services license catalogs, which raises their costs and ends the current free-for-all.
- An uneven marketplace. We’ll end up with two flavors of music commerce: human-made content protected, promoted, and monetized under traditional deals; and AI-generated content that lives in a parallel space with different rules, visibility, and monetization models. Chart systems may split or flag AI entries.
What this means for producers, labels, and listeners: practical takeaways
For producers:
- Lean into what makes you uniquely human: emotional risk-taking, imperfections, personal storytelling, and brand as a creator. These are the qualities AI struggles to authentically replicate.
- Treat AI as a tool, not a competitor. Learn to use assistive AI for workflow speedups (mastering, stem separation, idea generation) while protecting your signature creative decisions.
For labels & distributors:
- Invest in metadata standards and provenance systems now. If you don’t capture who did what — and whether a track used AI — you’ll lose control over chart inclusion, payouts, and legal defenses.
- Think about curated playlists as a premium human-made space; reserve some editorial real estate for verified human artistry to protect brand value.
For listeners:
- Expect transparency features (filters, labels, opt-outs). If you want only human-made music, the tools to enforce that preference are likely coming.
Final thought: this is a crossroads, not an end
AI music is no longer a futuristic demo or toy. The technology is good enough to matter commercially and culturally, and the industry’s response will shape careers and revenue flows for a generation. Producers’ concern is legitimate — it’s about artistry and survival. Listeners’ reaction is equally important: they show curiosity but also want honesty.
The healthy path is a negotiated one: transparency, licensing, and new business models that protect creators while allowing innovation. If the industry can’t agree, regulation will force the issue — and that could either protect creators or ossify the market. Either way, producers rejecting AI today is a loud signal: the music world wants rules before mass adoption. That moment of negotiation is the real story behind the headline.