When STIM announced what it described as the world’s first AI license, the claim unsurprisingly raised eyebrows and generated plenty of questions.
To be fair, it may indeed be the first AI license launched by a collective management organization (CMO). Yet, as Music Ally noted:
- In October 2023, French society Sacem announced that it was opting its members’ works out of AI training, precisely to force negotiations toward licensing.
- In September 2024, German society GEMA presented its framework for AI licensing, stressing attribution and accountability.
And outside the CMO sphere, AI licensing frameworks are already a reality. Several companies have been experimenting, implementing, and commercializing such models for years:
- Wondera & SourceAudio (USA): AI access to over 14M ethically pre-cleared tracks.
- BandLab Licensing (Singapore/global): Artists/publishers can mark works “Open to AI licensing.”
- Beatoven.ai / Musical AI: Generative AI platforms with explicit rights-holder compensation.
- DigiTrax / KR38R LAB (USA): “AI Music Training Model License” with royalties to composers.
- AIxchange (Germany) & GCX: Ecosystems for ethical, licensed AI training datasets.
- CCC (USA): Piloting AI training licenses for text and academic works, relevant to copyright at large.
These initiatives make one thing clear: AI licensing is already global.
The Real Issues
The real challenge is not about being “first” but about whether these licenses are truly fit for purpose.
- Voluntary uptake. Licenses only work if AI companies actually agree to pay. Many don’t, and many already use data without consent. Voluntary models cannot be the ultimate solution.
- Weak bargaining position. History shows that rushing to be “first” often leads to weaker deals. CMOs are not yet leveraging existing technology that can track AI outputs, and legislators have been left to draft vague, voluntary-based laws, which lowers the value of rights even further.
- Backwards design. Too often, licensing starts with the question: what could a small AI start-up afford? The resulting pot is then divided across a limited set of rights holders through unclear processes. That undermines transparency and equal treatment, the very principles CMOs are meant to uphold.
- Transparency gap. Not all members were asked whether they wanted to participate. In collective management, strength lies in the collective. Divide the collective, and negotiating power weakens.
Some CMOs have existed for a century, and that heritage is valuable. But it also makes change harder. “If it worked for 100 years, why change?” Well, because we are entering a new era. Because IT and tech innovation have been kept strictly in-house, many talented developers and innovators have already left collective management in search of faster progress.
The good news: the relevant technology already exists. Tools can analyze AI outputs, detect which works were used in training, and attribute royalties directly to the original creators. CMOs have been reluctant to adopt user-centric models in the past, but in the AI era this is no longer optional. It is a necessity.
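To make the contrast with “dividing a pot across a limited set of rights holders” concrete, here is a minimal sketch of attribution-weighted distribution: once a tracking tool produces per-creator usage scores, a license pot can be split pro rata rather than through opaque allocation. All names, scores, and the function itself are illustrative assumptions, not any CMO’s or vendor’s actual method.

```python
# Hypothetical sketch: splitting an AI-license pot pro rata by attribution
# weight. The attribution scores stand in for the output of a tracking tool
# that measures how much each creator's works were used; they are made up.

def distribute_pot(pot: float, attribution: dict[str, float]) -> dict[str, float]:
    """Split `pot` across rights holders in proportion to their attribution scores."""
    total = sum(attribution.values())
    if total == 0:
        raise ValueError("no attributed usage to distribute against")
    return {holder: pot * score / total for holder, score in attribution.items()}

# Example: a 100,000-unit pot and three creators with measured usage shares.
shares = distribute_pot(100_000, {"writer_a": 50.0, "writer_b": 30.0, "writer_c": 20.0})
print(shares)  # {'writer_a': 50000.0, 'writer_b': 30000.0, 'writer_c': 20000.0}
```

The point of the sketch is the direction of the process: it starts from measured usage and derives each share, instead of starting from a negotiated lump sum and dividing it among whoever is at the table.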
A Constructive Path Forward
This is not to say that STIM’s or any other license is meaningless. On the contrary — it is encouraging to see CMOs engage with the AI question rather than avoid it. But credibility requires framing such initiatives for what they are: first steps, not first-in-the-world solutions.
Real leadership in this space will come from:
- Honesty about both achievements and limitations.
- Collaboration with external technology partners that already provide attribution tools (waiting to build them in-house will be too late).
- Transparency not only demanded of AI companies, but practiced internally toward members.
The future of music and AI will not be built in silos. It will be built through systems that combine licensing, attribution, and enforcement.
Being first rarely matters most.
Being right — and being trusted — always does.