Recent policy setbacks in the United States and the United Kingdom are exposing a critical gap in how the creative industries advocate for artificial intelligence (AI) regulation. While technology companies advance sophisticated licensing systems, creator advocacy often remains focused on defensive restrictions—without offering workable alternatives. This misalignment may be costing the industry its influence at the policy table.
In both Washington and Westminster, proposed reforms aimed at protecting creators in the age of AI have stalled. In the UK, momentum behind comprehensive copyright reform lost traction, despite widespread mobilization from industry bodies. Similarly, in the US, lobbying efforts failed to produce meaningful protections for artists and rights holders as AI tools become increasingly integrated into content production and distribution.
A common factor in these failures is advocacy that highlights risks without demonstrating practical, operational solutions.
Lessons From the EU
This defensive posture contrasts with more successful efforts elsewhere. The European Union’s Copyright Directive provides an example of solution-oriented advocacy. Its progress was aided by industry stakeholders who showed how legitimate content access could coexist with fair compensation frameworks for creators. Rather than appearing to oppose innovation, advocates in Europe presented policymakers with clear, scalable models that balanced access and rights. That said, the remuneration models were not designed for accurate, user-centric distribution to individual creators, an omission that is arguably coming back to haunt the industry now.
Building Trust Through Transparency
Ongoing disputes—such as the legal case involving PRS for Music over “black box” royalties—underscore why lawmakers may hesitate to support new licensing frameworks involving AI. Without transparency, there’s little political appetite to introduce additional layers of legal complexity.
However, that caution can be addressed with systems that are already operational. Infrastructure that provides full audit trails and real-time licensing verification is increasingly viable. These systems offer both transparency and efficiency, addressing policymakers’ concerns while enabling responsible, large-scale AI usage.
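To make the idea of a "full audit trail" concrete: one common building block is a hash-chained log, where each licensing event commits to the hash of the previous entry, so retroactive tampering is detectable by anyone who replays the chain. The sketch below is illustrative only, assuming a hypothetical event schema; production systems would add signatures, timestamps, and external anchoring.

```python
import hashlib
import json

def append_event(log, event):
    """Append a licensing event to a hash-chained audit log.

    Each entry commits to the previous entry's hash, so any later
    alteration of earlier records breaks the chain and is detectable.
    (Hypothetical event schema, for illustration only.)
    """
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(event, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"event": event, "prev": prev_hash, "hash": entry_hash})
    return log

def verify(log):
    """Recompute the whole chain; True only if no entry was altered."""
    prev = "0" * 64
    for entry in log:
        payload = json.dumps(entry["event"], sort_keys=True)
        if entry["prev"] != prev:
            return False
        if hashlib.sha256((prev + payload).encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

log = []
append_event(log, {"work": "track-001", "licensee": "ai-lab-x", "use": "training"})
append_event(log, {"work": "track-002", "licensee": "ai-lab-x", "use": "training"})
assert verify(log)
log[0]["event"]["licensee"] = "someone-else"  # simulated tampering
assert not verify(log)
```

This is the property regulators care about: verification requires no trust in the log's operator, only the ability to recompute hashes.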
Some systems can now employ AI-powered monitoring tools that track content use across over 240 channels, recovering revenue that previously went unclaimed. This type of enforcement makes legal licensing more attractive than infringement and strengthens—not stifles—technological innovation.
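Monitoring at that scale typically rests on content fingerprinting: registered works are reduced to compact fingerprints, and observed streams are scanned for matches. The toy sketch below, using overlapping hash windows over raw bytes, is a deliberately simplified stand-in for real perceptual fingerprinting, with made-up identifiers.

```python
import hashlib

def fingerprint(content: bytes, chunk: int = 8) -> set:
    """Toy content fingerprint: hashes of overlapping byte windows.

    Real systems use perceptual features robust to re-encoding;
    byte hashing here just illustrates the matching mechanics.
    """
    return {
        hashlib.sha256(content[i:i + chunk]).hexdigest()
        for i in range(0, max(1, len(content) - chunk + 1))
    }

def match_score(registered: set, observed: set) -> float:
    """Fraction of a work's fingerprints found in an observed stream."""
    return len(registered & observed) / len(registered) if registered else 0.0

# Hypothetical registry entry and monitored stream.
work = b"verse one / chorus / verse two / chorus"
registry = {"track-001": fingerprint(work)}
stream = b"intro ... verse one / chorus / verse two / chorus ... outro"
score = match_score(registry["track-001"], fingerprint(stream))
assert score > 0.9  # the work appears inside the stream
```

A high match score is the trigger for the enforcement step the text describes: flagging the use and routing it into a licensing claim rather than litigation.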
Shifting the Advocacy Narrative
Current advocacy strategies risk framing the creative sector as resistant to progress. That perception not only weakens its position but fuels adversarial relationships with technology developers. To reverse that trend, we need a shift toward constructive engagement—where creators are positioned as indispensable collaborators in AI’s future.
“Support grows when we show how seemingly conflicting interests can work in harmony.”
Two examples demonstrate this:
- Zero-knowledge blockchain infrastructure offers privacy for rights holders while enabling oversight for regulators. This supports cross-border licensing without compromising commercial confidentiality.
- Fractional ownership and smart governance structures broaden access to intellectual property assets while meeting institutional compliance needs, offering both equity and accountability.
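The privacy-with-oversight idea in the first example can be made concrete with a commitment scheme: confidential license terms are published only as a salted hash, and the opening is revealed selectively to a regulator. This is a minimal commit-and-reveal sketch, not a true zero-knowledge proof (which would let a regulator verify properties of the terms without seeing them at all); the field names are hypothetical.

```python
import hashlib
import json
import secrets

def commit(terms: dict) -> tuple:
    """Commit to confidential license terms.

    The public record sees only the commitment hash; the random salt
    prevents recovering the terms by brute-force guessing.
    Returns (commitment, opening).
    """
    salt = secrets.token_hex(16)
    payload = json.dumps(terms, sort_keys=True) + salt
    return hashlib.sha256(payload.encode()).hexdigest(), (terms, salt)

def verify_opening(commitment: str, terms: dict, salt: str) -> bool:
    """A regulator, given the opening, checks it against the public record."""
    payload = json.dumps(terms, sort_keys=True) + salt
    return hashlib.sha256(payload.encode()).hexdigest() == commitment

# Hypothetical cross-border license: terms stay private to the public,
# but can be proven to an auditor on demand.
terms = {"work": "track-001", "territory": "EU", "rate": 0.012}
commitment, (opened_terms, salt) = commit(terms)
assert verify_opening(commitment, opened_terms, salt)
assert not verify_opening(commitment, {**terms, "rate": 0.001}, salt)
```

The design point is that confidentiality and auditability are not in tension: the same record serves both.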
A Call for Leadership
Ultimately, progress in AI policy will hinge on showing—not just telling—how innovation and fair compensation can coexist. The creative industries must evolve their advocacy from a narrative of resistance to one of opportunity and partnership.
Rather than focusing on what doesn’t work, the next phase of lobbying must clearly demonstrate how modern, scalable solutions can serve not just creators—but the broader public interest.
As the legislative landscape around AI continues to evolve, those who bring actionable models to the table will shape the rules of tomorrow. And those who don’t may be left out of the conversation entirely.