In July 2023, roughly 160,000 members of SAG-AFTRA walked off the job, joining screenwriters who had been on strike since May. For the first time since 1960, actors and writers were on strike simultaneously.
The issue that united them: artificial intelligence.
What happened next reshaped not just entertainment, but how we think about AI and creative work.
The Breaking Point
The Writers' Fears
What screenwriters saw coming:
- ChatGPT writing first drafts for executives to "polish"
- AI-generated dialogue replacing room collaboration
- "Created by AI, edited by human" becoming standard
- Writing credits (and residuals) going to machines
The specific demands:
- AI cannot write or rewrite literary material
- AI-generated content cannot be considered "source material" (which affects credits)
- Writers' work cannot be used to train AI without consent
- Studios must disclose when AI is used in development
The Actors' Nightmare
What performers saw coming:
- Digital doubles replacing background actors
- AI-generated voices for dubbing and localization
- Deceased actors "performing" indefinitely
- Likenesses used in perpetuity without compensation
The specific demands:
- Consent required for digital replica creation
- Compensation for each use of a digital likeness
- Living actors' likenesses protected from AI replication
- Clear terms for posthumous use
What the Studios Wanted
The initial studio position:
- Maximum flexibility in AI use
- Ability to train models on existing content (they own the rights, after all)
- Use of AI for "efficiency" without additional compensation
- Vague language that preserved future options
The industry argument:
- AI is just another tool, like CGI or Auto-Tune
- Resistance to technology is futile and counterproductive
- Markets should decide how AI is used
- Unions are overreaching
The Settlement
After months of negotiations, both strikes ended with landmark agreements. The writers returned to work in late September 2023, the actors in November.
Writers Guild Agreement
AI provisions:
- AI cannot be credited as a writer
- AI-generated content doesn't qualify as source material
- Writers can use AI if they choose (not mandatory)
- Companies must disclose if material is AI-generated
- Using writers' work to train AI requires negotiation
SAG-AFTRA Agreement
AI provisions:
- Consent required for any digital replica
- Minimum compensation for AI replica use
- Clear terms for posthumous AI use
- Background actors protected from bulk scanning
- Regular re-negotiation as technology evolves
The Aftermath: Two Years Later
What Actually Changed
In production:
- AI use in writing rooms remains controversial but present
- Studios are cautious about visible AI use (a public relations concern)
- Some use AI for development, then hire writers to execute
- Animation studios face fewer restrictions and use AI more aggressively
In technology:
- AI video generation reached feature-film quality
- Voice cloning became indistinguishable from the original
- Digital actors can now "perform" in real time
- The capabilities the unions feared are now real
The Loopholes
What the agreements didn't cover:
- Non-union productions (increasing, particularly internationally)
- Video games (a separate contract, different rules)
- Social media content (no union at all)
- International productions subject to different laws
The strategy: Some studios moved production overseas, used AI extensively, then brought content back for distribution.
The Broader Pattern
Creative Workers Across Industries
Similar anxieties:
- Graphic designers facing Midjourney
- Musicians facing Suno and Udio
- Voice actors facing ElevenLabs
- Translators facing GPT-based tools
- Journalists facing automated reporting
The difference: Hollywood has strong unions. Most creative workers don't.
The Gig Economy Reality
For freelance creatives:
- No collective bargaining power
- Contracts increasingly include AI training rights
- Clients expect faster, cheaper work
- "AI-assisted" is becoming an expected skill
The emerging split:
- Premium market: human-created as luxury/differentiator
- Mass market: AI-generated with human oversight
- Middle market: disappearing
What It Means for AI Development
The Training Data Question
The Hollywood precedent:
- Content creators have some rights over training use
- Compensation may be required for training data
- Disclosure requirements are reasonable asks
The implications:
- Training-data lawsuits from Getty Images, The New York Times, and others now have a negotiated precedent to point to
- AI companies may need licensing deals with content owners
- The "open internet" training era may be ending
The Replacement vs. Augmentation Question
What Hollywood decided:
- AI can augment but not replace certain roles
- Human involvement is required for certain credits
- Some creative decisions must remain human
What this means:
- The "AI will replace all jobs" narrative isn't inevitable
- Social and legal choices shape technology's impact
- Collective action can influence outcomes
Lessons Beyond Hollywood
For Workers
- Collective action matters when technology shifts power
- Early engagement beats reactive resistance
- Specific demands are more effective than general opposition
- Technology can be shaped by policy, not just markets
For Companies
- Worker concerns about AI aren't just self-interest - they often identify real problems
- Transparency about AI use builds more trust than secrecy
- Short-term efficiency gains can create long-term reputation costs
- Inclusive technology deployment tends to go more smoothly
For Society
- AI's impact on work isn't purely technological - it's social and political
- Different choices lead to different outcomes
- The "inevitable" often isn't
- Who benefits from AI is a choice, not a given
The Story Continues
Hollywood's AI reckoning isn't over. The 2023 agreements expire in 2026. Technology advances. New use cases emerge.
But the precedent is set: creative workers can shape how AI is used in their industries. Not stop it - but shape it.
That's a model worth understanding, whether you're in entertainment or anywhere else AI is transforming work.
