
SAG-AFTRA is fighting a dystopian AI takeover so you don't have to

We can look to Hollywood to understand the immediate and far-reaching risks we all face from artificial intelligence.

For years, Hollywood has helped viewers imagine science fiction dystopias where the future of humanity is threatened by machines. Now, Hollywood screenwriters and actors are at the forefront of a fight over artificial intelligence that could set the standard for the future of labor rights and compensation for jobs we once thought were immune to outsourcing and automation.


“If we don’t stand tall right now, we are all going to be in trouble. We are all going to be in jeopardy of being replaced by machines and big business,” Fran Drescher, actor and president of the Screen Actors Guild-American Federation of Television and Radio Artists, or SAG-AFTRA, said last week, after the union’s national board unanimously voted to strike against studios and streaming giants. It joined the Writers Guild of America (WGA), which has been on strike since May, also in part over AI concerns, launching the industry’s first dual strike in more than 60 years.

But this isn’t just a Hollywood problem. It’s one many workers, especially those in white-collar creative roles, are facing now or will face soon: their past work being used to train their robot replacements.

That’s why, with actors facing an “existential threat” to their livelihoods due to the rise of computer generative technologies — as SAG-AFTRA National Executive Director Duncan Crabtree-Ireland said at a recent press conference — we should all be paying attention to the current strike.

Studios can now make digital clones of actors and even digitally resurrect the dead — as Disney did with the late Peter Cushing to reprise his role as Grand Moff Tarkin in 2016’s “Rogue One,” which also used AI deepfake technology to create a cameo of a de-aged Princess Leia by superimposing a youthful version of Carrie Fisher’s face onto a body double.

This tech is only getting better and cheaper. It’s even rolling out in social media apps via filters and cartoon avatars, as well as in other parts of the consumer market. For example, the Tel Aviv-based company Hour One is paying actors an initial fee of $500 to scan a digital likeness of them, which the company can then use for a certain number of onscreen credits in content like marketing videos and employee tutorials, Slate reported last month.

Now studios are ready to exploit digital doppelgänger technology and the workers whose very faces the systems depend on.

“They proposed that our background performers should be able to be scanned, get one day’s pay, and their companies should own that scan, their image, their likeness and should be able to use it for the rest of eternity on any project they want, with no consent and no compensation,” Crabtree-Ireland said.

The Alliance of Motion Picture and Television Producers, or AMPTP, which represents the studios in negotiations with SAG-AFTRA and the WGA, disputed that description of its proposal. In a statement to The Verge, AMPTP spokesperson Scott Rowe said it “only permits a company to use the digital replica of a background actor in the motion picture for which the background actor is employed.” Future uses would require the actor’s consent and be subject to a minimum payment, Rowe added.

How a person’s likeness can be used raises deeply personal, ethical questions. Right now, when a working actor auditions for a show or accepts a part, they are making a choice about which roles they feel comfortable literally embodying. Without rules for how digital clones of them are used, that agency could be stripped away, potentially leaving actors open to abuse or harassment through things like deepfake pornography.

Without an agreement on how actors should be paid when their past work is used to train AI models or when their likeness is used for something new, they’re also set up to essentially compete for roles against digital clones of themselves — clones who, notably, don’t need to eat or sleep.

The WGA directly addressed this issue during bargaining, demanding in May that AI not be used to “write or rewrite literary material,” not be used as source material, and not be trained on work covered by the agreement. So did the Directors Guild of America, which reached a tentative agreement with the studios in June that requires guild-covered positions to be “assigned to a person” and specifies that AI “does not constitute a person.”

Paul Kedrosky, an AI investor, told Nick Bilton of Vanity Fair that the WGA strike represented the “first skirmish in a new war, one where more than half of all jobs are at risk as we lose control of language itself — and thus of being human — to large language models.” 

In many industries, this amounts to workers having their expertise digitally colonized, giving bosses and corporations the ability to extract additional value from past work completed on a for-hire basis and potentially forcing less favorable labor conditions.

Consider architecture, an industry where, Kate Wagner writes at The Nation, AI isn’t so much a recipe for disruption as one for boring, repetitive buildings. Due to liability concerns, Wagner posits, the drudge-work part of architecture — design and construction documentation — is the least likely to be automated.

The things that AI might be trusted to do, like writing descriptions of designs for investors or coming up with illustrative images of proposed buildings, are all drawn from databases built on prior work. So AI could allow firms to take on more projects because architects will be spending less time on the creatively fulfilling parts of their jobs — all while pushing their output toward a monoculture.

AI use in Hollywood risks a similar trend: a cascade of automated decisions pushing whatever content is popular at the moment, drowning out original voices and creating a pipeline that prioritizes profit over art.

Of course, the conflict between corporate interests and artistic vision isn’t new. It’s, well, capitalism. 

But Hollywood itself has already envisioned a few ways current technology trends could play out, including in the “30 Rock” episode “SeinfeldVision,” in which NBC studio executive Jack Donaghy plans a month of programming around inserting a digital avatar of Jerry Seinfeld, created from old “Seinfeld” footage, into prime-time shows without the comic’s knowledge or consent.

In the episode, Seinfeld is outraged and negotiates a deal to limit the promotion, but he’s only able to do so because he’s a powerful person who (presumably) has contractual control over the footage the digital clone was drawn from.

Back in the real world, SAG-AFTRA’s membership includes powerful and influential people — and the union is very loudly making the argument that people need to have more control over who profits from their labor as AI influences their workplaces.

So even as tech leaders raise the alarm over an AI superintelligence causing a human extinction event, most of us should look to Hollywood to understand the immediate risks we face from AI — and how to organize with our fellow humans against it.