By: Zachary Abbott
Increasingly, AI has created tension between corporations seeking to streamline product development and cut costs, and creatives seeking protection against having their contributions replaced entirely. In July, this came to a head in the video game industry, where members of the Screen Actors Guild – American Federation of Television and Radio Artists (SAG-AFTRA) who perform in video games went on strike for protections against AI encroachment. Against this backdrop, Christoph Hartmann, CEO of Amazon Games, caused significant controversy last week when he stated that “[in] games, we don’t really have acting” and advocated for expanding the use of AI in game development.
In an interview with IGN, a popular video game news outlet, Hartmann said he hoped AI could shorten lengthy development cycles, noting that it currently takes “five years per game,” and adding that “hopefully AI will help us to streamline processes so hand-done work will go fast. Ideally we can get it down to three years.” When asked about the SAG-AFTRA strike, Hartmann responded that “especially for games, we don’t really have acting . . . [t]he majority of the team sits in programming and that’s not going to go away because that’s all about innovation.” He received heavy criticism over these comments, with people pointing to titles like Baldur’s Gate 3, The Last of Us, and Cyberpunk 2077, which are extremely popular largely because their character performances elevate them into unique and memorable experiences.
In response to this criticism, an Amazon spokesperson addressed the “confusion” caused by Hartmann’s comments and explained that they were intended to reference internal development teams, and that Amazon Games typically does not have actors on staff. Amazon stated, “[a]s with any tool, we believe generative AI needs to be used responsibly and we’re carefully exploring how we can use it to help solve the technical challenges development teams face.” Reading Hartmann’s remarks in context, these “explorations” appear to include the localization of game dialogue for different countries. Hartmann remarked that “what could be super helpful is localization. We’re currently localizing our game into a certain set of languages. Does it commercially make sense to have it in a language, yes or no? Having AI actually will help us.” It should be noted that using an AI copy of an actor’s voice to generate new dialogue in another language would likely violate the AI protections that SAG-AFTRA members are currently seeking, if the actor has not granted approval or been compensated.
Sarah Elmaleh, chair of the SAG-AFTRA Interactive Media Agreement negotiating committee, responded to Hartmann’s remarks, stating that “[w]hether game design, localization, programming, acting, anything – these highly specialized and professionalized workers are the ones who understand whether and how AI might be assistive or detrimental in their work. And workers should have the right and the means to advocate for the proper use of this tool.”
As game developers explore new ways to use generative AI, the livelihoods of artists, actors, and other creatives remain at risk until the parties can have a productive dialogue. Negotiation of the SAG-AFTRA Interactive Media Agreement provides a framework for doing so. However, Hartmann’s recent remarks and the reaction to them suggest that this dialogue remains a work in progress.