If you can’t beat AI, join it—but be transparent about the alliance. That’s the takeaway from both on-stage and hallway conversations about the emerging technology at SEG3 Los Angeles.
Previously hosted in London, SEG3 aims to bring together the sports, entertainment, and gaming industries to build better digital products and experiences.
Content creators are leveraging the technology to build businesses and brands faster than ever, and the assembled entertainment and gaming industry leaders largely cheered advances in generative AI. Still, they urged caution and stressed the need to disclose its use to their fans and communities.
“If it’s a creator that has an established community and trust, they’re going to have to [disclose],” Nicole DuCane Spencer, Vice President of Partnerships at Misfit Gaming Group, told Decrypt. “If they already have built a following and they’ve worked with [AI], most likely they’re going to have to share that, just because you don’t want it to come out later.”
Staying quiet carries real risks: if a creator’s audience discovers the omission later, the creator could face a backlash and be branded as someone who doesn’t support human artists.
With the rise of generative AI platforms like ChatGPT and Midjourney, individual creators, game studios, and corporations have used the technology to create libraries of images and art in hours or days that would previously have taken months or years. But as with NFTs before them, a vocal backlash against AI-generated art has emerged.
Last summer, Cyan Worlds—the developer of the classic video games Myst and Riven—faced a torrent of criticism after it revealed that it had used AI in developing its new game Firmament. In October, Embark Studios similarly enraged its fans after revealing that it had used AI to create most of the voices in the first-person shooter The Finals.
“I think it’s more of managing backlash, which is usually top of mind for most creators,” Spencer said. “Maybe for people creating NFTs for different projects, it might be a completely different story. If it’s a large creator, they probably need to manage how they talk about it.”
“It’s a little different if you’re creating something from scratch and leveraging AI because you don’t have any money,” Spencer added. “There are different approaches depending on where the project or the use is coming from. But that’s what many people are asking: what is the morality of AI?”
Riot Games broadcast host James Dash agreed with Spencer’s assessment, adding that a creator who lets their audience know they are using generative AI from the start will have more leeway with fans.
“If from inception, the product I’m pitching to an audience is an AI-generated thing, there’s probably a little bit more leeway towards entering that space,” Dash said, comparing the scenario to a VTuber and their animated persona. “Because I haven’t invested my time in you as a creator, we have not built that trust that you are now breaking—it’s from the outset.”
Dash acknowledged that there could be debates over whether disclaimers like “all art here was generated using AI” are necessary. However, not being an artist himself, he said he doesn’t feel qualified to pass moral judgment on whether artists should or shouldn’t use AI.
“I 100% agree that at the end of the day, when you’re talking about higher or bigger level creators, you’ve built trust with your audience,” Dash said. “They have an expectation of what your product is; if you’re going to change that on them, I think that there should be a discussion about that, and ultimately, I would hope you’ve created a community that you feel you have the permission to tell them and get their feedback.”
Dash said creators will inevitably encounter viewers who dislike their content and decide not to return—but in those cases, he suggested, creators should ask whether those individuals genuinely belonged in the community in the first place.
“Create what you want to create, but understand you’re gonna cultivate the audience that likes and is okay with that thing that you’re doing,” he said.
Edited by Ryan Ozawa.