
As the global film industry descends on the 79th Cannes Film Festival beginning May 12, it does so at a pivotal moment: Cinema is redefining who, or what, gets to be considered a creator.
The rise of generative AI has ignited one of the most consequential debates the industry has faced since the transition from film to digital: Is storytelling still a fundamentally human act?
The festival recently ruled that any film where generative AI serves as the “principal authoring tool” is ineligible for the Palme d’Or, the highest prize awarded to the director of the best feature film at Cannes. This includes AI-driven scripts, visual generation and principal performance synthesis.
Jason Maurer, Head of Animation at the Florida State University College of Motion Picture Arts, is exploring how AI can be used ethically in creative spaces, with a strong focus on storytelling, filmmaking and animation production pipelines.
One of his guiding principles for navigating AI and filmmaking is Wharton Associate Professor Ethan Mollick’s four basic rules: Be the human in the loop, invite AI to the table, treat AI like a person and assume this is the worst AI you’ll ever use. Maurer believes that Cannes’ push toward human-centric filmmaking is less a rejection of technology than a defense of authorship.
“AI is a collaborator, not a creator,” Maurer said. “The ethics around how it’s built are non-negotiable, and the real threat isn’t the tool — it’s the humans wielding it without accountability.”
Maurer is available to speak with media on these angles at the intersection of AI and the film industry. He can be reached via email at jmaurer@fsu.edu.
- Authorship and accountability still belong to humans: AI can speed up production, but it can’t take responsibility for a story. The real debate isn’t about banning AI — it’s about keeping humans accountable for what ends up on screen.
- Audience trust will hinge on transparency, not technology: Viewers are open to AI-assisted films if the story resonates, but they want honesty about how it was made. The industry’s challenge isn’t AI itself — it’s clearly labeling and owning the creative process.
- AI is expanding access while raising real ethical risks: The technology is lowering costs and giving indie filmmakers new creative power, but many tools are built on legally and ethically uncertain foundations. The opportunity is huge, but how the industry addresses those risks will define its future.
Jason Maurer, Head of Animation, FSU College of Motion Picture Arts
To the average film viewer, much AI-generated content might be indistinguishable from content made by humans. Are human-centric stances like the one Cannes is taking good for the industry, or have we reached a point where audiences have lost trust in what is real on the screen?
Whether it’s good for the industry depends on what we mean by “human centric.” If the standard is that a human directed the work, made the choices and bears the artistic responsibility, then a filmmaker using an ethically sourced AI tool is still making a human-centric film. The human is still in the loop — to borrow Ethan Mollick’s framing, which I apply to my own creative process. We didn’t stop telling stories around the campfire when the printing press arrived. Portrait painting didn’t disappear because of the photograph. Tools change. Authorship doesn’t.
The animation industry has already lived through this. When CG arrived, the field declared 2D dead — and it nearly was, commercially, in the U.S. for most of the 2000s. Hundreds of traditional animators lost careers in that transition. We should be honest about that. But 2D didn’t die. It evolved, and the medium today is richer for having both languages available.
Proponents of human-centric cinema point to authentic storytelling as one of film's defining standards. While generative AI can perhaps maximize efficiencies in filmmaking, what makes the human element more important?
The reframe I’d offer is this: The question isn’t really whether AI can be human-centric. It’s whether humans using AI are operating in good faith.
Iris Knobloch, the Cannes president, said when announcing the 2026 selection that “AI knows how to imitate very well, but it will never know how to feel.” I’d sharpen that. AI doesn’t need to feel. The humans making the work need to. The humans watching it need to. That’s where the human element actually lives — not in the tool, but in the people on either side of it.
The deeper case for human-centric storytelling is responsibility. Films are made by people. People can be questioned, credited, sued, hired or fired. Models can’t. As long as humans are answerable for what shows up on screen, we have an industry that can correct itself when something goes wrong. And here’s why that matters for the aesthetic question, not just the legal one: a story someone is staking themselves on is a story that carries weight. Audiences can feel the difference between work someone is answerable for and work that’s been generated to fill a slot. That stake is what authenticity actually is.