The GenAI Wall and Filmmaking
The debate around generative AI in creative industries is often framed in extreme terms. Either AI will replace artists, or it will change nothing essential. Both positions miss what is actually happening. A recent study by researchers from Harvard Business School and Stanford University offers a more precise way of understanding the impact of generative AI, not by speculating about replacement, but by examining where AI genuinely helps and where it stops helping.
The study looks at how people perform tasks outside their core professional expertise when supported by generative AI. Instead of comparing humans against machines, it compares different kinds of humans working with machines. Some participants are insiders who already perform the task as part of their profession. Others are adjacent outsiders whose work is related but not identical. A third group consists of distant outsiders whose background is far removed from the task itself.
What emerges from this comparison is not a smooth curve of improvement, but a sharp boundary. Generative AI significantly helps people in the early, conceptual stages of work. It narrows gaps between insiders and adjacent outsiders and, to a surprising degree, even helps distant outsiders formulate plausible ideas. But when tasks move from conceptual framing into actual execution, the benefits diminish rapidly. At a certain point, AI no longer compensates for the absence of domain-specific judgment. The researchers describe this point as the GenAI wall.
This idea is especially useful for understanding what is happening in creative work and filmmaking, because film production is structured almost entirely around the transition from concept to execution.
Early stages of filmmaking are dominated by possibility. Stories are explored, themes are tested, moods are sketched, worlds are imagined. This phase thrives on variation and speed. Generative AI fits naturally here. It can generate story alternatives, visual directions, tonal references, and narrative outlines at a pace that no human team can match. For directors, writers, and designers, this feels empowering. The creative space opens up instead of narrowing too quickly.
But filmmaking does not end with ideas. In fact, ideas are the cheapest and most abundant part of the process. The real work begins when choices must be made and defended. Which performance is right? Where should the camera be placed? When must a cut happen? How should silence be used? These decisions cannot remain flexible. They must become concrete, irreversible, and accountable.
This is where the GenAI wall becomes visible in practice. AI can suggest ten ways to stage a scene, but it cannot tell which one will survive contact with actors, weather, budget, time pressure, and emotional reality. AI can generate an edit that looks correct, but it cannot feel when the rhythm is emotionally false. The difference is subtle but decisive. Execution is not about generating options. It is about committing to one option under uncertainty.
This explains a pattern many filmmakers already sense intuitively. AI tools feel astonishingly capable during pitches, treatments, storyboards, and early previs. They feel much less convincing when pushed toward final shots, performances, or emotionally complex sequences. The problem is not quality in a technical sense. The problem is judgment.
Judgment is not a rule set. It is accumulated experience responding to context. A cinematographer adjusts light not because a reference suggests it, but because something feels wrong in the space at that moment. An editor cuts not because a beat aligns mathematically, but because a performance breathes differently than expected. These are not failures of data. They are expressions of human perception.
The GenAI wall therefore does not protect filmmaking because of tradition or hierarchy. It protects it because craft is embodied. The more a task depends on micro-decisions that respond to human presence, emotion, and consequence, the less transferable it becomes through AI support alone.
This also explains why generative AI is already reshaping large parts of the content economy without threatening cinema at its highest levels. Where quality thresholds are vague and speed matters more than depth, AI thrives. Marketing videos, social media clips, explanatory visuals, and formula-driven entertainment benefit from acceleration and volume. They do not require sustained emotional judgment. They require acceptability.
Cinema operates under different conditions. A film is not judged by whether it is plausible, but by whether it is necessary. Emotional precision, narrative risk, and specificity cannot be averaged. They must be chosen, and choice always involves responsibility.
This does not mean creative roles remain unchanged. In fact, the opposite is true. Generative AI expands what individuals can do before involving others. Directors can visualize earlier. Writers can explore structure faster. Designers can prototype worlds without waiting for full teams. Editors can test tonal ideas long before footage exists.
But this expansion does not eliminate specialists. It raises expectations. When everyone can reach a competent baseline, competence stops being the differentiator. Depth becomes the differentiator. The creative floor rises. The ceiling does not move.
This also clarifies why the claim that "prompting is the new directing" feels intuitively wrong to many practitioners. Prompting is about articulating intent and exploring variation. Directing is about standing behind a decision that might fail. A director owns the moment when an actor breaks down on set, when a scene does not work, when time runs out. AI cannot own failure. It cannot take responsibility. That boundary is not technical. It is ethical.
The implications for film education follow directly from this. Teaching tools alone is no longer sufficient. If AI accelerates conceptual work, then education must focus on what AI cannot provide. Perception. Critique. Narrative judgment. Ethical awareness. The ability to evaluate one’s own work honestly and to decide when to stop generating options and commit.
Future filmmakers will not be distinguished by how fluently they use AI. They will be distinguished by how well they know when to ignore it.
The GenAI wall does not signal the end of creative work. It reveals where creative work actually begins. Not with ideas, but with decisions. Not with possibility, but with commitment. Not with output, but with responsibility.
Cinema has always lived there.
Studies
https://www.hbs.edu/ris/Publication%20Files/26-011_9b5d5a53-3e93-4c6f-b9a5-9bfe4fd9bcb3.pdf
https://d3.harvard.edu/insights/why-ai-helps-until-it-doesnt-inside-the-genai-wall-effect/


