How Ben Affleck Secretly Built Hollywood's AI Film Tool

In Brief
In March 2026, Netflix announced it had acquired InterPositive, an AI filmmaking company founded by Ben Affleck, for up to $600 million. Our breaking-news piece covered the announcement and its immediate context. This follow-up goes deeper: how Affleck built the company in secret for four years, what the technology actually does at a technical level, and why the deal is dividing Hollywood along unexpected lines.
Four years of silence
InterPositive did not begin as InterPositive. When Ben Affleck incorporated the company in Los Angeles in 2022, he registered it under the shell name Fin Bone, LLC, a deliberate move to keep it off the radar. Funding came from RedBird Capital Partners, a $14 billion investment firm led by Gerry Cardinale, along with Affleck's own money. The team never grew beyond 16 people: a tight mix of engineers, researchers, and film professionals. For four years, they told almost no one what they were building.
That secrecy was intentional. Affleck had watched early generative AI tools arrive in Hollywood and noticed something that alarmed him: strong engineering, almost no filmmaking knowledge. "I was worried that this was a technology that was gonna grow outside of the ecosystem of filmmakers and artists," he has said publicly. He wanted to get ahead of that. Building quietly gave the team time to develop the technology on their own terms before the larger AI industry could define what film AI should look like.
During those four years, the team did something unusual: on a controlled soundstage, they filmed a proprietary training dataset from scratch. Real cameras, real lighting rigs, real filmmaking technique: the kind of setup you'd find on a professional film set. This mattered for what came next.
What the technology actually does
To understand what InterPositive does, it helps to know what it does not do.
Most AI vision systems are trained to recognize what is in an image. Is there a dog? Is someone driving a car? Is a person crossing the street? InterPositive asks an entirely different question: how was this shot filmed? What lens was used? Where was the camera? How did it move? How did the light fall?
It is also not a text-to-video system like OpenAI's Sora or Google's Veo, which create video clips from written descriptions. InterPositive does not create something from nothing. It starts with real footage from a real film.
The system is trained on the raw footage from the specific film it will be used on. Dailies (the raw footage recorded each day on set, reviewed by the director to assess what is usable) are fed in, and the AI learns the unique visual signature of that particular project: lighting conditions, lens characteristics, camera movement, framing, and color palette. Then the custom model is used for concrete tasks.
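The patents do not disclose the exact training method, but what the article describes resembles a familiar pattern: a shared base model is copied and adapted on one production's dailies, so every film gets its own model. A minimal sketch under that assumption, where the "model" is just a running average of frame statistics standing in for real network weights:

```python
# Hypothetical sketch of per-film adaptation. A shared base "style" is
# nudged toward the statistics of one production's own footage, so the
# resulting model reflects that film's look rather than a generic one.

def train_film_model(base_style, dailies_frames, step=0.1):
    """Adapt a copy of the base style toward this film's frame statistics."""
    style = dict(base_style)  # copy: the shared base is left untouched
    for frame in dailies_frames:
        for key, value in frame.items():
            # Move each style parameter a small step toward this frame.
            style[key] = (1 - step) * style[key] + step * value
    return style

base = {"warmth": 0.5, "contrast": 0.5}          # generic starting point
dailies = [{"warmth": 0.8, "contrast": 0.6}] * 20  # one film's footage
film_model = train_film_model(base, dailies)
assert abs(film_model["warmth"] - 0.8) < 0.05  # drifted toward the film's look
```

The point of the sketch is the direction of specialization: the model converges on the project's own visual signature, which is why the tool's output blends with footage from that film in particular.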
What kind of tasks? Removing safety wires visible in action scenes. Fixing the lighting in shots where the sun did not cooperate. Cropping or adjusting framing for different screen formats. Correcting backgrounds and small visual inconsistencies between scenes filmed days apart. And perhaps the most striking: recreating scenes from camera angles that were never physically filmed on set.
Affleck put it concretely: "If you can shoot a scene in a studio and then make it realistically look like the North Pole using AI instead of actually going there, that saves money, saves time, and lets you focus on the performances."
Every change requires human approval before it enters the editing timeline. There are built-in constraints that protect the director's intent. And the models are trained exclusively on closed, authorized production footage.
How it works under the hood
The four granted US patents (with protection running to approximately 2045) and 12 pending international applications reveal a seven-layer system. We do not need to understand all seven. What matters are three main parts that work together, like three specialists with different roles:
The LiDAR mapper: the eye on set
Laser scanners that measure distances with sub-centimeter precision record exactly where the camera is positioned, where it points, and how it moves. This information is combined with the actual video footage and technical film data into a structured training dataset. A separate layer uses the game engine Unreal Engine to generate additional training material with virtual cameras and lighting rigs.
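The patents do not publish a data schema, but conceptually each training sample pairs a video frame with a record of how it was physically captured. A minimal sketch of such a record, with every field name hypothetical:

```python
# Hypothetical structure for one training sample: a frame of footage plus
# the capture metadata (camera pose from LiDAR, lens settings) that the
# system learns to associate with it. Field names are illustrative only.
from dataclasses import dataclass

@dataclass
class CameraPose:
    # Position (metres) and orientation (degrees) from the on-set LiDAR scan.
    x: float
    y: float
    z: float
    pan: float
    tilt: float

@dataclass
class TrainingSample:
    frame_path: str    # path to the raw video frame
    pose: CameraPose   # where the camera was and where it pointed
    lens_mm: float     # focal length of the lens used
    aperture_f: float  # f-stop
    synthetic: bool    # True if generated in Unreal Engine rather than filmed

sample = TrainingSample(
    frame_path="day03/shot12/frame0041.exr",
    pose=CameraPose(x=1.2, y=0.9, z=1.6, pan=12.0, tilt=-3.5),
    lens_mm=35.0,
    aperture_f=2.8,
    synthetic=False,
)
```

The `synthetic` flag reflects the article's description of the Unreal Engine layer: real and virtual captures feed the same dataset, distinguished only by provenance.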
SamildAnach: the director
This model looks at a video frame and describes it in filmmaking terms: what lens was used, how open the aperture is, how the light falls, whether the camera is moving, and in which direction. It does not just understand what is in the shot — it understands how the shot was filmed.
Filmmaker: the VFX artist
This model takes the description from SamildAnach and produces new or adjusted frames to match. If SamildAnach says "wide angle, low camera, warm light from the left," Filmmaker produces frames that look exactly like that.
Then SamildAnach checks the result. Did Filmmaker get it right? Almost, but the lighting is slightly off. Filmmaker adjusts. SamildAnach checks again. This back-and-forth loop continues until the result looks like the rest of the film. This is what makes the system different from typical AI tools: most have a single model that guesses. Here, two models hold each other accountable.
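The check-and-adjust loop described above has a simple shape: one model scores how closely a candidate frame matches the target shot description, the other refines the frame, and the cycle repeats until the score clears a threshold. A sketch of that loop with stand-in functions (the real SamildAnach and Filmmaker models are neural networks; the dictionaries and helpers here are illustrative only):

```python
# Hypothetical sketch of the two-model loop: "describe" stands in for the
# analysis model (SamildAnach), "refine" for the generative model
# (Filmmaker). Frames are modeled as attribute dictionaries.

TARGET = {"lens_mm": 24, "light": "warm-left", "camera_height": "low"}

def describe(frame):
    # Stand-in for the analysis model: read the frame's shot attributes.
    return frame

def score(description, target):
    # Fraction of target attributes the candidate frame already matches.
    hits = sum(description.get(k) == v for k, v in target.items())
    return hits / len(target)

def refine(frame, target):
    # Stand-in for the generative model: fix one mismatched attribute.
    for k, v in target.items():
        if frame.get(k) != v:
            return {**frame, k: v}
    return frame

def converge(frame, target, threshold=1.0, max_rounds=10):
    # Generate, check, adjust, check again — until the shot matches.
    for _ in range(max_rounds):
        if score(describe(frame), target) >= threshold:
            break
        frame = refine(frame, target)
    return frame

draft = {"lens_mm": 50, "light": "warm-left", "camera_height": "low"}
result = converge(draft, TARGET)
```

The `max_rounds` cap is a design necessity in any such loop: without it, two models that never fully agree would iterate forever.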
Filmmakers use the system through a browser-based application that speaks the language of film production, not AI jargon. It connects to After Effects and ShotFlow, tools already used in professional post-production. Directors and cinematographers work in their own vocabulary.
The deal and its strategic logic
Netflix announced the acquisition on March 5, 2026. Bloomberg later reported a total deal value of up to $600 million, structured as an upfront payment plus an earnout (extra payments tied to hitting performance targets after the acquisition). This makes it Netflix's second-largest purchase ever, behind only its approximately $700 million acquisition of the Roald Dahl Story Company.
All 16 InterPositive employees joined Netflix. Affleck took the title of Senior Advisor. The same week, his and Matt Damon's production company Artists Equity signed a separate first-look streaming deal with Netflix.
The timing was striking. Less than a week before the InterPositive announcement, Netflix had walked away from an $83 billion bid to acquire Warner Bros. Discovery, pocketing a $2.8 billion breakup fee in the process. Analysts read the pivot as a signal: instead of buying legacy media infrastructure through consolidation, Netflix would build a technology advantage.
Netflix's Co-CEO Ted Sarandos put the philosophy plainly: "There's a better business and a bigger business in making content 10% better than there is in making it 50% cheaper." The tools will be available exclusively to Netflix's creative partners and will not be sold or licensed to others. That exclusivity is a strategic choice, not an oversight. It turns production technology into a competitive moat that rivals cannot simply purchase.
David Fincher, the director known for his exacting visual control on films like Fight Club and Gone Girl, has already used InterPositive tools on an upcoming Netflix production.
Affleck's philosophy
Affleck has been unusually consistent and articulate on AI throughout the period he was secretly building InterPositive. At a CNBC summit in November 2024, he said AI-generated films were "highly unlikely" and that movies would be "one of the last things to be replaced by AI." His most memorable line: "AI can write you excellent imitative verse that sounds Elizabethan — it cannot write you Shakespeare." Craftsmanship, he argued, is knowing how to work. Art is knowing when to stop. Knowing when to stop requires taste, and taste is something AI cannot learn.
On the Joe Rogan Experience in January 2026, weeks before the deal closed, he was equally direct. He called AI creative writing output "really sh**ty" and offered a pointed critique of AI hype: "A lot of that rhetoric comes from people who are trying to justify valuations around companies, where they go, 'we are gonna change everything.' But the reason they are saying that is they need to ascribe a valuation for investment that can warrant the cap expenditure they are gonna make on these data centers."
His vision for InterPositive has three parts: a post-production power tool that handles technical drudgery so filmmakers can focus on performance and story; a way to lower barriers for independent filmmakers who lack massive budgets; and a new source of revenue through personalized fan content built on negotiated rights, potentially replacing the DVD economics that disappeared and took "15 to 20 percent out of the economy of filmmaking."
The two-tier problem
Netflix shares rose about 1.5% on announcement day. Analyst reactions were broadly positive. The creative community's response was more complicated, and reveals a tension the deal has not resolved.
Above-the-line workers in Hollywood are actors, writers, and directors: the "creative" names on a project. Below-the-line workers are the technical crew: editors, VFX artists, colorists, camera operators, set designers, and sound engineers. These are two very different groups with very different power.
Affleck is a signatory to the Creators Coalition on AI, a 500-plus-member group that includes Cate Blanchett, Natalie Portman, Aaron Sorkin, and Rian Johnson. Their stated position: "This is not a full rejection of AI. The technology is here. This is a commitment to responsible, human-centered innovation." InterPositive deliberately avoids touching actor performances, protecting the people above the line.
But wire removal, relighting, color correction, and continuity work are done by people below the line. IATSE (the International Alliance of Theatrical Stage Employees), the main union representing those technical workers, declined to comment on the acquisition. That silence was conspicuous.
Several industry commentators identified this as a two-tier AI problem: above-the-line talent loudly opposing generative AI tools that could replace writers and actors, while more quietly accepting production AI tools that could displace technical crews. The workers whose concerns have received the least public attention are the ones whose jobs overlap most directly with what InterPositive does.
SAG-AFTRA (the actors and performers union) and the WGA (Writers Guild of America) both have contracts expiring in 2026, with new negotiations already underway and AI as the central issue. How those negotiations handle below-the-line work will determine whether "filmmaker-first AI" means protecting all filmmakers or primarily the famous ones.
What this means for film
For the films you watch
Tools like InterPositive could make more films look better without the budget exploding. Visual post-production work that currently takes weeks of manual labor could be done in a fraction of the time. That does not just mean cheaper Hollywood productions. It means independent filmmakers with small budgets could access tools that were previously reserved for studios with the deepest pockets.
For the people behind the camera
Faster and cheaper also means fewer hours for the technicians who do this work today. Wire removal, color correction, relighting, and continuity work are performed by people with specialized skills. If AI does it in minutes instead of days, the question is unavoidable: what happens to the jobs? The 2026 union negotiations will provide the first answers.
For the streaming wars
Netflix now owns patented production technology that no one else has. Disney explored a licensing relationship with OpenAI (reportedly since abandoned). Amazon builds internal AI teams. Netflix has a system that improves with every film that uses it. That is a self-reinforcing competitive advantage.
The deeper significance lies in who built it. By buying technology created by an Oscar-winning filmmaker rather than a Silicon Valley startup, Netflix purchased something harder to copy than patents: legitimacy. In an industry where AI trust is close to zero, Ben Affleck's name on the product provides credibility that no amount of corporate messaging could achieve.
Whether that credibility holds as union negotiations intensify will determine whether the InterPositive model becomes Hollywood's template for AI adoption or its most contentious flashpoint.
Glossary
| Term | Definition |
|---|---|
| Dailies | Raw footage recorded each day on set, reviewed by the director and crew before the next day of filming |
| Post-production | Everything that happens after filming: editing, color grading, visual effects, and sound mixing |
| LiDAR | A laser scanning technology that measures distances with sub-centimeter precision, originally developed for surveying and autonomous vehicles |
| Stealth mode | When a startup operates secretly, making no public announcements about its product or funding |
| Earnout | An extra payment after an acquisition, triggered when the acquired company hits specific performance targets |
| Above-the-line | Actors, writers, and directors: the "creative" roles whose names appear prominently in credits and contracts |
| Below-the-line | Technical crew: editors, VFX artists, colorists, camera operators, sound engineers, and set workers |
| Wire removal | The process of digitally erasing safety wires from stunt sequences after filming |
Sources and resources
- Variety: Netflix Acquires InterPositive
- Deadline: Ben Affleck's AI company InterPositive explained
- TechCrunch: Netflix buys Ben Affleck's AI filmmaking company InterPositive
- Netflix official blog: Why InterPositive Is Joining Netflix
- The Hollywood Reporter: Creators Coalition on AI
- Stephen Follows: InterPositive patent analysis
- US Patent 12,322,036 B1 — InterPositive spatial capture system (Google Patents)