With the recent release of Avatar: Fire and Ash, the third instalment in the Avatar franchise, the debate surrounding its use of high frame rate in large portions of the movie has reignited. Well, somewhat. It remains a point of discussion among critics and film-goers, but ever since Avatar: The Way of Water, the perceived ferocity of this conversation has most definitely subsided. And the question remains: why?

The high frame rate (HFR) format was first rolled out worldwide with the release of Peter Jackson’s The Hobbit: An Unexpected Journey. It is worth noting, though, that Jackson wasn’t technically the first filmmaker to pioneer this technique in the context of moviemaking. The idea of shooting at increased frame rates of 48fps or 60fps was explored by Douglas Trumbull in the 1980s, most notably through his Showscan process, an experiment less undone by aesthetics than by the sheer impracticality of exhibition infrastructure at the time. And although higher frame rates have remained in frequent use on TV, especially in live sports, and in video games, it wasn’t until 2012 that HFR was attempted at scale in cinemas, where it met widespread disapproval from paying audiences.

The notion of doubling 24fps, which had been the cinematic standard for nearly a century, was meant to breathe freshness into movies and bring them closer to life. I remember Jackson describing his ambition to turn the cinema screen into a window through which we’d be able to observe the world of the movie, unperturbed by technological limitations. And the world rejected this idea wholesale. The Hobbit in 48fps was criticized for its “soap opera” look, and for the fact that the increased frame rate made imperfections in set design, costumes and makeup more pronounced.

Nevertheless, Peter Jackson persisted, later joined by Ang Lee, who shot Billy Lynn’s Long Halftime Walk and Gemini Man in 120fps. None of these films worked, and it was only when James Cameron deployed HFR in the long-awaited Avatar: The Way of Water that the immediate audience response no longer amounted to complete aversion. The widely reported reason HFR was not dismissed as vehemently in the Avatar sequel is that Cameron didn’t use it throughout the entire film, instead deploying it strategically in underwater sequences and action scenes. He chose carefully when and how to use the technique, predominantly to counteract the judder and smearing often encountered in 3D movies during rapid movement of actors, cameras or both.

It’s hard to argue with this explanation: it makes technical sense and also suggests that Cameron was using HFR as a sophisticated tool, in contrast to Jackson and Lee, who initially treated the format far more bluntly. However, there may be more reasons why HFR now seems a more acceptable filmmaking technique than it was thirteen years ago, when audiences balked at its use in The Hobbit. Perhaps this is an outgrowth of the simple fact that the Avatar movies do not attempt, for the most part, to depict reality as we know it, demanding instead that viewers acclimate to Space Smurfs and fully animated sequences in which mo-cap performances are composited into CGI backgrounds in the vast majority of scenes. The uncanny effect might therefore be somewhat negated, because we rarely look at the faces of real people, or real actors moving about. But we still do in some scenes, and the backlash still hasn’t come. Perhaps what has changed is not the technology itself but its perception.

It could just be that, at least to some extent, Peter Jackson and Ang Lee arrived with this invention too early for the public to adopt it. After all, the concept of reformatting the entire theatrical experience around HFR might have been too disruptive and alienating to audiences who had been culturally conditioned to interact with movies shot at 24fps from early childhood, when they watched morning cartoons on TV.

However, this may no longer hold true in 2025. What has changed in the intervening years is that for many young people, the first and most immersive experience of entertainment through a screen may no longer be a movie or a cartoon but a 3D video game. While Millennials most definitely lived through the widespread adoption of 3D graphics in gaming, as well as the introduction of some of the most popular gaming consoles, younger cohorts are simply more likely to find those entertainment modalities formative. And this is where 24fps ceases to function as a default: frame rates of 48fps, 60fps and beyond offer noticeably smoother and more responsive gaming experiences.

Consequently, folks who have effectively grown up on a steady diet of kinetic, highly polished 3D games (as opposed to what I remember from the late ’90s, when 3D-accelerated graphics ran well only on the most expensive PC rigs) are slowly beginning to outnumber older generations raised on TV and movies in the marketplace of opinion. And as we know, cultural consensus is a numbers game. It may not matter much that many filmmakers continue to voice their disdain at this quiet acquiescence to HFR, because the ranks of viewers who find it alienating are slowly thinning.

In fact, a few short years ago it became fashionable in some online corners to re-post classic cartoons like Tom & Jerry and Looney Tunes interpolated to 60fps, seemingly for no other reason than that it was fun to do. The response was binary: either wholesale rejection, coming mostly from older commentators, or unadulterated joy at the perceived smoothness of the animation. This suggests that cultural acclimation to HFR as a standard might be age-correlated. And the same people who not only didn’t object to seeing Bugs Bunny in 60fps but openly cheered it (while I, an ancient Millennial, remained aghast) are simply less likely to object to HFR in movies now, because smoothness, rather than judder, has already been established as their perceptual norm.

I would even go as far as to suggest that a movie shot using HFR as a cultural manifesto, the way Jackson and Lee initially approached the format, might stand a chance of being accepted by large swaths of moviegoing audiences, provided that its target demographics overlap with cohorts who grew up gaming with 60fps as the standard. Twenty-four frames per second, long defended as ‘cinematic,’ was never a perceptual ideal so much as a historical compromise, one that hardened into convention through repetition rather than inevitability. Now that this cultural norm has been disrupted by the ascendancy of gaming, the fact that the Avatar sequels met much softer criticism for their use of HFR could mean that other adopters of the technique will follow suit.

I don’t think a new Kelly Reichardt movie filmed with HFR cameras would butter any parsnips, but a superhero film shot in 60fps would no longer fall flat on its face due to widespread rejection of the technique. It still might if the movie in question is a creative dud, and blockbusters do struggle these days to remain authentic, relevant and entertaining all at once, but maybe a new Zack Snyder movie or a DC superhero gig would have a shot at proving that 24fps has always been a culturally self-propelled convention: one that persisted only because everyone assumed it was the only natural way to go. That HFR might finally become accepted has less to do with James Cameron being a visionary spearheading the movement than with the fact that audiences have simply changed.

