The Console Framerate Revolution
Lee Mehr, posted on 16 October 2020
Ever since the middle of the 7th generation, the all-that-is-holy framerate topic has been one that’s interested me. After dealing with nonsense from both industry talking heads and belligerent community members in the past, it’s something I appreciate learning more about. And as we tip towards next generation and the promise of prettier visuals and 120 fps (on occasion) comes to fruition, it’s worth considering what this all means going forward. Are we in the midst of a framerate revolution for consoles? If so, what does that look like?
Before answering these pressing questions, it's worth delving into the past to understand our present situation. The advent of motion pictures essentially birthed the concept of frames per second, along with the search for a practical minimum that would still look fine to the viewer. Back then there wasn't a specific standard, since film cameras were hand-cranked devices and projectors were capable of outputting at various speeds. When the nascent ability to add sound came into frame, firm standards became necessary: over- or under-cranking would result in inconsistent sound quality, and as a result 24 fps became the standard for sound film.
That standard was chosen for a number of reasons: the practicality of fitting it onto a film reel, the costs not being too radical, and research Thomas Edison had done on the topic back in the day. He found that eye strain occurs if a person sees fewer than forty-six images in a second. While the chosen standard appears to fall below Edison's suggested number, 24 fps actually means 48 or 72 flashes per second, because the two-blade or three-blade shutters in movie projectors repeat each individual frame twice or thrice respectively. This specific technique isn't seen in lower-framerate video games, but tricks like motion blur and other workarounds fulfill the same task.
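The shutter arithmetic above can be sketched in a few lines of Python (the function name is my own illustration, not any real projector API):

```python
def flash_rate(fps: float, shutter_blades: int) -> float:
    """Flashes of light per second the viewer actually sees:
    a multi-blade shutter shows each film frame once per blade pass."""
    return fps * shutter_blades

print(flash_rate(24, 2))  # two-blade shutter: 48 flashes/s
print(flash_rate(24, 3))  # three-blade shutter: 72 flashes/s, above Edison's 46
```

Either shutter design clears Edison's forty-six-image threshold while the camera and projector still only handle 24 distinct frames per second.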
If we can be “fooled” in such a way, does that mean our highest fps threshold rests at the common 30 fps seen in most 8th-gen AAA games? I know I may be dating myself here, but you'd be surprised how typical that claim was in forum threads. Although the human eye doesn't technically see visual information as individual frames the way a computer does, that old hypothesis can be easily debunked.
The first way to dismantle those old preconceived notions relies on a study done by the US Air Force. It showed that fighter pilots could correctly identify an image flashed on-screen for 1/220 of a second (~4.5 milliseconds) under specific circumstances. For comparison: 60 fps translates to ~16.7 milliseconds per individual image. If your contention is that trained pilots aren't comparable to your average person, then let's look towards a real-world example. The Hobbit films were shot at 48 fps and netted mixed responses from critics and film-goers alike. The nascent rendering technology, combined with an over-emphasis on CGI, proved to be something untested on an audience; even if you could handle it, or admire it (as I did), the point remains that your typical layman did notice the difference.
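To make the milliseconds comparison concrete, here is a minimal sketch (the helper name is hypothetical) converting a framerate into how long each image stays on screen:

```python
def frame_time_ms(fps: float) -> float:
    """How many milliseconds a single image is displayed at a given framerate."""
    return 1000.0 / fps

print(round(frame_time_ms(220), 2))  # ~4.55 ms: the Air Force study's flash duration
print(round(frame_time_ms(60), 1))   # ~16.7 ms per frame at 60 fps
print(round(frame_time_ms(30), 1))   # ~33.3 ms per frame at 30 fps
```

In other words, the pilots were parsing images shown for roughly a quarter of a 60 fps frame, so 30 fps is nowhere near any perceptual ceiling.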
Whenever I've seen these arguments downplayed, still images tend to be the go-to for "proving" that the difference between 30 and 60 fps is unnoticeable. Such a feeble comparison doesn't adequately represent what the fps difference looks like once player movement is considered; and since there are plenty of ways to demonstrate exactly that, this counter-argument has always been intellectually lazy. An easy way for sports fans to understand is to consider a high-speed camera's shutter speed. Viewing images of a baseball player taking a swing, you can see that the amount of new information per frame is wholly dependent upon how fast the pictures are taken.
To take this even further, transplant this template to watching a gameplay comparison video running at 25% speed. Bear in mind: it's not just about visual information loading onto the screen at a quicker rate, but also about reacting to whatever new visual information, be it enemies or otherwise, comes into view at a faster rate. This is why that heralded gold standard of 60 fps was established in the past. Games being based on player input raises the stakes compared to passive forms of media. The transition between seeing a target, getting a bead on said target, and taking a shot feels smoother with double the frames of the typical 8th-gen open-world console shooter, despite the action taking equal time to accomplish; it's essentially operating on Inception logic.
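One way to see why the input side matters: in the worst case, an event that occurs just after a frame is drawn must wait a full frame interval before it can even appear on screen. A simplified sketch under that assumption (ignoring engine, input, and display latency, which stack on top):

```python
def worst_case_display_delay_ms(fps: float) -> float:
    """Longest possible wait before new in-game information reaches the
    screen: one full frame interval (engine/display latency excluded)."""
    return 1000.0 / fps

print(round(worst_case_display_delay_ms(30), 1))  # up to ~33.3 ms at 30 fps
print(round(worst_case_display_delay_ms(60), 1))  # up to ~16.7 ms at 60 fps
```

Doubling the framerate halves that worst-case wait, which is why 60 fps feels more responsive even when the on-screen action takes the same real-world time.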
In an ideal world, this debate would've been over ages ago; as it stands, marketing can't help but demand developers & publishers utilize whatever linguistic tricks they can. Witnessing them conjure up excuses for, or even validate, a lower framerate as the better option gets cringe-inducing.
There are a few examples—limited to my recent memory—that most would scratch their heads over. The first is Ubisoft's damage control over Assassin's Creed: Unity running at 900p and 30 fps across both 8th-gen platforms. Suspicious console parity aside, the excuse Ubisoft used for that framerate is one that's gotten pretty old at this point. Another instance was when Ready at Dawn explained its rationale behind two key visual choices: locking at 30 fps and setting the resolution to 1920×800, a 2.40:1 aspect ratio mirroring the visual standard when watching films in widescreen. Despite these being early 8th-gen examples, this defense is alive and well thanks to Bloober Team's new 30fps-locked horror title: The Medium.
Of all the claptrap on this topic, though, this is still my favorite of the bunch: “60 fps is really responsive and really cool. I enjoy playing games in 60 fps. But one thing that really changes is the aesthetic of the game in 60 fps. We're going for this filmic look, so one thing that we knew immediately was films run at 24 fps. We're gonna run at 30 because 24 fps does not feel good to play. So there's one concession in terms of making it aesthetically pleasing, because it just has to feel good to play.” (Dana Jan, The Order: 1886 Director).
The notion of deliberately seeking out a lower framerate because said option enhances the game's aesthetic is a bald-faced lie. In all cases, it's obvious they're just doing damage control. There have been many examples in the past in which that sacrifice was made for visual fidelity, but it's tough for me to remember witnessing a creator so effusively proclaim that a lower framerate could actually be seen as some kind of benefit. Considering these developers are financed by the most ubiquitous game publishers and are consistently headlined, I have a worrisome presentiment about the impact such statements might have on those less knowledgeable about the subject.
Before detailing what I hope to see in the future, it's fair to acknowledge I shouldn't be considered an arbiter on "the rules." I'm not Supreme Pontiff of Arcanis Frameratum with the Digital Foundry team among my closest disciples. You're reading an article from someone whose favorite game is Star Wars: Knights of the Old Republic (KOTOR) on Xbox. For those of you who also played it back then, you'll probably remember some annoying hiccups when the action became hectic. Even with my expanded gaming vocabulary today, such as knowing the difference between framerate issues and visual bugs/glitches, I can still go back to KOTOR, Ocarina of Time, the Batman: Arkham games, and more while appreciating what else they bring to the table. I'm capable of adoring future titles that are locked at 30fps—frustrating though it'll be when that occurs.
What I'm aiming for comes down to something simple to digest: player choice. Hell, if there's a specific niche of PC players who want to crank their graphics settings as far as a locked 30fps will allow, then that's totally fine. You are free to utilize your GPU, CPU, RAM, etc. however you please, just as I've done for a bevy of PC games I've played. It's odd to consider just how constraining the average console game has been about this in the past. The PS4 version of Final Fantasy XIV: A Realm Reborn offered 30 and 60fps options, the BioShock games on 7th gen enabled an unlocked framerate, and relatively few others during the previous two gens offered any sort of fidelity/performance dichotomy. It's true that next gen has more grease on the tracks for this becoming the norm; and yet, it's annoying to ponder why this quality many gamers are praising was so seldom pursued before.
If nothing else, the waves of change that have sluggishly arrived recently, and now into the 9th gen, suggest that if not a "revolution" then at least a framerate evolution is happening. What I believe we'll see from it is fewer opportunities for nonsense terminology to be disseminated to the gaming populace. The perpetual cycle of creatives & consumers financially feeding each other the false reasoning of "high-definition fidelity at all other costs" will effectively be severed, and the cinematic misinformation campaign along with it. Maybe developers going for broke on high-definition screenshots wasn't the best approach. Conversely, perhaps satisfying the hardcore PC crowd with framerate stats isn't the end-all for each person either. Maybe—just maybe—affording every player a goddamn choice on how they prefer to play is the way forward.
Despite being one of the newest writers on VGChartz, Lee has been a part of the community for over a decade. His gaming history spans several console generations: N64 & NES at home, while enjoying some PlayStation, SEGA, and PC titles elsewhere. Being an independent contractor by trade (electric, plumbing, etc.) affords him more gaming luxuries today, though. Reader warning: each click given to his articles only helps to inflate his Texas-sized ego. Proceed with caution.