
The Console Framerate Revolution
by Lee Mehr, posted on 16 October 2020 / 6,522 Views

Ever since the middle of the 7th generation, the all-that-is-holy topic of framerate has been one that's interested me. After dealing with nonsense from both industry talking heads and belligerent community members in the past, it's something I appreciate learning more about. And as we tip towards the next generation, with the promise of prettier visuals and (occasional) 120 fps coming to fruition, it's worth considering what this all means going forward. Are we in the midst of a framerate revolution for consoles? If so, what does that look like?
Before answering these pressing questions, it's worth delving into the past to understand our present situation. The advent of motion pictures essentially birthed the concept of individual frames per second, along with a practical lower limit that would still be fine for the viewer experience. Back then there wasn't a specific standard, since film cameras were hand-cranked devices and projectors were capable of outputting at various speeds. When the nascent ability to add sound came into frame, firm standards became necessary: over- or under-cranking would result in inconsistent sound quality. As a result, 24 fps became the standard for sound film.
That standard stuck for a number of reasons: the practicality of fitting it onto a film reel, the costs not being too radical, and research Thomas Edison had done on the topic back in the day. He found that eye strain occurs if a person sees fewer than forty-six images per second. While the chosen standard appears to fall short of Edison's suggested number, 24 fps effectively becomes 48 or 72 flashes per second, because the two-blade or three-blade shutters in movie projectors show each individual frame twice or thrice, respectively. This specific technique isn't used in lower-framerate video games, but tricks like motion blur and other workarounds fulfill the same task.
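For a rough sense of the numbers involved, here's a minimal sketch (in Python, purely illustrative and not anything from the film industry's own tooling) of how a projector's shutter multiplies the effective flash rate without adding any new frames:

```python
# Illustrative only: a projector's shutter shows each captured frame
# 'blades' times per rotation, so the eye sees more flashes per second
# than there are unique frames of film.
def effective_flash_rate(frames_per_second: int, blades: int) -> int:
    return frames_per_second * blades

print(effective_flash_rate(24, 2))  # two-blade shutter -> 48 flashes/sec
print(effective_flash_rate(24, 3))  # three-blade shutter -> 72 flashes/sec
```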
If we could be “fooled” in such a way, does that mean our highest fps threshold rests at the common 30 fps seen in most 8th-gen AAA games? I know I may be dating myself here, but you'd be surprised how typical that claim was in forum threads. Although the human eye doesn't technically see visual information as individual frames the way a computer does, that old hypothesis can be easily debunked.
The first way to dismantle those old preconceived notions relies on a study done by the US Air Force. It showed that fighter pilots could correctly identify an image flashed on-screen for 1/220 of a second (~4.5 milliseconds) under specific circumstances. For comparison: 60 fps translates to ~16.7 milliseconds per individual image. If your contention is that trained pilots aren't comparable to your average person, then let's look towards a real-world example. The Hobbit films were shot at 48 fps and netted mixed responses from critics and film-goers alike. The nascent high-framerate format, combined with an over-emphasis on CGI, proved to be something untested on an audience; even if you could handle it, or admire it (as I did), the point remains that your typical layman did notice the difference.
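To put those figures side by side, here's a quick back-of-the-envelope conversion between framerate and per-frame display time (my own sketch, not anything from the study itself):

```python
# Rough conversion from frames per second to milliseconds per frame.
def ms_per_frame(fps: float) -> float:
    return 1000.0 / fps

for fps in (220, 60, 30, 24):
    print(f"{fps:>3} fps -> {ms_per_frame(fps):.1f} ms per frame")
# 220 fps ->  4.5 ms  (the 1/220 of a second from the Air Force study)
#  60 fps -> 16.7 ms
#  30 fps -> 33.3 ms
#  24 fps -> 41.7 ms
```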
Whenever I've seen these arguments downplayed, still images tend to be the go-to for “proving” the difference between 30 and 60 fps is unnoticeable. Such a feeble comparison doesn't adequately represent what the fps difference looks like once player movement is considered; and since there are plenty of ways to demonstrate exactly that, this counter-argument has always been intellectually lazy. An easily relatable way for sports fans to understand would be to consider a high-speed camera's frame rate. Viewing images of a baseball player taking a swing, you can see that the amount of new information per frame is wholly dependent upon how fast the pictures are taken.
To take this even further, transplant this template to a gameplay video comparison running at 25% speed. Bear in mind: it's not just about visual information loading onto the screen at a quicker rate, but also about reacting to whatever new visual information, be it enemies or otherwise, comes into view at a faster rate. This is why 60 fps was heralded as the gold standard in the past. Games being based on player input raises the stakes compared to passive forms of media. Transitioning between seeing a target, getting a bead on said target, and taking a shot with double the frames of the typical 8th-gen open-world console shooter feels smoother despite the action taking equal time to accomplish; it's essentially operating on Inception logic.
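As a rough illustration of that point (a simplified model of my own, not taken from any particular engine), the worst-case window between something new happening on screen and the player's response being registered shrinks as framerate rises:

```python
# Hypothetical worst case: an event occurs just after a frame is presented,
# so it can't appear until the next frame, and the player's reaction isn't
# sampled until the frame after that. Real engines add further latency.
def worst_case_response_window_ms(fps: float) -> float:
    frame_ms = 1000.0 / fps
    return 2 * frame_ms  # one frame to show the event, one to register input

print(worst_case_response_window_ms(30))  # ~66.7 ms
print(worst_case_response_window_ms(60))  # ~33.3 ms
```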
In an ideal world, the debate over this would've been settled ages ago; as it stands, marketing can't help but demand developers & publishers utilize whatever linguistic tricks they can. Witnessing them conjure up excuses, or even validate a lower framerate as the better option, gets cringe-inducing.
There are a few examples—limited to my recent memory—that most would scratch their head over. The first was Ubisoft's damage control over Assassin's Creed: Unity running at 900p and 30 fps across both 8th-gen platforms. Suspicious console parity aside, the excuse Ubisoft used for that framerate is one that's gotten pretty old at this point. Another instance was when Ready at Dawn explained its rationale behind two key visual choices: locking at 30 fps and setting the resolution to 1920x800, the letterboxed standard when watching films in widescreen. Despite these being early 8th-gen examples, this defense is alive and well thanks to Bloober Team's new 30fps-locked horror title: The Medium.
Of all the claptrap on this topic, though, this is still my favorite of the bunch: “60 fps is really responsive and really cool. I enjoy playing games in 60 fps. But one thing that really changes is the aesthetic of the game in 60 fps. We're going for this filmic look, so one thing that we knew immediately was films run at 24 fps. We're gonna run at 30 because 24 fps does not feel good to play. So there's one concession in terms of making it aesthetically pleasing, because it just has to feel good to play.” (Dana Jan, The Order: 1886 Director).
The notion of deliberately seeking out a lower framerate because said option enhances the game's aesthetic is a bald-faced lie. In every case, it's obvious they're just doing damage control. There have been many examples in the past where that sacrifice was made for visual fidelity, but it's tough for me to remember witnessing a creator so effusively proclaim that a lower framerate could actually be seen as some kind of benefit. Considering these developers are financed by the most ubiquitous game publishers and are consistently headlined, I have a worrisome presentiment about the impact such statements might have on those less knowledgeable about the subject.
Before detailing what I hope to see in the future, it's fair to acknowledge I shouldn't be considered an arbiter on “the rules.” I'm not Supreme Pontiff of Arcanis Frameratum with the Digital Foundry team being among my closest disciples. You're reading an article from someone whose favorite game is Star Wars: Knights of the Old Republic (KOTOR) on Xbox. For those of you who also played it back then, you'll probably remember some annoying hiccups when the action became hectic. Even with my expanded gaming vocabulary today, such as knowing the difference between framerate issues and visual bugs/glitches, I can still go back to KOTOR, Ocarina of Time, the Batman: Arkham games, and more, and appreciate what else they bring to the table. I'm capable of adoring future titles that are locked at 30fps—frustrating though it'll be when that occurs.
What I'm aiming for is for the focus to come down to something simple to digest: player choice. Hell, if there's a specific niche of PC players who want to crank their graphics settings as far as a locked 30fps setting will allow them, then that's totally fine. You're free to utilize your GPU, CPU, RAM, etc. however you please, just as I've done for a bevy of PC games I've played. It's odd to consider just how constraining the average console game has been about this in the past. The PS4 version of Final Fantasy XIV: A Realm Reborn offered 30 and 60fps options, the BioShock games on 7th gen enabled an unlocked framerate, and relatively few others during the previous two gens provided any sort of fidelity/performance choice. It's true that next gen has more grease on the tracks for this becoming the norm; and yet, it's annoying to ponder why this quality many gamers are now praising was so seldom pursued before.
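To make the kind of choice I'm describing concrete, here's a hypothetical sketch of the fidelity/performance toggle more games are now offering; the mode names, resolutions, and caps below are illustrative placeholders of mine, not taken from any particular title:

```python
# Hypothetical graphics-mode menu of the sort console games increasingly expose.
# The specific resolutions and framerate caps are placeholder values.
GRAPHICS_MODES = {
    "fidelity":    {"target_fps": 30, "render_resolution": (3840, 2160)},
    "performance": {"target_fps": 60, "render_resolution": (2560, 1440)},
}

def apply_mode(name: str) -> None:
    mode = GRAPHICS_MODES[name]
    width, height = mode["render_resolution"]
    print(f"{name}: cap {mode['target_fps']} fps, render at {width}x{height}")

apply_mode("performance")  # the player, not the publisher, picks the trade-off
```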
If nothing else, the waves of change that have sluggishly rolled in recently, and now into the 9th gen, suggest that if not a “revolution” then at least a framerate evolution is happening. What I believe we'll see from it is fewer opportunities for nonsense terminology to be disseminated to the gaming populace. The perpetual cycle of creatives & consumers financially feeding each other the false reasoning of “high-definition fidelity at all costs” will effectively be severed, and the cinematic misinformation campaign along with it. Maybe developers going for broke on high-definition screenshots wasn't the best approach. Conversely, perhaps satisfying the hardcore PC crowd with framerate stats isn't the end-all for each person either. Maybe—just maybe—affording every player a goddamn choice in how they prefer to play is the way forward.
Despite being one of the newest writers on VGChartz, Lee has been a part of the community for over a decade. His gaming history spans several console generations: N64 & NES at home, while enjoying some PlayStation, SEGA, and PC titles elsewhere. Being an Independent Contractor by trade (electric, plumbing, etc.) affords him more gaming luxuries today though. Reader warning: each click given to his articles only helps to inflate his Texas-sized ego. Proceed with caution.
Since both PS5 and Series X are basically poor man's PCs all games should have graphics sliders. People should be able to play at whatever framerate and resolution they desire.
This is easily one of the best articles that I've read on vgchartz. Well done!
Well, I got over 100 fps with Duke Nukem 3D on a good PC decades ago. All hail unlocked FPS.
Great article, and there have been more than a few coming out here recently...
On the topic of 30/60 lock, while less than 'full' freedom in terms of settings, I think that has and certainly HAD plenty of rationale, in terms of any FPS not exactly lining up with display frequency causing issues somehow or another (and an entire scheme of specialized techniques existed to manage those). On consoles designed to output to consumer TVs, the display frequencies were not arbitrary, so locking the game to match them makes a lot of sense. Quite honestly the complaint is more that 30 fps was THE locked frequency, without the option for 60 fps, most often. More recently variable refresh rate displays have removed that issue, albeit there remains a certain desirability for locked FPS even if the display can adapt. Probably all the more so in competitive games where radically shifting FPS impacts the gameplay experience, so a fixed FPS is a reasonable goal. Overall I think it's fair to consider that the genre of game plays a big part in this: in a PvP game fast reactions are key, but in a PvE game that tends to be less true, particularly in certain genres.
People shouldn't fight over it, it's all about your personal experience. I have a PC and consoles; I played Ocarina on the 64 running at 20fps and it was amazing. If you ask me, for ME, the truth is ... the difference between 60 and 30 fps is a real thing. A game locked at 30fps without drops runs very well, but 60fps locked is a plus: it's better, smoother, cleaner, sharper, you can feel it easily. The same thing occurs with audio: you can enjoy stereo or even 7.1 virtualized surround, but there are some people like me that have a home theater, and for them playing games using stereo / 7.1 on headsets is a downgraded experience. We need to be careful about our posts. I get angry sometimes: why do people claim something as true when it is only a limited personal experience? To affirm something you should first have access to the greater part of that subject; it's the only way to measure your opinion properly. (For instance, I'm not allowed to affirm anything about cars, I don't have enough knowledge.)
Ocarina of Time ran at 20fps, coming from A Link to the Past, which ran at 60fps most of the time. Hopefully I don't have to play a racing game like WRC 8 again at 30fps. The motion is ugly and blurred.
Where are you getting the idea that Link to the Past is 60 frames per second? It's a 2D sprite-based game, there isn't a global frame rate. Non-static objects have independent animations.
Considering all the animation frames are visible in Link to the Past, it's safe to say none of them are running anywhere near 60 frames per second. Maybe 12. Link, for example, usually has 4 frames per motion, including his sword swing, so that's probably 2-3 swings in a second or 8-12 per second. To hit 60 frames in a second, he'd need to swing 15 times.
Just wanted to add that I think the difference between 30 and 60 is hard to notice. I definitely notice once I get up to 80 fps, and personally prefer around 100-144 fps. But the notion that 60 is miles better than 30 is just a PC fanboy's talking point IMO.
Personally I find the difference quite easy to notice, at least in comparisons. That said, it's not once or twice that I've noticed a game is running at 30 FPS without trying to think about it - it's just fairly obvious. I think the first time I noticed it in a console game was in Metal Gear Solid 3. When I noticed it, I was fairly sure it ran worse than MGS2, and indeed it does.
I think the difference between 30 and 60 is more noticeable than the difference between 60 and 120.
The 16 Bit twins both ran at 60 FPS (50 FPS on Megadrive games in Europe). It wasn't until PS1 and N64 that games started running at 30 FPS.
I don't think Starfox ran at 60 or even 30 on the SNES.
Yeah, Starfox is an exception.
It's a huge oversimplification to say that the SNES ran games at 60fps. Constant slowdown on the SNES was very, very common, with most games slowing down frequently in the moment-to-moment gameplay, especially compared to Sega Genesis games. There aren't many fast-paced games on the platform that could be said to provide any sort of consistent experience.
I think people are conflating television refresh rate with video game console frame rate. It's also not correct to say SNES games ran at 60 frames per second. Most games used 2D sprites, some of which were static elements (or just staying on one frame), but those games don't have high animation frame rates; the vast majority of the time you can see the individual animation frames without even slowing things down.