30 Frames Per Sigh: Choosing Glam over Game
There was an idea, a hope, really, that something big would change this generation. Sub-HD resolutions and 30-dropping-to-20 framerates would be a thing of the past. 1080p and 60 frames per second (fps) have become a sort of rallying cry for the current generation of gaming. It's the gold standard for what players expect, and for a brief six months it looked like that's what we would get. We were wrong.
You will have difficulty finding a single AAA PlayStation 4 or Xbox One game in the pipeline built for that 1080p, 60 fps standard. More worryingly, you'll have a lot of trouble finding games that shoot for 60 frames per second at all. What was only recently an ideal design decision has quickly fallen out of development favor. Indeed, headlining titles for the systems are almost universally 30 frames per second, and many drop lower. Titanfall, Sunset Overdrive, DriveClub, The Order: 1886, Destiny, The Witcher 3, and Watch Dogs are all coming in at 30 frames per second. Titanfall is technically unlocked and averages in the 40s, but regularly drops below even 30.
Titanfall's choice to prioritize graphical fidelity over better framerate is a decision made at the expense of gamers.
Now I know many of you will immediately complain that I'm nitpicking about something which doesn't matter and that 30 fps is completely acceptable. It is completely acceptable (most of the time; we'll get to the exceptions later), but it's also almost universally worse. In any game in which you control an object or character in motion and timing has an effect on your performance, 60 fps is far more valuable than a 1080p resolution or better lighting and shaders. Framerate has a direct effect on your ability to control a game as accurately as possible. It simply feels much better to play a game at 60 frames than at 30.
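The math behind that claim is simple frame-time arithmetic: at 30 fps each frame lingers on screen twice as long as at 60 fps, which is the floor on how quickly your input can be reflected back to you. A minimal back-of-envelope sketch (the frame_time_ms helper is an illustrative name of my own, not from any real engine):

```python
# Back-of-envelope frame-time arithmetic: how long a single rendered
# frame persists on screen at common framerate targets. That duration
# is also the worst-case delay before an input can show up on screen.
def frame_time_ms(fps: float) -> float:
    """Duration of one frame in milliseconds at the given framerate."""
    return 1000.0 / fps

for fps in (20, 30, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

Roughly 33 ms per frame at 30 fps versus under 17 ms at 60 fps: halving the frame time is exactly why a 60 fps game feels more responsive under your thumbs, whatever the resolution.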
Perhaps unintuitively, a higher framerate also provides a large benefit to a game's visual presentation, even if the game isn't particularly dependent on precision control. Because video games lack the natural motion blur of film, lower framerates are more noticeable. Film frames are exposed over time, so every frame blurs into the next, creating a smoother picture at lower framerates (24 fps in the case of most films). While motion blur can be, and often is, artificially added to games, the effect is not the same. A 60 fps game is simply more visually pleasing, and for most people a more noticeable visual bump than a higher resolution.
"So Nick", you might ask, "if higher framerate is better for both gameplay and visuals, why wouldn't all games shoot for framerate over resolution and visual fidelity"? Well, clever reader, there are a couple answers to that, but the most obvious one is that it's a lot easier to advertise higher visual fidelity than higher framerate. 60 fps won't come across on the 30 fps locked YouTube, and it certainly won't come across in those screenshots posted on IGN and Game Informer.
DriveClub is one of the few major racing games to not shoot for 60 fps.
The second reason is slightly more complicated. As we progress further into a generation, there's a general expectation of constantly improving graphics. To acquiesce to this fairly ridiculous demand from gamers, companies make continuous sacrifices in performance for the sake of visual effects elsewhere. By the end of the previous generation, dubbed by many the HD generation, most games weren't HD at all. They were running at sub-720p (and often sub-600p) resolutions. Framerates were almost universally locked at 30, and again, drops below that were common in big games (looking at you, Grand Theft Auto V).
I'd like to say this is a small problem, but I don't really believe it is. There are certain places where 60 fps is almost a requirement. It's particularly important in three genres: shooters, racing games, and fighting games, where any loss of frames reduces the precision those genres demand. We can already see the framerate dropping in shooters with Titanfall, The Order: 1886, and Killzone: Shadow Fall (Killzone gets something of a pass for sacrificing resolution for 60 fps in multiplayer). Racing seems to be the next sacrifice, with Evolution Studios' DriveClub opting for 1080p and 30 fps rather than trading resolution for framerate. Fighters may be next on the list.
Nintendo's Mario Kart 8 may not be the prettiest game in still screenshots, but 60 fps makes it gorgeous in motion.
I'm not suggesting all games will make this same choice going forward. Surprisingly, Nintendo is a stalwart champion of 60 frames per second, delivering it in a majority of its Wii U and 3DS titles, including Mario Kart 8, Super Smash Bros. for Wii U and 3DS, The Legend of Zelda: A Link Between Worlds, and even Super Mario 3D World. Nintendo shoots for 1080p, but when a choice has to be made, it has regularly opted for a lower resolution and a higher framerate.
Sadly, the status quo here is unlikely to change. As the generation progresses we'll see fewer games choose framerate over pretty screenshots, and we partly have ourselves to blame. Our demand for constantly improving graphics has led to less fun, less responsive, and less immersive experiences for the end user. Regardless, be aware of the choices these companies are making when they design these games. Understand that it's almost always a marketing decision, rather than a choice based on the best experience for players. In the end, at least in this instance, it would be nice if more PS4 and Xbox One games were a little more like Wii U games.
Editor's Note: This is an editorial piece and does not necessarily represent the views of gamrReview or its other writers.