Developer Claims Xbox Series S Will Struggle Later Into the Generation - News
William D'Angelo, posted on 01 April 2023
Caverns of Mars: Recharged game designer and producer Tadas Migauskas, speaking with GamingBolt, claims the lower power of the Xbox Series S compared to the Xbox Series X and PlayStation 5 means it will struggle later in the generation.
He thinks the Xbox Series S is unlikely to consistently run games at 1440p and 60 frames per second as time goes on and the technical demands of games continue to grow.
"If we take the (flawed) metric of FLOPS and compare Xbox Series X to Xbox Series S, you get a 3x difference in GPU computation power," said Migauskas. "Most of the current games use deferred pipelines, so rendered pixel count can translate to computation complexity pretty directly.
"Now, if we take the expected resolution targets for both consoles, we get a difference of 2.25x. Add that to lower available memory size, and it gets pretty hard to keep up."
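Migauskas's ratios can be checked directly from the published spec-sheet figures (roughly 12.15 TFLOPS for the Series X GPU and 4 TFLOPS for the Series S) and the expected render targets of 4K versus 1440p. A quick arithmetic sketch:

```python
# Ratio check for Migauskas's claim, using public spec-sheet figures:
# Series X GPU ~12.15 TFLOPS, Series S GPU ~4 TFLOPS.
series_x_tflops = 12.15
series_s_tflops = 4.0
compute_ratio = series_x_tflops / series_s_tflops

# Expected render targets: 4K (3840x2160) for Series X,
# 1440p (2560x1440) for Series S.
pixels_4k = 3840 * 2160
pixels_1440p = 2560 * 1440
pixel_ratio = pixels_4k / pixels_1440p

print(f"compute ratio: {compute_ratio:.2f}x")  # ~3.04x
print(f"pixel ratio:   {pixel_ratio:.2f}x")    # 2.25x
```

So the Series S has roughly a third of the compute but is only asked to render 1/2.25 of the pixels, which is the gap Migauskas argues will widen as per-pixel work grows.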
Migauskas also discussed the power of the graphics cards in the PlayStation 5 and Xbox Series X consoles.
He said the power difference between the two, which is fairly small, won't matter much for another year or two, when newer rendering approaches are developed that push more work through the consoles' GPUs.
"You can see a general trend in games for consoles of previous generations – it takes a few years and a launch title or two for developers to realize and utilize the hardware capabilities," said Migauskas at the time.
"At the moment, the impact on development is still quite low. A ~15% increase is significant, but both GPUs are so powerful, most developers probably won’t bat an eye. You can do a bit more on one of them. And since most of the games launch on both of the consoles, you have to work with the lowest common denominator.
"In a year or two, when newer rendering approaches are developed, more rendering work will be expected to be pushed through the GPU pipelines. Then, it’s likely developers with more resources, mainly AAA companies using custom engines, will try to adhere to platform differences increasingly more."
A life-long and avid gamer, William D'Angelo was first introduced to VGChartz in 2007. After years of supporting the site, he was brought on in 2010 as a junior analyst, working his way up to lead analyst in 2012 and taking over the hardware estimates in 2017. He has expanded his involvement in the gaming community by producing content on his own YouTube channel and Twitch channel. You can contact the author on Twitter @TrunksWD.
I don't think anyone expects it to be hitting 1440p 60 fps in the mid-late gen, we already see many games running at less than 1440p. We also won't see PS5 and Series X running many games at 4k 60 fps in the mid-late gen. Pretty safe bet that Series S will be around 1080p on most mid-gen games, falling to 900p or even closer to 720p, but FSR upscaled to 1080p or 1440p, by late gen. Series X and PS5 meanwhile will be more like 1800p on most games in the mid-gen, falling to more like 1440p or even lower, but then FSR upscaled to 4K, in the late gen.
This, pretty much. The Series S is not meant to be a beast, so devs should not be expecting beast-like performance. It's a budget-friendly system meant to deliver every feature of the new gen at lower image fidelity, and that is exactly what I expect it to do for the entire gen.
Beast is a relative term; both the Series X and PS5 are beasts in the console space, even if they aren't when compared to PCs. Struggle is also relative to the goal you intend to achieve. Both the Series X and PS5 were pitched for 4K 60fps, and if you take that figure as the goal, then they both struggle in 95%+ of cases.
You can say a Series S title running at 900p is struggling; I can say a Series S game running at 900p with a stable framerate shows less struggle than the same title at 1440p on Series X and PS5 without a stable framerate.
Honestly, I don't think many Xbox Series S users would care if their console ran at 900p and used reconstruction... It's a low-cost digital box, not a high-end gaming system.
Obviously, when you only have an 8-core Zen 2 @ 3.4-3.6GHz, 10GB of RAM @ 224GB/s, and a Radeon 6500 XT-class GPU, you aren't going to be pushing high-end visuals, framerates, and resolutions... And that is okay; not everyone cares about that stuff, they care about the gameplay and story.
When the Series S launched, I think many of us predicted that over time we would see cutbacks to resolution and framerate to hit performance targets anyway... Even though it had a few releases that ran at native 4K, it's just happening a little quicker than many predicted.
And despite Cerny stating that 8 teraflops is needed for 4K, the PlayStation 5 and Xbox Series X will also frequently see sub-native-4K titles; the more technology you push, the more hardware you simply need to run it...
Consoles are a game of compromises and normally one of the first aspects of console games to receive cutbacks is resolution and/or framerates... When that starts to occur, the onus is on Sony and Microsoft to release updated hardware in my opinion for us hardware snobs.
Pretty much, yes. I had a thought and am curious what your take on it is. It appears to me that currently the biggest factor for the performance target and the stability of the fps has as much to do with the choice of engine, like Unity, UE5, and the many in-house ones, as anything else. I'm thinking development is heading toward consolidation on that front, where many in-house engines will simply be abandoned in favor of more stable, feature-rich ones like UE5. Wo Long: Fallen Dynasty uses an in-house engine and is a mess performance-wise across all platforms. I would rather they spend their time optimizing the game directly than split their effort between building an engine and developing a game, which in most cases results in an engine with fewer features that is less versatile anyway.
Has anyone done any proper research into how much of the GPU load is cut when going from 1080p to 4K? Since every game going forward is going to be made to run with 4K as an option, I'm pretty sure any future release that runs at 4K can run on the smaller GPU at 1080p at minimum.
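As a rough first-order answer: in pixel-bound renderers (like the deferred pipelines mentioned in the article), the scaling factor is simply the pixel count, which is straightforward to compute:

```python
# Pixel-count ratio between common render resolutions; a crude
# first-order proxy for GPU load in pixel-bound (deferred) renderers.
def pixels(width: int, height: int) -> int:
    return width * height

ratio_4k_to_1080p = pixels(3840, 2160) / pixels(1920, 1080)
print(ratio_4k_to_1080p)  # 4.0 -- a 4K frame is exactly four 1080p frames
```

By that naive measure, a game tuned for 4K on a ~12 TFLOPS GPU would need roughly 3 TFLOPS at 1080p, within the Series S's ~4 TFLOPS, though this ignores costs that don't scale with resolution, such as geometry processing and CPU-side work.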
Also, how many of these devs have started designing games based only on the new tech in these current systems? There is a learning curve. Even Cerny said you can't go by teraflops anymore when comparing to last gen. The way the GPUs have evolved, along with the SSDs, I/O, and other chip advancements, it's a massive overhaul. But people see a tiny white box and think it's not powerful. With new system software and engine upgrades still needed to take advantage of the new architecture, I doubt anyone has begun to reach the potential of what's possible.
I was making fun of Cerny's statement, mostly because of the debates here on this forum where people took Cerny's statements as gospel, ignoring any kind of reason or logic despite my presenting a counter-argument.
Hate to say that I was right again though.
And I have always said that flops is bullshit.
GPUs tend to have an "efficiency" curve. That is, any given mix of compute/bandwidth/caching is optimal for a specific resolution.
For example, around 150-250GB/s of bandwidth is perfect for 1080p gaming, but at 4K? Then you get a steep performance penalty, because you simply don't have the fillrate to manage that resolution.
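The intuition here is that framebuffer traffic scales linearly with pixel count, so a bandwidth budget comfortable at 1080p shrinks fourfold in headroom at 4K. A back-of-envelope sketch (the bytes-per-pixel and overdraw figures below are illustrative assumptions, not measured numbers):

```python
# Back-of-envelope framebuffer traffic estimate. The per-pixel byte
# count and overdraw factor are illustrative assumptions only.
def gbytes_per_sec(width: int, height: int,
                   bytes_per_pixel: int, overdraw: int, fps: int) -> float:
    return width * height * bytes_per_pixel * overdraw * fps / 1e9

# Assume a deferred G-buffer of ~16 bytes/pixel and 3x overdraw/passes.
print(gbytes_per_sec(1920, 1080, 16, 3, 60))  # ~5.97 GB/s at 1080p60
print(gbytes_per_sec(3840, 2160, 16, 3, 60))  # ~23.89 GB/s at 4K60
```

Real traffic is far higher once texture sampling and intermediate render targets are included, but the 4x scaling between the two resolutions is the point: the same bandwidth budget leaves much less slack at 4K.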
When did Cerny say that about flops? I thought he was the one who said you can't compare this gen's GPUs and flops to last gen's because of the architecture. It would be like "comparing apples to oranges," he said.
And we know from experience that we can get native 4k with less than 8 teraflops... But can also fall short of 4k even with 16 teraflops.
Because flops is bullshit.
1) It just needs to run the games.
2) The percentage of games IT MAY struggle to run is very small.
3) Developers did amazing ports on Switch with an even greater power disparity.
They fail to realise that as games become more demanding, developers also become more efficient over time. The XB1 and XB1X alongside the PS4 and Pro were fine, and so will the Series S be.
Have you seen how many games struggled on the OG PS4/Xbox One at the end of their life, let alone cross-platform games? This will only get worse as the hardware ages. Also, the PS4 and Xbox One were the target platforms, with games then optimised for the Pro and One X. This gen, the PS5/Series X is the target platform and games will get 'optimised' for the Series S. It's usually better to target the weakest platform rather than the stronger ones if you want games to run well on all of them.
The Series S is designed to cut corners and play games at lower settings. People underestimate the Series S hardware. The cut from 4K to 1080p is a massive performance saver.
Everyone is loving RE4 Remake, one of the best-looking games on the market, and it's also on PS4.
Many PS5 and Series X titles already don't hit 4K, though, and I can't see that improving as games only get more taxing. The Series S will be quite alright for quite a while, but in the last two years of its gen it will struggle to hit 900p/30fps, whereas the PS5 will probably have difficulty maintaining a 1440p/60fps target.
4K is the industry standard and is what the majority aim for. There might be some exceptions; however, anything not hitting 4K reflects on the devs.
The Series S is no different to a current low-end PC. How far developers push next-gen hardware determines how much the Series S has to sacrifice.
The Series S can and will run any next-gen game; it's just a question of how well. People forget the XSS ran the Matrix demo, one of the most demanding showcases we have seen so far.
Probably. It's likely to be a comparable situation to the base Xbox One and the Xbox One X in the previous gen.
Yeah, pretty much. This story's a nothingburger. Yes, the Series S will increasingly struggle to hit 1440p@60 as the generation wears on. But the Series X and PS5 will increasingly struggle to hit 4K as levels of detail, AI, RT, etc., ramp up too. So this story incorrectly frames this as a Series S issue, when it's really a gaming-wide issue.