Are Consoles Holding PC Gaming Back? AMD Thinks So.
by VGChartz Staff, posted on 18 March 2011

Hardware manufacturer AMD has suggested that bypassing the software interface programs that mediate between games and the hardware - in other words, APIs like DirectX - would be the best way to unlock the full potential of current PC hardware. Like some others, AMD believes that PC hardware is being held back by the prioritization of console development.
In a recent interview with bit-tech.net, AMD's GPU worldwide developer relations manager Richard Huddy had quite a bit to say about the matter.
"It's funny," says Huddy. "We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good - DirectX is getting in the way." Huddy says that one of the most common requests he gets from game developers is: "Make the API go away."
"...being able to have total control over the machine, which is what the very best games developers want. By giving you access to the hardware at the very low level, you give games developers a chance to innovate, and that's going to put pressure on Microsoft – no doubt at all," Huddy added.
The advantage of using standard 3D APIs like DirectX is that your game will run on a wide range of hardware and that you'll get easy access to the latest shader technologies. Huddy suggests that developers are gaining one thing while losing another.
"Wrapping it up in a software layer gives you safety and security," says Huddy, "but it unfortunately tends to rob you of quite a lot of the performance, and most importantly it robs you of the opportunity to innovate."
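The overhead Huddy describes can be sketched in miniature: every call routed through a generic, validating API layer pays for dispatch and safety checks that a direct hardware path avoids. A toy C++ illustration - all names here are invented for the sketch, not real DirectX types:

```cpp
// Toy model of the trade-off Huddy describes: an abstract, validating
// API layer versus a bare "direct-to-metal" call. Names are hypothetical.

struct GraphicsAPI {                  // the DirectX-like abstraction
    virtual ~GraphicsAPI() = default;
    virtual int draw(int vertexCount) = 0;
};

struct ValidatingAPI : GraphicsAPI {  // every call pays for safety bookkeeping
    int drawCalls = 0;
    int draw(int vertexCount) override {
        if (vertexCount <= 0) return -1;  // validation the layer insists on
        ++drawCalls;                      // state tracking the layer insists on
        return vertexCount;               // pretend the work was submitted
    }
};

// The "direct-to-metal" path: no virtual dispatch, no checks --
// faster, but nothing stops a developer from misusing it.
inline int draw_direct(int vertexCount) { return vertexCount; }
```

The safety and the cost come from the same place: the layer's checks and bookkeeping are exactly what makes it both slower and harder to break.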
Huddy says that, thanks to APIs, many PC games have the same look and feel these days. Programming "direct-to-metal" isn't for every games developer, though, according to Introversion's lead designer and developer Chris Delay.
"I don't want anything to do with that, but presumably it depends on what you're developing. If you're making Crysis 3 or something like that, then it may be exactly what you want."
In any case, for game developers looking to create cutting-edge graphics, DirectX may no longer be part of the equation.
@ssj12 - Nope, didn't know that! I got most of my info from the article and what it implied. I did some outside research so that I could write it, but I don't natively know much about game programming.
Programming the HW directly would be a nightmare, considering how many different ones there are, but between the lines he suggests that the problem is just DirectX, now that MS's interest in gaming is focused on the XB360, or on making PC/360 multiplats easy to develop even at the cost of holding back the PC version. A better HW abstraction layer that isn't tied to a single big market player already exists: OpenGL. MS doesn't like it because it has only the same influence over it as its competitors, rather than exclusive control.
@ser - When a PC loads a full-screen game, the OS is shifted into background mode, which reduces resource use - like the XMB and Dashboard on consoles.
@Xelestial - You do know that the only console using DX is the 360 right?
This is why I support OpenGL: it's just as powerful, but it's developer-driven and can have extensions added to improve performance however the dev wants.
I agree with dropping DirectX... devs should put everything on the disc, including their own way to get at the HW. There aren't that many GPU makers, so it shouldn't be that hard. If it weren't for the need to install to the HDD, games could boot and run without the need for any OS...
@Hephaestos- You have to draw conclusions from the article...The main reason developers use DirectX on the PC is for compatibility with consoles. The source itself goes more in depth on the topic.
My personal opinion is that yeah, maybe that's true but eventually consoles will catch up. I don't think it means consoles or PC gaming is dying.
OK, I don't see the link with consoles; it's all about DirectX...
Consoles are holding PC gaming back... and so is the fact that moving one step forward means production costs two or three times higher. Most video game developers are still not prepared for that.
I believe that innovation is made by design and tech, not just tech. Most games these days are just poorly designed rip-offs.
If PC gaming really is going to die, then console gaming will follow it within a few years. If there are no PC games, then why would gamers spend their money upgrading their systems every two years? PC component manufacturers will face a decline in hardware sales, and soon the game industry will crash like what happened with Atari.
If only 7th-gen consoles had launched at a price of less than $350, we might have been able to move to the next gen next year; sadly, they didn't launch in that low a price range. :(
That's right
Examples: Dragon Age II, Mass Effect 2, Crysis 2...
Speaking as a developer, I agree with him. APIs like DirectX and programming languages like ActionScript can make development easier, safer, and faster, but they simultaneously rob you of your ability to really optimize and control.
They give you great tools which are easy to use but take away some things. Think of it as safety scissors vs. normal scissors. Safety scissors will prevent you from cutting yourself and will cut through paper. But if you wanted to use the scissors to cut through other materials or do something more creative which the safety scissors won't allow, you're out of luck.
Adding more layers to allow for higher-level abstractions isn't a bad thing, but limiting interaction with hardware only through those layers is. It all comes down to control. It would be great if you could get as much control as you want: high-level stuff if you don't care that much, and low-level if you want to optimize your loops, use part of the GPU's memory to hold game information, or whatever else you want to do.
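The layered design this commenter asks for can be sketched as an API that offers both paths: a checked high-level interface by default, plus a raw escape hatch for those who want full control. A minimal C++ sketch, with all names invented for illustration:

```cpp
#include <array>
#include <cstddef>

// Hypothetical two-tier API: safe by default, raw on request.
class VertexBuffer {
    std::array<float, 8> data_{};
public:
    // High-level path: bounds-checked, cannot corrupt memory.
    bool set(std::size_t i, float v) {
        if (i >= data_.size()) return false;
        data_[i] = v;
        return true;
    }
    // Low-level path: raw pointer -- the caller takes full responsibility.
    float*      raw()        { return data_.data(); }
    std::size_t size() const { return data_.size(); }
};
```

The point is that neither path has to exclude the other: the safety layer stays available for developers who want it, while the raw path is there for those who need the last bit of control.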
The reason consoles can compete graphically so well with PCs while being FAR cheaper is that a game on a console is optimized for its hardware. That's the truth of it.
This guy is such a tit it's not even funny. What an ignorant tool. Games would take 10 years to develop following his idea of coding "bare to metal". It's not 1995, mate; these days hundreds of people collaborate on a game and they need a common platform. What does he want, to go back to coding in assembly or something? Lmao, absolutely ridiculous.
What a load of utter rubbish.
It's true that consoles are holding PC gaming back now; it started a few years ago and has got steadily worse. This has nothing to do with DirectX, though. DirectX is a PC API, FFS, that originated nearly 15 years ago, and it actually encouraged game development on the PC.
Going back to proprietary APIs will bring back the bad old days of 3dfx/PowerVR, where lots of games wouldn't render or even run correctly depending on what graphics card you had in your computer.
The real reason for consoles holding PC gaming back is simply down to MONEY!!!! Nearly all developers are owned by publishers these days and make games on contract, not the other way round, where a group of developers would start a game project and then get a publisher. There are many times more people with games consoles than with decent PC gaming setups, so there's hardly any money to be made, relatively speaking. That is the only reason; it's inevitable and just the way it is. PC gaming will be dead in a few years. Shame, as I just splashed out on a 2x Radeon 6970 CrossFire setup, lol.
@linlhutz- The developer probably cited that as that is a big game that's about to come out :)
@Jereel - I do agree; the average consumer would be hurt the most. There needs to be a compromise somehow, but perhaps this is not the answer.
Well, devs get more and it benefits us all.
Bad idea, and to be honest, it is probably the engines that drive developers down the samey route, not the lack of access to physical hardware via an API.
What he wants is ridiculously bad for the consumer. This only works if you have VERY specific chipsets. Gone will be the days when upgrading to a fancy new video card makes a game run better. Heck, if the game wasn't updated by the developer, it might not run at all. Anyone remember the old DOS days? You had to change autoexec.bat and config.sys settings and reboot to get games to work with certain configs. We are past that because software layers make everything play nice. Is there overhead? Yes. Could they do more by accessing the hardware directly? Yes. But sooo few would benefit. This would increase dev time and cost, since things would have to be individually coded for different GPU chipsets. And aren't devs already complaining about the massive budgets of current video games? Do they want to double their effort yet again to make prettier games?
I wonder why Crysis was mentioned in this article....hummmm