» Mon Feb 14, 2011 4:57 am
When you're making something as complex as a game, it really comes down to the engine driving the experience more than the hardware supporting it. Most developers today say they don't feel impeded by consoles in any way (the Crysis developers being the only exception so far), and I have to agree. There are plenty of games - open-world games, mind you - that look absolutely beautiful on consoles. Ever seen Just Cause 2 in action? How about the upcoming Two Worlds II? The engines driving those experiences are absolutely top-notch graphically, and they run on consoles without any issues.
Developers rarely blame the hardware. The real challenge is: how do we (the developers) optimize the engine so it can show as much graphical detail as possible without sacrificing framerate or functionality? New technologies emerge all the time that make these challenges easier to approach and solve, and cheaper in terms of processing power, so the console isn't really going to be the limiting factor. Especially considering the release date they're aiming for - if you really expected graphics that would push a console to its true limits, you'd also expect a good deal of time spent developing the engine up to that point. They've only been working on their engine for a year or two now - nowhere near enough time to push the graphical detail to a point where the console would struggle.
tl;dr: Consoles are much more powerful than most PC gamers seem to think. With modern consoles, it's down to the software, not the hardware, to solve the challenges of meeting modern graphics standards - which, according to Todd Howard, they've already done. So all this doom-and-gloom about "your" game being limited by those "old" consoles is rather childish and, frankly, wrong.