This is a multi-platform game. It needs to run smoothly on the Xbox 360 and PS3, so that sets the bar. Some people may have upgraded their PCs since Oblivion, but the Xbox 360 is no better than it was four and a half years ago.
Oblivion is just about three months shy of being five years old; it turns five the first week of March. Or did the Xbox version not arrive until June? I wouldn't know. I never wanted a console, never had one, and neither did my kids.
There will likely be some upgrades to the engine, so it will look a bit better than Fallout 3 and Oblivion, and they will also likely make it semi-scalable so that it looks better on PC than it does on the 360. But the reality is that the PC market isn't that big anymore, so they aren't going to spend a fortune doing lots of extra work just for the PC.
So yeah, it will no doubt look really good, but people posting real photographs and expecting the game to look similar are only setting themselves up for disappointment. Same goes for upgrading your PC components. My other PC still has an 8800 GTX, which is about three and a half years old, and it still runs 99% of games cranked. The only ones it struggles with are PC exclusives (ArmA 2, FSX, etc.). The fact is, the crapbox360 is holding back the progress of gaming graphics, and this game will be no different.
Don't forget, folks, Bethesda has never bothered to learn anything useful about hardware, as witnessed by the bad official requirements on all of their games, at least where graphics are concerned. Oblivion couldn't run on anything less speedy than a GeForce 6600 GT, and even that couldn't handle anything better than the small textures. The Radeon 9500s were older than the Radeon 9600s, and numbering didn't mean as much to ATI back then: instead of naming the 9600 something like a 9300, they gave it a higher number to show it was newer, regardless of whether it was slower, and it was. In other words, the FXes and the 6200 were nothing but "pie in the sky" lies before the first patch introduced the Ultra Low ("mud") image-quality option, and even that didn't help any GeForce FX less capable than the best of the 5700s.
Did they manage to improve their aim three years later? Let's see:
* DirectX 9.0c compliant video card with 256MB RAM
(NVIDIA 6800 "GS" or better / ATI X850 or better)
{My note: the X800 Pro and up also qualify, and
avoid the GeForce 6800 SE and 6800 XT}
For SMALL textures, the X800/X850 are very fast; the useful newer cards that came after them are much slower, but able to run Medium textures, which really wasn't made clear. The only reason the X700 and X800 weren't tested is that AMD's newer drivers stopped supporting any of those cards other than the X850s (although the Omega drivers work fine, and one of the X700s, the "XT", can be used as well). A minimum MEDIUM-texture card, such as the X1650, should have been named, and the bad 6800s should have been explicitly excluded.
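To put the complaint in concrete terms, here is a minimal sketch of the kind of per-texture-tier listing the official requirements should have given, built only from the cards discussed above. The tier names, the MIN_CARDS table, and the max_tier helper are my own illustration, not anything Bethesda or Obsidian ever published:

# Hypothetical per-tier minimums for Fallout 3, using only the cards
# named above -- my own illustration, NOT an official list.
from typing import Optional

MIN_CARDS = {
    "Medium": ["ATI X1650"],
    "Small":  ["ATI X800 Pro", "ATI X850", "NVIDIA 6800 GS"],
}
EXCLUDED = ["NVIDIA 6800 SE", "NVIDIA 6800 XT"]  # too cut-down to qualify

def max_tier(card):
    # type: (str) -> Optional[str]
    """Return the highest texture tier a card is listed for, or None."""
    if card in EXCLUDED:
        return None
    for tier in ("Medium", "Small"):  # check the higher tier first
        if card in MIN_CARDS[tier]:
            return tier
    return None

for card in ("ATI X850", "ATI X1650", "NVIDIA 6800 XT"):
    print(card, "->", max_tier(card))

A few lines like that table would have told buyers exactly which texture setting their card could handle, instead of leaving them to guess.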
Comparing Oblivion's requirements with Fallout 3's, the errors are far less egregious, but still wrong. And what happened next?
Ridiculousness, that's what. I won't even quote the BS that Obsidian named as official for FNV; it's just plain STUPID, is what it is.
Relatively close to the end of the Radeon X1000 generation, they finally replaced the X1600 cards, which had been embarrassingly slow compared to the GeForce 7600s. They took the X1600 XT, figured out a way to sell it at the X1600 Pro's price, and renamed it the X1650 Pro. The X1650 XT was a very slightly detuned X1800 GTO. But there were still lots of X1600 Pro chips in the warehouses. Those were run through a BIOS update so they would answer to the name "X1300 XT", and a few of them were sold that way. Most of that stock went back to the recycler, though, so the new name was hardly ever used.
Until Obsidian decided to publish it as the minimum Radeon, which really befuddled the owners of the real X1300 Pro cards, who had no idea that the REAL name was "X1600".
Stupid, STUPID.
I can't wait to see how asinine the next version of the same requirements gets!