Actually, I'd add a correction to the above in this thread: on the Xbox 360, Oblivion ran at 576p (1024x576, which is why a lot of the earlier screenshots are at that resolution).
Very few AAA games on the 360 actually ran at 720p. Instead, the game "upscaled" from a lower, semi-HD resolution to reach 720p. This includes even Halo 3, which runs at 640p (1152x640), and Halo: Reach, which renders at 1152x720 rather than the full 1280x720. The "upscaling" doesn't add any detail: it just takes the image and stretches it up, using some filtering to keep it from looking blocky and pixelated, the way really old games look when stretched. (it can look BLURRY if you upscale too aggressively, but going even from 576 to 1080 won't hurt it much)
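To make the "doesn't add any detail" part concrete, here's a rough Python sketch of what a plain stretch does (purely an illustration: the consoles do this in dedicated hardware, not software, and the resolutions are just the ones mentioned above):

# Upscaling only re-samples pixels that already exist; it never invents detail.

def pixel_count(w, h):
    return w * h

for name, (w, h) in [("1152x640 (Halo 3)", (1152, 640)),
                     ("1280x720", (1280, 720)),
                     ("1920x1080", (1920, 1080))]:
    print(f"{name:>18}: {pixel_count(w, h):,} pixels per frame")

def nearest_upscale(src, dst_w, dst_h):
    """Stretch a 2D list of pixel values to dst_w x dst_h with no filtering.
    Every output pixel is a straight copy of some source pixel, so the result
    holds exactly the same information as the original, just spread out."""
    src_h, src_w = len(src), len(src[0])
    return [[src[y * src_h // dst_h][x * src_w // dst_w] for x in range(dst_w)]
            for y in range(dst_h)]

tiny = [[0, 255], [255, 0]]          # a 2x2 "image"
print(nearest_upscale(tiny, 4, 4))   # 4x4 now, but still only two distinct values

The filtering the consoles apply just softens the blockiness you'd otherwise get from copies like that (more on that at the end of the post).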
The reason for this? The console has a bit of a limit on memory and pixel throughput once you start stacking on modern high-end shaders, AND have to render the frame in multiple tiles to achieve the (mandated by Microsoft) anti-aliasing. To the best of my knowledge, most PS3 games DO manage full 720p: the console is very close to the 360 in graphical capability, but it cannot do HDR and AA at the same time, and the flip side is that dropping one of them frees up power that can go toward boosting the resolution a little. In almost all games on the PS3, they pick HDR and have no AA: this is why, with a good TV, you can notice jaggies in the PS3 version that aren't there in the 360 version. It's not a hands-down advantage for either console: you basically get your pick of "AA or higher resolution." (one exception is Final Fantasy XIII, which has AA on the PS3, but doesn't have HDR at all)
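To put rough numbers on that memory limit: the 360 renders into 10 MB of eDRAM, and if the colour and depth buffers (with all their MSAA samples) don't fit, the frame has to be split into tiles and rendered in passes, which costs performance. A quick back-of-the-envelope, assuming 4 bytes of colour plus 4 bytes of depth per sample (real games vary in their exact formats):

# Does a render target fit in the Xbox 360's 10 MB of eDRAM?
EDRAM_BYTES = 10 * 1024 * 1024

def framebuffer_bytes(w, h, msaa=1, bytes_per_sample=4 + 4):
    # assumed layout: 32-bit colour + 32-bit depth per sample
    return w * h * msaa * bytes_per_sample

for label, w, h, msaa in [("1280x720, no AA",  1280, 720, 1),
                          ("1280x720, 2xMSAA", 1280, 720, 2),
                          ("1024x600, 2xMSAA", 1024, 600, 2),
                          ("1152x640, no AA",  1152, 640, 1)]:
    size = framebuffer_bytes(w, h, msaa)
    verdict = "fits" if size <= EDRAM_BYTES else "needs tiling"
    print(f"{label:18} {size / 2**20:5.2f} MB -> {verdict}")

720p with 2xMSAA blows well past the 10 MB, while a slightly lower resolution with the same AA squeezes in, which is exactly the kind of trade-off that produces those semi-HD numbers.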
If Skyrim tries to enforce a higher resolution, then the detail levels will suffer as a result: the consoles aren't magical black boxes that always hit whatever level you ask of them. They CAN lag, and they CAN fail to reach certain targets... they just have fixed settings, carefully picked by the developers to get the best balance at all times. So bumping the resolution means, say, cutting draw distance or lighting detail to compensate. The gamer in the end won't see the internal battle and negotiation that took place to make sure they're always getting 30fps, but it happens all the same.
Final Fantasy XIII is 30fps at 720p, something some graphics enthusiasts refuse to believe when it's pointed out to them (because they can't accept that anything less than 60fps at 1080p can look that good); I've literally had someone in a reply refuse to believe it when I told them. They get those visuals by not squandering the power on extra pixels per frame and extra frames per second.
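Just to put numbers on "extra pixels per frame and extra frames per second" (plain arithmetic, nothing console-specific):

# Pixels the hardware has to shade every second at different targets.
def pixels_per_second(w, h, fps):
    return w * h * fps

ffxiii = pixels_per_second(1280, 720, 30)    # FF XIII's target: 720p at 30fps
ideal  = pixels_per_second(1920, 1080, 60)   # the enthusiast ideal: 1080p at 60fps

print(f"720p/30:  {ffxiii:,} pixels per second")
print(f"1080p/60: {ideal:,} pixels per second")
print(f"ratio:    {ideal / ffxiii:.1f}x the work for the same effort per pixel")

That's 4.5x the pixel throughput, so on fixed hardware you'd have less than a quarter of the budget left to spend on each individual pixel.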
Well, for one, FF XIII doesn't use HDR, as I noted above. Careful examination of videos of the game reveals that the lighting model is static, which saves a LOT of processing power at the expense of realism. Instead it relies more on frequent scene changes and pre-scripted tweaks to things like bloom and saturation filters: post-processing done on the already-rendered image (basically Photoshop-type effects) rather than actually adjusting the lighting model while it renders. The result is that it takes a lot less power: 6th-generation consoles like the PS2 used those effects heavily. (think Shadow of the Colossus and Metal Gear Solid 3)
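For what it's worth, this is roughly what such a pass looks like in spirit: it only ever touches the finished 2D frame, never the 3D scene (a tiny Python sketch; a real engine does this on the GPU in a pixel shader):

# A post-process pass works on the already-rendered frame, pixel by pixel,
# like a Photoshop adjustment. No lighting gets recomputed.

def adjust_saturation(frame, amount):
    """frame: 2D list of (r, g, b) tuples; amount 0.0 = greyscale, 1.0 = unchanged."""
    out = []
    for row in frame:
        new_row = []
        for r, g, b in row:
            grey = 0.299 * r + 0.587 * g + 0.114 * b   # perceived brightness
            new_row.append(tuple(int(grey + (c - grey) * amount) for c in (r, g, b)))
        out.append(new_row)
    return out

# Two "rendered" pixels, washed out the way a scripted cutscene filter might do it:
print(adjust_saturation([[(200, 80, 40), (30, 120, 220)]], 0.4))

The cost is a couple of multiplies per pixel, which is nothing compared to re-running the lighting for the whole scene.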
And of course, FF XIII also relies a lot on simpler, more enclosed scenes: thanks to the less freeform nature of FF games, they can minimize the number of extra objects on screen that suck away computing power. Even more important, it means the programmers always know WHAT will be on screen: no players dragging monsters or items over from a different area. This lets them design things to run very close to the limit, rather than leaving a "buffer zone" of memory/power to absorb unexpected things the player might do.
It might have been mentioned already, but upscaling is often used in a marketing context it doesn't belong in at all. Sure, the PS3 can "upscale" to 1080p, but that doesn't mean much of anything. Your 1080p TV upscales just as well on its own; it has to in order to display non-native resolutions.
Actually, it can have a point in a console, just like how a PS3 is a better Blu-ray player than stand-alone ones. You see, the specifications for upscaling and HD playback are vague: as long as it puts the right number of pixels on the screen, all's considered good. Many TVs will then cheap out on filtering and include a very weak upscaling chip: sometimes it doesn't filter at all and just stretches, like Microsoft Paint did back in Windows XP. Sure, you get a bigger picture, but you also get blocky, jaggy artifacts.
Both the PS3 and the 360 include dedicated circuitry that handles upscaling and upscaling alone, and it applies bilinear filtering when it does so. This means you ALWAYS get filtering, which beats leaving it up to the TV in case your TV cheaped out. Of course, even better would be the option to select filtering types other than bilinear: 2xSaI, for instance, works VERY well for cartoon (especially anime) styles, and can dramatically improve visual quality: almost as much as if the original had been drawn at double resolution to begin with. But even so, bilinear beats plain stretching without any filtering/interpolation at all.
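And to show what that filtering actually does: bilinear blends the four nearest source pixels, weighted by how close the output pixel lands to each of them (another rough Python sketch; the consoles do this in fixed-function hardware, and real scalers can use fancier kernels):

# Bilinear upscaling: each output pixel is a weighted average of the four
# nearest source pixels, so stretched edges ramp smoothly instead of
# turning into blocks the way a plain nearest-neighbour stretch does.

def bilinear_upscale(src, dst_w, dst_h):
    src_h, src_w = len(src), len(src[0])
    out = []
    for dy in range(dst_h):
        sy = dy * (src_h - 1) / max(dst_h - 1, 1)
        y0 = int(sy)
        fy = sy - y0
        y1 = min(y0 + 1, src_h - 1)
        row = []
        for dx in range(dst_w):
            sx = dx * (src_w - 1) / max(dst_w - 1, 1)
            x0 = int(sx)
            fx = sx - x0
            x1 = min(x0 + 1, src_w - 1)
            top    = src[y0][x0] * (1 - fx) + src[y0][x1] * fx
            bottom = src[y1][x0] * (1 - fx) + src[y1][x1] * fx
            row.append(round(top * (1 - fy) + bottom * fy))
        out.append(row)
    return out

# A hard black-to-white edge, stretched to 8 pixels wide:
print(bilinear_upscale([[0, 255]], 8, 1))

On that edge, a plain stretch would give you a block of 0s jammed against a block of 255s; bilinear gives a smooth ramp instead, which is why it looks less blocky but can read as slightly blurry.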