In-depth analysis of the Skyrim graphics engine

Post » Fri Apr 08, 2011 4:10 am

There were 3.4 GHz P4s in February 2004. Relatively recently... trololololol.


Uhmm... that's true. I wonder why the first 3.2 GHz chips of the i-series only arrived by Nov '08...
User avatar
lilmissparty
 
Posts: 3469
Joined: Sun Jul 23, 2006 7:51 pm

Post » Fri Apr 08, 2011 4:16 am

The easiest way to resolve this is to look at games that appear on both console and PC hardware. In almost every case, a mid-range PC can play the game at a higher resolution than the console for a given frame-rate. And bear in mind that this is probably a conservative test, since most AAA titles are optimised more heavily for consoles than for PC.

I believe that this is mostly due to the fact that mid-tier PC GPUs are significantly more advanced than the GPUs in consoles, although it's been a while since I've delved into this.

Hmmm... when I think of it that way, my PC performs better than my PS3, and I built it in 2006; I now think of it as mid-range.
So I guess you're right; you made a good point there.
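The resolution comparison above is easy to put numbers on. A quick sketch, assuming the common case for this generation: a console title rendering at 720p while a mid-range PC targets 1080p (both figures are assumed typical targets, not claims about any specific game):

```python
# Pixel-count comparison: matching the console's frame-rate at a higher
# resolution implies roughly this much more fill-rate/shading throughput.
console_pixels = 1280 * 720    # assumed typical console render target (720p)
pc_pixels = 1920 * 1080        # assumed mid-range PC target (1080p)

ratio = pc_pixels / console_pixels
print(ratio)  # 2.25 -- the PC pushes 2.25x the pixels per frame
```

So even before better textures and shaders, "same game, same frame-rate, higher resolution" already means more than double the raw pixel work.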
User avatar
Wayne W
 
Posts: 3482
Joined: Sun Jun 17, 2007 5:49 am

Post » Thu Apr 07, 2011 6:12 pm

The trend is towards a LOT of cores. Seriously, I really don't know how much more thread-level parallelism we can expect programmers, and game devs in particular, to pull out. I'd say more than 8 cores won't give you much of a difference for game/everyday-use loads (hell, everyday loads are satisfied with 2 cores). There's also a strong trend towards GPU/CPU unification (APUs, heterogeneous computing).
Unless we find a new material to carry us further, x86 sequential performance will eventually stagnate (we can't keep die-shrinking forever). Anyway, that's way off topic (I'm seeing a lot of computer engineering students/professionals here...).
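The diminishing-returns point about core counts can be sketched with Amdahl's law. The 70% parallel fraction below is just an assumed figure for a game-like workload, not a measurement:

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the fraction
# of the workload that parallelises and n is the core count.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.70  # assumed parallel fraction (illustrative only)
for n in (2, 4, 8, 16, 32):
    print(n, round(speedup(p, n), 2))
# The jump from 8 to 32 cores buys little: the serial 30% dominates, and
# speedup can never exceed 1 / (1 - p) ~ 3.33x no matter how many cores.
```

With these numbers, 8 cores already gets you ~2.6x, and quadrupling to 32 cores only nudges that toward the 3.33x ceiling — which is the "more than 8 cores won't give you much" intuition.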

As for Skyrim, I don't expect to see Crysis-esque graphics. Better than Fallout, but nothing otherworldly. I wish the PC version had DX11 goodies, though (even though I don't have, and don't plan to buy, DX11 hardware in the near future :confused: )
User avatar
Marie
 
Posts: 3405
Joined: Thu Jun 29, 2006 12:05 am

Post » Fri Apr 08, 2011 2:34 am

What, a dozen footstep textures with a lifetime you count in seconds, and a particle effect? Wow, impressive /sarcasm
Cell is old stuff; getting fancy effects to run on it is due not "to the power of the Cell" but to the skill of the developer. Give credit where credit is due, and none of that credit goes to the hardware. It was mid-end six years ago.

Cell was mid-end six years ago? I beg to differ. Sony touts it as the most powerful processor ever in a gaming console, and the PS3 hasn't even been on shelves for six years.
User avatar
Katie Samuel
 
Posts: 3384
Joined: Tue Oct 10, 2006 5:20 am

Post » Thu Apr 07, 2011 11:43 pm

Consoles are apparently very capable of making graphics like this:

http://www.youtube.com/watch?v=MEMxSUGZ6TU

http://www.youtube.com/watch?v=1Kvl31g77Z8
http://www.youtube.com/watch?v=-D9oINHI11E

Using the "it's a multi-platform game" line is a bad excuse when you're looking at what other games can achieve on consoles.
If Skyrim has bad graphics (we have to wait until we see a real video), it's "only" because of lack of money, lack of knowledge, or lack of will.

Probably not lack of money, since Bethesda's games have been really successful.
Probably not lack of knowledge, since Bethesda has been able to afford to hire some of the best.
Most likely lack of will, meaning they'd rather spend their resources elsewhere.

Gameplay is very important, but so are graphics. I'd say it's 50/50; equally important. Gameplay (as broad as it is) satisfies the mind, while graphics satisfy the eyes? :)


I just want to clarify that I agree with this 100%. I think Red Dead Redemption has some of the best graphics I've ever seen in a game, and is definitely the standard to beat. From the screenshots I've seen of Skyrim, I believe that it may be on track to do just this (the mountain screenshots being the most compelling). I'm also looking forward to seeing Crysis 2 in action on the consoles.

I think the current dominance of consoles is a positive development for gaming in general. The success of Oblivion and FO3 is largely built on console sales and has enabled BGS to become one of the premier game studios; fixed hardware allows developers to build more efficient graphics implementations (look how much better RDR is compared to GTA4); and it means I don't have to upgrade my PC every 2 years to keep up.
User avatar
Mr. Ray
 
Posts: 3459
Joined: Sun Jul 29, 2007 8:08 am

Post » Thu Apr 07, 2011 9:37 pm

I just want to clarify that I agree with this 100%. I think Red Dead Redemption has some of the best graphics I've ever seen in a game, and is definitely the standard to beat. From the screenshots I've seen of Skyrim, I believe that it may be on track to do just this (the mountain screenshots being the most compelling). I'm also looking forward to seeing Crysis 2 in action on the consoles.

Rage looks very good for a console game too.
User avatar
Tanya
 
Posts: 3358
Joined: Fri Feb 16, 2007 6:01 am

Post » Thu Apr 07, 2011 11:55 pm

Cell was mid-end six years ago? I beg to differ. Sony touts it as the most powerful processor ever in a gaming console, and the PS3 hasn't even been on shelves for six years.

Oh, well, if Sony's marketing department says that Sony's gaming machine is more powerful than Sony's competition, who am I to disagree with Sony? :P
And no, the PS3 didn't launch 6 years ago; it launched in late 2006. But factor in design and production, and 6 years is a reasonable figure to cover both the 360 and PS3 without having to discuss each separately.

I'm not saying the processor isn't impressive in its own way, I'm saying that price, power, and cooling restrictions mean that no console will ever use cutting edge hardware, because cutting edge hardware tends to be expensive, hungry, and hot. Consider that the 360 already has overheating issues, and then imagine twice the heat output in it. Yeah, that's not going to happen.
User avatar
Andrew Tarango
 
Posts: 3454
Joined: Wed Oct 17, 2007 10:07 am

Post » Fri Apr 08, 2011 1:55 am

Rage looks very good for a console game too.

Yeah, Rage does look pretty good, although I'm going to go out on a limb here and say that I think that what I've seen of Skyrim so far is more impressive. (This is doubly impressive when you consider that Skyrim, like RDR, is completely dynamic, whereas Rage is a lot more static.)
User avatar
Pants
 
Posts: 3440
Joined: Tue Jun 27, 2006 4:34 am

Post » Fri Apr 08, 2011 7:16 am

I don't know why everybody is hung up on frequency vs. architecture. Console games aren't CPU-bound when it comes to "looking good". But just look at the Core i7 920 (2.66GHz) vs. the Phenom II X4 970 (3.5GHz), for example. The Intel outperforms the AMD in http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/15 despite a huge frequency deficit.

Plain and simple it's a RAM issue combined with GPUs from FIVE generations ago.

GTX 5xx > GTX 4xx > GTX 2xx > 9xxxGTX > 8xxxGTX > 7xxxGTX (PS3 Equivalent, 7800GTX specifically)
HD 6xxx > HD 5xxx > HD 4xxx > HD 3xxx > HD 2xxx > X1xxx (Xbox 360 Equivalent, X1900 specifically)
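The frequency-vs-architecture point boils down to throughput being roughly IPC × clock, so a lower-clocked chip with better per-clock efficiency can win. A toy sketch — the IPC figures below are assumed purely for illustration, not benchmark data:

```python
# Effective throughput ~ instructions-per-clock (IPC) x clock rate.
def gips(ipc, clock_ghz):
    """Billions of instructions per second (simplified, illustrative model)."""
    return ipc * clock_ghz

# Assumed, illustrative IPC values -- NOT measured figures.
i7_920 = gips(ipc=1.4, clock_ghz=2.66)     # higher-IPC architecture
phenom_970 = gips(ipc=1.0, clock_ghz=3.5)  # higher clock, lower IPC

print(i7_920 > phenom_970)  # True: better IPC beats a 0.84 GHz clock deficit
```

Which is exactly why comparing console and PC CPUs on GHz alone tells you very little.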
User avatar
Caroline flitcroft
 
Posts: 3412
Joined: Sat Nov 25, 2006 7:05 am

Post » Thu Apr 07, 2011 6:37 pm

I don't know why everybody is hung up on frequency vs. architecture. Console games aren't CPU-bound when it comes to "looking good". But just look at the Core i7 920 (2.66GHz) vs. the Phenom II X4 970 (3.5GHz), for example. The Intel outperforms the AMD in http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/15 despite a huge frequency deficit.

Plain and simple it's a RAM issue combined with GPUs from FIVE generations ago.

GTX 5xx > GTX 4xx > GTX 2xx > 9xxxGTX > 8xxxGTX > 7xxxGTX (PS3 Equivalent, 7800GTX specifically)
HD 6xxx > HD 5xxx > HD 4xxx > HD 3xxx > HD 2xxx > X1xxx (Xbox 360 Equivalent, X1900 specifically)


I haven't really followed this CPU discussion (or whatever it has been) over the last few pages, but from what I recall:
Nowadays games aim to use as much of the GPU's power and memory as possible, right?
User avatar
Andy durkan
 
Posts: 3459
Joined: Fri Aug 03, 2007 3:05 pm

Post » Fri Apr 08, 2011 12:57 am

I'm enjoying the discussion, although I think we're in danger of straying a little off topic, so let's all agree (or not?) that current PCs are significantly more powerful than consoles, but that new games can still look very good on the consoles due to the time developers have had to optimise for the hardware.

Moving back to the original topic, I'd be very interested to hear people's opinions on what graphics features appear to be used in Skyrim, based on the screenshots. Also, I think it always helps this sort of discussion to include current games as a comparison.

For instance, do you think that based on the screenshots Skyrim is using a deferred lighting solution?
User avatar
Angus Poole
 
Posts: 3594
Joined: Fri Aug 03, 2007 9:04 pm

Post » Fri Apr 08, 2011 4:54 am

I don't know why everybody is hung up on frequency vs. architecture. Console games aren't CPU-bound when it comes to "looking good". But just look at the Core i7 920 (2.66GHz) vs. the Phenom II X4 970 (3.5GHz), for example. The Intel outperforms the AMD in http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/15 despite a huge frequency deficit.

Plain and simple it's a RAM issue combined with GPUs from FIVE generations ago.

GTX 5xx > GTX 4xx > GTX 2xx > 9xxxGTX > 8xxxGTX > 7xxxGTX (PS3 Equivalent, 7800GTX specifically)
HD 6xxx > HD 5xxx > HD 4xxx > HD 3xxx > HD 2xxx > X1xxx (Xbox 360 Equivalent, X1900 specifically)


Basically I agree with that. Nice GPU scheme, though :)
User avatar
Eire Charlotta
 
Posts: 3394
Joined: Thu Nov 09, 2006 6:00 pm

Post » Thu Apr 07, 2011 5:27 pm

Oh, well, if Sony's marketing department says that Sony's gaming machine is more powerful than Sony's competition, who am I to disagree with Sony? :P
And no, the PS3 didn't launch 6 years ago; it launched in late 2006. But factor in design and production, and 6 years is a reasonable figure to cover both the 360 and PS3 without having to discuss each separately.

I'm not saying the processor isn't impressive in its own way, I'm saying that price, power, and cooling restrictions mean that no console will ever use cutting edge hardware, because cutting edge hardware tends to be expensive, hungry, and hot. Consider that the 360 already has overheating issues, and then imagine twice the heat output in it. Yeah, that's not going to happen.

Well, what exactly is Cell good at, then? It's better than Xenon, for sure. I've heard many people claim Cell is a beast that shreds through whatever you throw at it.

@hlvr: developers have been using Cell more recently to handle a lot of the rendering work and tricks. Naughty Dog said they used Cell quite heavily in Uncharted for graphics, not just the GPU.
User avatar
Hot
 
Posts: 3433
Joined: Sat Dec 01, 2007 6:22 pm

Post » Thu Apr 07, 2011 7:16 pm

Apologies if this has been pointed out somewhere obvious, but have BGS stated that Skyrim will look identical on PC, PS3, and Xbox 360?

Or, for all we yet know, might BGS adopt Crytek's strategy and allow PC users to scale up the graphics relative to console users?
User avatar
Crystal Birch
 
Posts: 3416
Joined: Sat Mar 03, 2007 3:34 pm

Post » Thu Apr 07, 2011 9:18 pm

For instance, do you think that based on the screenshots Skyrim is using a deferred lighting solution?


I hope not.

Deferred rendering = all-encompassing term
Deferred lighting = kind of deferred rendering
Deferred shading = kind of deferred rendering

But, I'll admit, deferred lighting has fallen so far out of favor that nobody acknowledges it ever existed. :P

It's far too early to tell based on static screenshots. I hope they do use deferred shading, though, at least for interiors' sake. Deferred shading would be kind of useless outdoors, as there are usually very few lights, and deferred shading is also completely impossible to implement for alpha transparency (foliage, etc.). So what you have to do is still run a forward rendering pass just for anything transparent, and then make sure to limit lights in that forward pass so as not to kill performance.

I'm not positive, but I think they could technically switch between forward and deferred rendering at load screens, say, between interior and exterior cells. You can switch between rendering paths in a lot of the tech demos I have compiled and run. Intel has a particularly good demo, where they use a DX11 Compute Shader to speed up deferred shading considerably. It's usually called "tile-based" deferred shading.
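The "tile-based" idea mentioned above can be sketched on the CPU: split the screen into tiles and, for each tile, build the list of lights that can affect it, so each pixel only evaluates its tile's lights. A toy version with made-up screen-space light bounds (real tiled deferred shading does this per 16×16 tile in a compute shader, against depth bounds from the G-buffer):

```python
# Sketch of tile-based light culling: for each screen tile, collect the
# indices of lights whose screen-space bounding box overlaps the tile.
TILE = 16  # tile size in pixels

def cull_lights(width, height, lights):
    """lights: list of (cx, cy, radius) circles in screen space (illustrative)."""
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    tile_lights = [[[] for _ in range(tiles_x)] for _ in range(tiles_y)]
    for idx, (cx, cy, r) in enumerate(lights):
        # Range of tiles touched by the light's bounding box, clamped to screen
        x0 = max(0, (cx - r) // TILE)
        x1 = min(tiles_x - 1, (cx + r) // TILE)
        y0 = max(0, (cy - r) // TILE)
        y1 = min(tiles_y - 1, (cy + r) // TILE)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                tile_lights[ty][tx].append(idx)
    return tile_lights

# Two small lights in opposite corners of a 64x64 screen (4x4 tiles)
tiles = cull_lights(64, 64, [(8, 8, 4), (60, 60, 4)])
print(len(tiles[0][0]), len(tiles[1][1]))  # prints "1 0"
```

The win is that a tile covering empty sky shades zero lights, while forward rendering would pay for every light (or need per-object light lists).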
User avatar
Trevi
 
Posts: 3404
Joined: Fri Apr 06, 2007 8:26 pm

Post » Thu Apr 07, 2011 6:52 pm

Wait, will snow fall through buildings like it did before, or not? If it did, that would suck. BUT I STILL CAN'T WAIT!!!

That wouldn't be very "dynamic", haha
User avatar
Tinkerbells
 
Posts: 3432
Joined: Sat Jun 24, 2006 10:22 pm

Post » Fri Apr 08, 2011 8:18 am

I hope not.

Deferred rendering = all-encompassing term
Deferred lighting = kind of deferred rendering
Deferred shading = kind of deferred rendering

But, I'll admit, deferred lighting has fallen so far out of favor that nobody acknowledges it ever existed. :P

It's far too early to tell based on static screenshots. I hope they do use deferred shading, though, at least for interiors' sake. Deferred shading would be kind of useless outdoors, as there are usually very few lights, and deferred shading is also completely impossible to implement for alpha transparency (foliage, etc.). So what you have to do is still run a forward rendering pass just for anything transparent, and then make sure to limit lights in that forward pass so as not to kill performance.

I'm not positive, but I think they could technically switch between forward and deferred rendering at load screens, say, between interior and exterior cells. You can switch between rendering paths in a lot of the tech demos I have compiled and run. Intel has a particularly good demo, where they use a DX11 Compute Shader to speed up deferred shading considerably. It's usually called "tile-based" deferred shading.


I think I probably meant deferred shading (although I did see something about Crysis 2 having deferred lighting). I'm slightly out of my depth here, but from what I gather Stalker, Crysis, GTA4/RDR and a number of other engines use a deferred rendering approach. As you say, one of the disadvantages is with alpha transparency. I think this was the reason that the foliage in Crysis was not very smooth around the edges. Given that the foliage in Skyrim looks pretty smooth, can we rule out deferred shading outdoors?
User avatar
maria Dwyer
 
Posts: 3422
Joined: Sat Jan 27, 2007 11:24 am

Post » Thu Apr 07, 2011 11:33 pm

I think I probably meant deferred shading (although I did see something about Crysis 2 having deferred lighting). I'm slightly out of my depth here, but from what I gather Stalker, Crysis, GTA4/RDR and a number of other engines use a deferred rendering approach. As you say, one of the disadvantages is with alpha transparency. I think this was the reason that the foliage in Crysis was not very smooth around the edges. Given that the foliage in Skyrim looks pretty smooth, can we rule out deferred shading outdoors?


No, because at that point hardware/software implementations didn't allow MSAA on the G-buffer (where all the view-space data needed to render the scene is accumulated), so it would depend on whether these screenshots were from a PC or a console.

The jagginess of the foliage in Crysis WAS technically because of deferred shading, but not for the reason you're implying: it was the no-MSAA reason, not the forward-pass compositing. However, at least on PC, I believe CryEngine 3 now supports MSAA with deferred shading.

Honestly, I don't care about MSAA myself; if I can get by with MLAA or NFAA like I do in Oblivion (thanks to OBGE), then I'd rather have the power saved go toward rendering more objects on screen, etc. MSAA can be quite expensive. MLAA/NFAA, in case you didn't know, are post-effects that use edge detection. Crysis did have an edge-detect AA; it should have helped out with foliage some.
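In case it helps, the edge-detect AA idea behind the MLAA/NFAA family can be sketched in a few lines: find strong luma discontinuities, then blend across them. The 0.1 threshold and the simple neighbour average below are illustrative choices, not what any shipping implementation does:

```python
# Toy edge-detection post-process AA: detect edges from luma differences
# between horizontal neighbours, then blend the edge pixel with its neighbour.
def luma(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def edge_aa(image, threshold=0.1):
    """image: 2D list of (r, g, b) floats in [0, 1]. Returns a blended copy."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w - 1):
            # Edge test against the right-hand neighbour
            if abs(luma(image[y][x]) - luma(image[y][x + 1])) > threshold:
                out[y][x] = tuple(
                    0.5 * (a + b) for a, b in zip(image[y][x], image[y][x + 1])
                )
    return out

# A hard black/white edge gets softened toward grey
img = [[(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)]]
print(edge_aa(img)[0][0])  # (0.5, 0.5, 0.5)
```

Because it's pure post-processing on the final image, it works with deferred shading (and on foliage) where MSAA historically didn't, at the cost of some blurring of genuine detail.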

And, honestly, at the resolution we're viewing these Skyrim shots we can't even tell if there is AA or not. And we especially don't know since we don't know what platform the shots are from.
User avatar
Ross Zombie
 
Posts: 3328
Joined: Wed Jul 11, 2007 5:40 pm

Post » Fri Apr 08, 2011 5:37 am

Name one exclusive PC game that has these "amazing" graphics you all keep talking about! (Not Crysis.)
User avatar
Sarah Evason
 
Posts: 3507
Joined: Mon Nov 13, 2006 10:47 pm

Post » Fri Apr 08, 2011 5:09 am

Name one exclusive PC game that has these "amazing" graphics you all keep talking about! (Not Crysis.)

Metro 2033 is beautiful, and STALKER is too (in its own way; it's an example of a game that wouldn't come out on console but thrives on PC).
Starcraft 2 looks great, but Blizzard aim for their games to run on your toaster as well as on a decent PC. Along the same RTS lines, Supreme Commander looks pretty awesome, and Sins of a Solar Empire gives an incredible sense of scale.
There are also genres that just don't /go/ on console, such as space simulations like X3, which can look beautiful and drive a PC to its limit.

Normally when people talk about PC graphics, though, they mean in all titles, not just exclusives. Just Cause 2, for example, looks leaps and bounds better than its console counterparts. Almost every game can look better, just through being able to run at a higher resolution with anti-aliasing, but the vast majority of games can also run with higher quality models and textures, with better shaders.
User avatar
Vera Maslar
 
Posts: 3468
Joined: Wed Sep 27, 2006 2:32 pm

Post » Fri Apr 08, 2011 1:47 am

Metro 2033 is beautiful, and STALKER is too (in its own way; it's an example of a game that wouldn't come out on console but thrives on PC).
Starcraft 2 looks great, but Blizzard aim for their games to run on your toaster as well as on a decent PC. Along the same RTS lines, Supreme Commander looks pretty awesome, and Sins of a Solar Empire gives an incredible sense of scale.
There are also genres that just don't /go/ on console, such as space simulations like X3, which can look beautiful and drive a PC to its limit.

Normally when people talk about PC graphics, though, they mean in all titles, not just exclusives. Just Cause 2, for example, looks leaps and bounds better than its console counterparts. Almost every game can look better, just through being able to run at a higher resolution with anti-aliasing, but the vast majority of games can also run with higher quality models and textures, with better shaders.


Metro 2033, StarCraft 2, and Supreme Commander are available for Xbox 360. Sins of a Solar Empire? Hahaha, that was pathetic. I mean, I don't want to start a war here, but if that's the best PC has to offer, I think I'm gonna stick to my lovely 360.
User avatar
SEXY QUEEN
 
Posts: 3417
Joined: Mon Aug 13, 2007 7:54 pm

Post » Fri Apr 08, 2011 7:29 am

Metro 2033, StarCraft 2, and Supreme Commander are available for Xbox 360. Sins of a Solar Empire? Hahaha, that was pathetic. I mean, I don't want to start a war here, but if that's the best PC has to offer, I think I'm gonna stick to my lovely 360.


Oh, Metro 2033 is on 360. The other two aren't, but regardless: there are very few 360 "exclusives" that are not also on PC, looking better and running better. Your preferences are fine by me; the toy you choose has no effect on me. Perhaps a little less ignorance would do you good, however :)
User avatar
Josee Leach
 
Posts: 3371
Joined: Tue Dec 26, 2006 10:50 pm

Post » Fri Apr 08, 2011 9:16 am

Metro 2033, StarCraft 2, and Supreme Commander are available for Xbox 360. Sins of a Solar Empire? Hahaha, that was pathetic. I mean, I don't want to start a war here, but if that's the best PC has to offer, I think I'm gonna stick to my lovely 360.

The Xenon in your 360 can't hold a candle to the Cell in the PS3. Just sayin'.
User avatar
Phoenix Draven
 
Posts: 3443
Joined: Thu Jun 29, 2006 3:50 am

Post » Fri Apr 08, 2011 2:24 am

The Xenon in your 360 can't hold a candle to the Cell in the PS3. Just sayin'.

ok
User avatar
JERMAINE VIDAURRI
 
Posts: 3382
Joined: Tue Dec 04, 2007 9:06 am

Post » Fri Apr 08, 2011 7:14 am

I think people in this thread are moving away from the actual discussion and maybe starting a flame war about computer/game specs other than Skyrim, plus a PS3 vs. 360 comparison. Skyrim has looked amazing to me so far; it looks like they have completely overhauled the whole look, and we'll have to wait for some gameplay videos to see the full extent :)
User avatar
Kay O'Hara
 
Posts: 3366
Joined: Sun Jan 14, 2007 8:04 pm
