Video Card Ranking List

Post » Thu May 26, 2011 8:42 pm

Nottheking, for when you get back, I'd say you're very safe to put the 4870 ahead of the GTX 260 and 3870X2, but behind the GTX 280. The 4850 could fit either right before or right after the 260/X2. It's definitely in the Egad! category.
Naughty not Nice
 
Posts: 3527
Joined: Sat Nov 04, 2006 6:14 am

Post » Fri May 27, 2011 11:18 am

Anyone have a prediction for how Oblivion would perform on this laptop?

http://www.newegg.com/Product/Product.aspx?Item=N82E16834220305
lilmissparty
 
Posts: 3469
Joined: Sun Jul 23, 2006 7:51 pm

Post » Fri May 27, 2011 5:44 am

Anyone have a prediction for how Oblivion would perform on this laptop?

http://www.newegg.com/Product/Product.aspx?Item=N82E16834220305


It would do pretty well on high settings at 1280x1024. That HD 2600 has about the power of my 8600GT, which runs at those settings.

However, you can forget about shadows.

EDIT: That processor is a bit on the weak side, but should be fine.
Matthew Barrows
 
Posts: 3388
Joined: Thu Jun 28, 2007 11:24 pm

Post » Fri May 27, 2011 8:42 am

Sounds good, thank you. :)
Oceavision
 
Posts: 3414
Joined: Thu May 03, 2007 10:52 am

Post » Fri May 27, 2011 7:03 am

I use a Radeon X300 HyperMemory; it runs on low graphics (not very low) with textures on med, and it gets about 40 fps all the time.
REVLUTIN
 
Posts: 3498
Joined: Tue Dec 26, 2006 8:44 pm

Post » Thu May 26, 2011 11:54 pm

Upon realizing that the GeForce 9800GTX+ is not actually available yet, I've removed it from the list; I normally only list cards that are actually possible to buy the very day you read that list. I apologize for my mistake.
Look at ATi's new cards. They are beasts at their price points. Wow, they even have comparable performance to the much more expensive Nvidia 200 series.

Yeah, nVidia's in pretty deep crap right now, given that they blew hundreds of millions on developing the GT200 (GeForce GTX 200), which in the face of the RV770 (Radeon 4800) turns out to be an utter dud.

Nottheking, for when you get back, I'd say you're very safe to put the 4870 ahead of the GTX 260 and 3870X2, but behind the GTX 280. The 4850 could fit either right before or right after the 260/X2. It's definitely in the Egad! category.

Oh, it definitely is a marvelous card, though once you pass the 1680x1050 range, it seems to start to trail even the GTX 260. I *think*, in that case, it might be a limitation of having only 512MB of video RAM; Race Driver GRID demonstrates this theory well when running at 2560x1600, where any card with only 512MB of memory suddenly bombs in performance, generally dropping to perhaps around 25-35% of the framerate the same card gets at 1920x1200.

Of course, this will be tested once the 1024MB version of the 4870 arrives this fall. There's also the fact that when nVidia and ATi/AMD switched architectures in going from GeForce 7 and Radeon X1k to GeForce 8 and Radeon HD 2k and onward, they more or less traded the advantage in Oblivion. Previously, the game favored ATi's cards because each pixel shader unit had two fully-featured, well-rounded ALUs, while nVidia's had a really, really stripped-down second ALU, and the GeForce 6 cards had only one ALU (Radeon X series cards had two as well). For the Radeon HD 2k, 3k, and 4k cards, ATi heavily stripped down each ALU, so they individually are far less flexible, and each group of five has to rely on the same thread for instructions. Likewise, they are all scalar, lacking a vector unit. As a result, performance in Oblivion is no longer as strong, as it appears Oblivion really prefers more flexible ALUs over raw maximum mathematical capacity.

I use a Radeon X300 HyperMemory; it runs on low graphics (not very low) with textures on med, and it gets about 40 fps all the time.

However, I believe you're running at a resolution of around 640x480, no? That's the very minimum for the game, and hence "very low" graphics settings. While the game is definitely playable on the "very low" cards, I describe the category specifically to warn people against getting their hopes up too high.
Ells
 
Posts: 3430
Joined: Thu Aug 10, 2006 9:03 pm

Post » Fri May 27, 2011 12:26 pm

I'd like to point out that on your list, the Intel GMA X3000 should be moved up to Very Low End.

The latest drivers enable the X3000 to process vertex shaders 2.0. This means it can run the game without help, but Oldblivion and other tools will help make it play better.
brian adkins
 
Posts: 3452
Joined: Mon Oct 01, 2007 8:51 am

Post » Fri May 27, 2011 1:06 am

Today's the release date for the Radeon HD 4870X2. Already, I've seen some benchmarks, and it's... impressive, to say the least. While I've yet to spot an Oblivion benchmark for it, from the huge gains it shows in all the games tested, particularly Crysis, it demonstrates a huge lead over the GeForce GTX 280 and even the GeForce 9800GX2. Though because it's preliminary, I can't say HOW big a gain it is... But if this were a list for other games, with its release I'd likely have to introduce ANOTHER performance category.

I'd like to point out that on your list, the Intel GMA X3000 should be moved up to Very Low End.

The latest drivers enable the X3000 to process vertex shaders 2.0. This means it can run the game without help, but Oldblivion and other tools will help make it play better.

Okay then, done. I was personally wondering when the drivers would come out to allow its stream processors to meet modern VS standards.
Emma Louise Adams
 
Posts: 3527
Joined: Wed Jun 28, 2006 4:15 pm

Post » Fri May 27, 2011 6:03 am

But if this were a list for other games, with its release I'd likely have to introduce ANOTHER performance category.


I have a suggestion.

How about....*Complete mental breakdown*? Or "DAYUM!".

And also, I think that because of the release of the 4870X2, the 4870 1GB won't be far behind, and we'll have to keep an eye on that.

Oh, and there was actually one link with a benchmark of the 4870X2 and Oblivion in Community...just a few things to say...1920x1200, HDR, 24x AA, 16x AF, 70FPS.

http://www.firingsquad.com/hardware/amd_ati_radeon_4870_x2_performance_review/page10.asp
Enny Labinjo
 
Posts: 3480
Joined: Tue Aug 01, 2006 3:04 pm

Post » Thu May 26, 2011 10:52 pm

70fps on 24X AA with that resolution!!!! It's totally insane!
Nomee
 
Posts: 3382
Joined: Thu May 24, 2007 5:18 pm

Post » Thu May 26, 2011 10:39 pm

Either the HD 4850 needs to be dropped a notch to "extreme" or the 9800 GTX+ needs to be bumped up to "egad". I've been able to find four reviews of the 9800 GTX+, and only one of them showed the HD 4850 beating it in a majority of the games benchmarked. The other three showed the 9800 GTX+ winning.
Rachel Briere
 
Posts: 3438
Joined: Thu Dec 28, 2006 9:09 am

Post » Fri May 27, 2011 4:10 am

In exactly the manner referred to a few minutes ago, when mention of this list came up in the Community Discussion forum, NTK is dealing with Oblivion specifically. In case you've kept a blinder on for two-plus years, Oblivion favors the Radeons, and the more you spend, the bigger the gap between what a Radeon can do for a given cost and what a GeForce is capable of.
Jani Eayon
 
Posts: 3435
Joined: Sun Mar 25, 2007 12:19 pm

Post » Thu May 26, 2011 10:22 pm

In exactly the manner referred to a few minutes ago, when mention of this list came up in the Community Discussion forum, NTK is dealing with Oblivion specifically. In case you've kept a blinder on for two-plus years, Oblivion favors the Radeons, and the more you spend, the bigger the gap between what a Radeon can do for a given cost and what a GeForce is capable of.


However, that was at release. The Radeon cards of two years ago DID perform better than their Nvidia counterparts. However, I'm not sure the same can be said now, over two years later.

Holy [censored], it's been two years?!
Shelby Huffman
 
Posts: 3454
Joined: Wed Aug 08, 2007 11:06 am

Post » Fri May 27, 2011 6:39 am

Either the HD 4850 needs to be dropped a notch to "extreme" or the 9800 GTX+ needs to be bumped up to "egad". I've been able to find four reviews of the 9800 GTX+, and only one of them showed the HD 4850 beating it in a majority of the games benchmarked. The other three showed the 9800 GTX+ winning.


Perhaps. But in Oblivion, the 4850 wins in every benchmark I've seen, and that's what this list was specifically tailored for.

I think the 4850 does need to be split between 512MB and 1GB versions, though, with the 512MB version compared to the 9800GTX+ and the 1GB version kept separate and above. That extra VRAM really lets you pile on the visual mods and/or the resolution and AA without crippling performance (which seems to be the definition of the "egad" category).
Ria dell
 
Posts: 3430
Joined: Sun Jun 25, 2006 4:03 pm

Post » Thu May 26, 2011 10:15 pm

Thanks for the list!
Just got my PC upgraded:
CPU: Intel 8500
GPU: Asus 9600 GT
RAM: 4 GB


Running Oblivion on max settings, no anti-aliasing, with Qarl's Texture Pack III installed and some other mods as well; getting 62 fps indoors and 55 to 60 outdoors.
Hopefully this might help.
NeverStopThe
 
Posts: 3405
Joined: Tue Mar 27, 2007 11:25 pm

Post » Fri May 27, 2011 12:08 pm

Yes, I actually am still alive... First post from me in a month or so, even if it seems to be like it's been longer... :P

Well, today (or yesterday?) saw the release of the Radeon HD 4670, which many have compared to the Radeon HD 3850 and 3870, in part due to having similar specs... In spite of having only a 128-bit memory interface, a lot of re-working, including shrinking the stream processors (they found a way to build them with fewer transistors), doubling the number of texturing units from 16 to 32, and a massive overhaul of AA techniques, has resulted in a card that http://www.newegg.com/Product/Product.aspx?Item=N82E16814102792, happens to http://www.guru3d.com/article/ati-radeon-hd-4670-review/2, meaning it doesn't need any external power connector at all, runs cool with only a single-slot cooler, and posts surprising performance in pretty much every game.

When it comes to Oblivion at the medium-high resolution of 1280x1024, with AA and AF enabled at ultra settings, http://www.anandtech.com/video/showdoc.aspx?i=3405&p=9, and likely close to the level of the notably more expensive 9600GT. It's possible it might surpass it in some areas, but without extensive comparisons I cannot make that judgement, so I'm placing it below. However, it's clear that it bests the 3870 across the board where AA is concerned, and likewise the 9600GSO; though the GeForce does hold a slight edge with AA disabled, once you enable it, the 9600 GSO takes a whopping 31.1% hit, dropping it a noticeable 10.7% below, while the 4670 practically shrugs off the burden of AA and AF, taking a mere 17.4% penalty to the framerate, barely over half the hit the 9600 GSO takes.

Since these cards sit at a fairly high end of the list, and AA+HDR is possible with all of them, the list at that end operates on the basis of the player using anti-aliasing. As such, card positions reflect this; the 9600 GSO is better than the 3850 when AA is enabled, while the reverse is true if it's disabled. Similarly, the Radeon 4670 bests both of those when AA is enabled, but loses to both when it is not being used. Yes, the 3870 goes from first to last among them, due to taking a brutal 43.0% hit right between the eyes when you turn on AA in Oblivion; that has GOT to hurt. Nonetheless, the game's still quite playable at said settings (31.8 fps is well above the 20 fps "smooth playing" level for an RPG like this), so it's all quite fair.
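For anyone who wants to sanity-check those hit percentages against a review's raw framerates, the math is just the relative drop; here's a quick sketch (the framerates in the example are placeholders I made up, not the actual review numbers):

```python
def aa_hit(fps_no_aa: float, fps_aa: float) -> float:
    """Percent of framerate lost when anti-aliasing is enabled."""
    return (fps_no_aa - fps_aa) / fps_no_aa * 100

# e.g. a card dropping from 100 fps to 68.9 fps takes a 31.1% hit,
# while one dropping from 80 fps to 66.1 fps takes only a 17.4% hit.
print(round(aa_hit(100, 68.9), 1))  # 31.1
print(round(aa_hit(80, 66.1), 1))   # 17.4
```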
Either the HD 4850 needs to be dropped a notch to "extreme" or the 9800 GTX+ needs to be bumped up to "egad". Been able to find reviews on the 9800 GTX+ and only one of them showed the HD 4850 beating it in a majority of games benchmarked. The other three showed the 9800 GTX+ winning.

The benchmarks you're looking at almost certainly aren't using AA. For the "very high-end" and above categories, I'm assuming it's used at a 4x level. And when you enable AA to 4x or above, the 4850 blows the 9800GTX+ away across the board. Even factory OC'd versions don't stand a chance.

Of course, if I assumed that no AA was used for the higher-end cards, then the listing would change, mind you... While Oblivion does somewhat favor the fewer, yet more-flexible stream processors of the GeForce 8 and 9 cards over those of the Radeon 2000 and 3000 cards, something that hurts them even worse is anti-aliasing performance. If AA is actually disabled, the GeForce 9600 GSO readily loses to the Radeon 3850, and the GeForce 9600GT is actually weaker than the 3870. With AA enabled to x4, the GeForces readily outpace their respective Radeon competitors.

However, that was at release. The Radeon cards of two years ago DID perform better than their Nvidia counterparts. However, I'm not sure the same can be said now, over two years later.

Technically, starting with the GeForce 8 cards, Oblivion shifted favoritism toward nVidia. This is because, at the same time, ATi and nVidia effectively swapped design philosophies when it came to shader ALU design. In the Radeon X/X1k cards, ATi went for big, beefy, flexible ALUs... while nVidia didn't place as much importance there with their GeForce 6 and 7 cards, using much smaller ALUs, though in the GeForce 7 cards they did use two of them per shader unit... It still didn't make up for the lack of flexibility; while their theoretical floating-point throughput was equal or higher, their real-world performance with Oblivion's shaders, which were designed for the type of ALU ATi used from the 9700pro to the X1950XTX, just wasn't as good.

This changed, obviously, when ATi opted for a "superscalar" approach to their stream processors for the Radeon HD 2000 and 3000 cards. Each individual SP is incredibly stripped-down, to the point where each cluster of five relies on being grouped together for them to be bound to individual instructions. Contrary to some belief, each one is an independent stream processor, and CAN operate with independent data from the other four; it's just that, programming-wise, it's harder to keep all five occupied at once. Oblivion's code wasn't designed for that, and as such, a lot of them sit empty. Meanwhile, all of the stream processors in nVidia's GeForce 8 and 9 cards were significantly beefed up, being far more capable than the ALUs found in previous GeForce cards, and in fact much more resembled the capability and flexibility of the older Radeon shaders. As such, Oblivion effectively feels more "at home" with them.

Another thing that factors in here is that at higher-level settings, anti-aliasing is more often used. Again, in previous generations, since the original Radeon 9700/9800 generation, ATi has historically been much better at handling anti-aliasing than nVidia, with much better performance with it enabled. This went up until the time of the Radeon X18xx/X19xx cards against the Geforce 78xx/79xx cards. With G80, nVidia completely cleaned up their act, providing far better anti-aliasing performance than anything previously seen. Meanwhile, ATi, which still stuck to their old methods, became outclassed in the Radeon HD 2000 and 3000 series.

Since the introduction of the Radeon HD 4000 series, while the shaders are still lean and hard to use for Oblivion, ATi has given us their first big upgrade in AA performance in around 6 years, wildly besting even nVidia's improved AA, and in many cases making the performance hit for enabling it startlingly close to zero.

I think the 4850 does need to be split between 512MB and 1GB versions, though, with the 512MB version compared to the 9800GTX+ and the 1GB version kept separate and above. That extra VRAM really lets you pile on the visual mods and/or the resolution and AA without crippling performance (which seems to be the definition of the "egad" category).

I'm not quite so positive that it'll really make all that much of a difference. The list was mostly made and envisioned for no resolutions above 1680x1050 or 1600x1200, as I'll admit. And from what I've seen in any benchmarks, even with a massive texture-map buffer, it's not until you roughly hit the 1920x1200 4xAA/16xAF range that you really start to notice a difference between 512MB and 1024MB cards in that range. Admittedly, save for the 4850, there are no cards in that performance range available right now with more than one amount of memory, and no benchmarks appear to have surfaced yet that strictly and effectively set out to see what, if any, benefit would come from the 1GB version.

Also, I've heard some rumors that some people have managed to get the VRAM usage in Oblivion to go well past a full gigabyte... I wonder if there is anyone that can demonstrate this? I'd care to see a screenshot of the debug text... One of the pages shows current rendering information from the video driver itself, including the number of triangles on the scene, as well as the current VRAM load, including a breakdown between what is used for texture cache and what is used for raster buffers. While a lot of increase to the texture cache usage can be made through texture packs, I'd note that a rather sizeable chunk of Oblivion's VRAM usage is due to raster buffers, since due to all the shader effects, there is a LOT of overdraw.
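To give a rough sense of why raster buffers eat so much VRAM, here's a back-of-the-envelope sketch (the buffer format and counts are my assumptions for illustration, not measurements from the game):

```python
def buffer_mib(width: int, height: int, bytes_per_pixel: int) -> float:
    """Size of one full-screen raster buffer in MiB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

# A single 64-bit (FP16 RGBA) HDR render target at 1680x1050:
hdr_target = buffer_mib(1680, 1050, 8)
print(round(hdr_target, 1))  # 13.5

# Add a depth/stencil buffer plus intermediate targets for bloom,
# refraction, and other shader effects, and a handful of such buffers
# claims a sizeable chunk of VRAM before a single texture is loaded.
```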

Thanks for the list!

Glad to know you're enjoying your results. You've got a nice rig, as I can see, and I truly hope you enjoy your playing. :)
Joanne
 
Posts: 3357
Joined: Fri Oct 27, 2006 1:25 pm

Post » Thu May 26, 2011 9:02 pm

I hate to be a nag, but can you incorporate the change I mentioned a bit earlier in the thread?

I did make a mistake, though: I forgot to mention that the Intel GMA X3100 also has the same capabilities, but is slightly stronger, with later pixel shaders (3.0 versus 2.0) and more video memory.

Just thought I might remind you; they go in the Very Low End category.
James Potter
 
Posts: 3418
Joined: Sat Jul 07, 2007 11:40 am

Post » Thu May 26, 2011 8:42 pm

I hate to be a nag, but can you incorporate the change I mentioned a bit earlier in the thread?

I did make a mistake, though: I forgot to mention that the Intel GMA X3100 also has the same capabilities, but is slightly stronger, with later pixel shaders (3.0 versus 2.0) and more video memory.

Just thought I might remind you; they go in the Very Low End category.

I could've sworn that I'd put the GMA 3000s into the right category before... Perhaps they just were accidentally reverted in one of my edits... I've noted I've made some mistakes before while editing...

At any rate, I have them tentatively fixed. Will probably move them up some places, once I find some suitable benchmarks on them... To be honest, I never really paid much attention to the GMA's performance after the 900/950, which I found were roughly comparable to GeForce FX 5200s...
Claire Lynham
 
Posts: 3432
Joined: Mon Feb 12, 2007 9:42 am

Post » Fri May 27, 2011 3:24 am

I could've sworn that I'd put the GMA 3000s into the right category before... Perhaps they just were accidentally reverted in one of my edits... I've noted I've made some mistakes before while editing...

At any rate, I have them tentatively fixed. Will probably move them up some places, once I find some suitable benchmarks on them... To be honest, I never really paid much attention to the GMA's performance after the 900/950, which I found were roughly comparable to GeForce FX 5200s...


Yeah, I just checked: the Intel GMA 3000 cannot run the game without Oldblivion; the drivers only add the feature to the GMA X3000 and later chipsets.

So the GMA 3000 should go back in the Outdated category. The X makes a big difference.
No Name
 
Posts: 3456
Joined: Mon Dec 03, 2007 2:30 am

Post » Fri May 27, 2011 10:57 am

I'm not quite so positive that it'll really make all that much of a difference. The list was mostly made and envisioned for no resolutions above 1680x1050 or 1600x1200, as I'll admit. And from what I've seen in any benchmarks, even with a massive texture-map buffer, it's not until you roughly hit the 1920x1200 4xAA/16xAF range that you really start to notice a difference between 512MB and 1024MB cards in that range. Admittedly, save for the 4850, there are no cards in that performance range available right now with more than one amount of memory, and no benchmarks appear to have surfaced yet that strictly and effectively set out to see what, if any, benefit would come from the 1GB version.


Well, I run at 1680x1050. My VRAM usage is at or around 1GB (I'd have to re-install Rivatuner to be certain) - in game TDT shows anywhere from 650-750MB depending on where I am (http://luchaire.gamersoasis.net/Screenshots2/Perf4.jpg), and I'd hate to see what performance I'd get with just 512MB VRAM. ;)
carly mcdonough
 
Posts: 3402
Joined: Fri Jul 28, 2006 3:23 am

Post » Fri May 27, 2011 6:44 am

Well, today (or yesterday?) saw the release of the Radeon HD 4670, which many have compared to the Radeon HD 3850 and 3870 in part due to having similar specs... In spite of having only a 128-bit memory interface,

When it comes to Oblivion at the medium-high resolution of 1280x1024, with AA and AF enabled at ultra settings, http://www.anandtech.com/video/showdoc.aspx?i=3405&p=9, and likely close to the level of the notably more expensive 9600GT.


I knew I should have waited a little bit longer; I could have gotten the 4670 for a little more than what I paid for my 3650. Oh well, too late now. I am happy with what I have.

I have an ATI 3650 512MB DDR2 on an AMD X2 4000 (2.0 GHz), and outdoors I average around 30 fps at 1024x768 with AA 4x (12x edge detect), AF 16x, ultra settings, with texture packs (Bomret normal maps, my LOD package, Diverse Grasses - no QTP3 or Qarl LOD).

4670 - Those benchmarks are just about right for that card.

Considering it's a high-end mainstream card geared toward those who want to spend less than $100 on a video card and still be able to play most games at medium (Crysis) to high (Oblivion) settings at 1024x768, 1280x1024, or 1440x900 (depending on the CPU also).

I should have waited to buy this card instead of my 3650, once again.
Austin England
 
Posts: 3528
Joined: Thu Oct 11, 2007 7:16 pm

Post » Fri May 27, 2011 12:47 am

The benchmarks you're looking at almost certainly aren't using AA. For the "very high-end" and above categories, I'm assuming it's used at a 4x level. And when you enable AA to 4x or above, the 4850 blows the 9800GTX+ away across the board. Even factory OC'd versions don't stand a chance.


Please, do share this because I can't seem to find much to verify that conclusion.

Perhaps. But in Oblivion, the 4850 wins in every benchmark I've seen, and that's what this list was specifically tailored for.

I think the 4850 does need to be split between 512MB and 1GB versions, though, with the 512MB version compared to the 9800GTX+ and the 1GB version kept separate and above. That extra VRAM really lets you pile on the visual mods and/or the resolution and AA without crippling performance (which seems to be the definition of the "egad" category).


Early test - but Anandtech is one of the few that still use Oblivion:

http://www.anandtech.com/video/showdoc.aspx?i=3340&p=5

See that little bar two spots down from the 9800 GTX+? ;) With 4X AA and 16X AF no less.
Add Meeh
 
Posts: 3326
Joined: Sat Jan 06, 2007 8:09 am

Post » Thu May 26, 2011 9:59 pm

Two 4870 X2's in CrossFire. This is even better than three 280's. Get the http://www.newegg.com/Product/Product.aspx?Item=N82E16814261025 version. One of these will be all you'd ever dream of supercharging Oblivion with, and 2 in CrossFire would be like playing Morrowind.

Radeon cards also give better AA performance with DX10.1 than the nVidia cards.

Question answered?
adam holden
 
Posts: 3339
Joined: Tue Jun 19, 2007 9:34 pm

Post » Fri May 27, 2011 11:34 am

Two 4870 X2's in CrossFire. This is even better than three 280's. Get the http://www.newegg.com/Product/Product.aspx?Item=N82E16814261025 version. One of these will be all you'd ever dream of supercharging Oblivion with, and 2 in CrossFire would be like playing Morrowind.

Radeon cards also give better AA performance with DX10.1 than the nVidia cards.

Question answered?

A dual 4870X2 doesn't make sense. They don't scale well (yet), if at all. It's nice to get some high benchmarks, but that's probably all you can do with them.
STEVI INQUE
 
Posts: 3441
Joined: Thu Nov 02, 2006 8:19 pm

Post » Fri May 27, 2011 7:10 am

I noticed the 9800M GTS isn't on there. I own one. The way it's performed seems very good: almost no problems with all settings maxed out (though note, no AA override). There is very little stutter even at ridiculous resolutions. I believe it should be listed under Very High. I might be able to get a decent benchmark, but I still have not connected that computer to the internet.
roxanna matoorah
 
Posts: 3368
Joined: Fri Oct 13, 2006 6:01 am
