Yes, I actually am still alive... This is my first post in a month or so, even if it seems like it's been longer...

Well, today (or yesterday?) saw the release of the Radeon HD 4670, which many have compared to the Radeon HD 3850 and 3870, in part due to its similar specs... In spite of it having only a 128-bit memory interface, a lot of re-working, including shrinking the stream processors (thanks to finding a way to make them with fewer transistors), doubling the number of texturing units from 16 to 32, and a massive overhaul of AA techniques, has resulted in a card that is already on sale (http://www.newegg.com/Product/Product.aspx?Item=N82E16814102792) and draws so little power (http://www.guru3d.com/article/ati-radeon-hd-4670-review/2) that it doesn't need any external power connector at all, runs cool with only a single-slot cooler, and posts surprising performance in pretty much every game.
When it comes to Oblivion at the medium-high resolution of 1280x1024, with AA and AF enabled at ultra settings, it turns in a strong showing (http://www.anandtech.com/video/showdoc.aspx?i=3405&p=9), likely close to the level of the notably more expensive 9600GT. It's possible it might surpass the 9600GT in some areas, but without extensive comparisons I cannot make that judgement, so I'm placing it below. However, it's clear that the 4670 bests the 3870 across the board where AA is concerned, and likewise the 9600 GSO; though the GeForce does hold a slight edge with AA disabled, once you enable it, the 9600 GSO takes a whopping 31.1% hit, dropping it a noticeable 10.7% below the 4670, which practically shrugs off the burden of AA and AF with a mere 17.4% framerate penalty, barely over half the hit the 9600 GSO takes.
Since this is a fairly high-end bracket of cards, and AA+HDR is possible on all the cards involved, the list at that end operates on the assumption that the player is using anti-aliasing, and card positions reflect this: the 9600 GSO is better than the 3850 when AA is enabled, while the reverse is true when it's disabled. Similarly, the Radeon 4670 bests both of those with AA enabled, but loses to both when it is not in use. Yes, the 3870 goes from first to last among them, because it takes a brutal 43.0% hit right between the eyes when you turn on AA in Oblivion; that has GOT to hurt. Nonetheless, the game is still quite playable at said settings (31.8 fps is well above the 20 fps "smooth playing" level for an RPG like this), so it's all quite fair.
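Since the rankings hinge on those percentages, here's the simple arithmetic behind them as a quick Python sketch. The figures plugged in are the ones quoted above; the 3870's no-AA framerate is my own back-of-envelope reconstruction from its quoted hit, not a number taken from the review.

```python
def aa_hit_percent(fps_no_aa, fps_aa):
    """Relative framerate loss, in percent, from enabling AA (and AF)."""
    return (fps_no_aa - fps_aa) / fps_no_aa * 100.0

def fps_before_hit(fps_aa, hit_percent):
    """Invert the formula: recover the no-AA framerate from the AA one."""
    return fps_aa / (1.0 - hit_percent / 100.0)

# The 3870 lands at 31.8 fps with AA after its 43.0% hit, so without AA it
# must have been running at roughly:
print(f"3870 without AA: ~{fps_before_hit(31.8, 43.0):.1f} fps")  # ~55.8 fps
```

That gap between a 17.4% hit and a 31.1% hit is also exactly how two cards that are near-even without AA can end up roughly 10% apart once it's switched on.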
Either the HD 4850 needs to be dropped a notch to "extreme" or the 9800 GTX+ needs to be bumped up to "egad". I've been able to find four reviews of the 9800 GTX+, and only one of them showed the HD 4850 beating it in a majority of the games benchmarked. The other three showed the 9800 GTX+ winning.
The benchmarks you're looking at almost certainly aren't using AA. For the "very high-end" and above categories, I'm assuming it's used at the 4x level, and when you enable AA to 4x or above, the 4850 blows the 9800GTX+ away across the board. Even factory OC'd versions don't stand a chance.
Of course, if I assumed that no AA was used for the higher-end cards, then the listing would change, mind you... While Oblivion does somewhat favor the fewer, yet more flexible stream processors of the GeForce 8 and 9 cards over those of the Radeon 2000 and 3000 cards, what hurts the Radeons even worse is anti-aliasing performance. With AA disabled, the GeForce 9600 GSO readily loses to the Radeon 3850, and the GeForce 9600GT is actually weaker than the 3870; with AA enabled at 4x, the GeForces readily outpace their respective Radeon competitors.
However, that was at release. The ATi cards of two years ago DID perform better than their nVidia counterparts at the time, but I'm not sure the same can be said now, over two years later.
Technically, starting with the GeForce 8 cards, Oblivion shifted its favoritism toward nVidia. This is because, at the same time, ATi and nVidia effectively swapped design philosophies when it came to shader ALU design. In the Radeon X/X1k cards, ATi went for big, beefy, flexible ALUs, while nVidia placed less importance on them in the GeForce 6 and 7 cards, using much smaller ALUs (though in the GeForce 7 cards they did use two of them per shader unit). Even that didn't make up for their lack of flexibility: while their theoretical floating-point throughput was equal or higher, their real-world performance in Oblivion, whose shaders were designed for the type of ALU ATi used from the 9700 Pro through the X1950 XTX, just wasn't as good.
This changed, obviously, when ATi opted for a "superscalar" approach to the stream processors in the Radeon HD 2000 and 3000 cards. Each individual SP is incredibly stripped-down, to the point where the SPs rely on being grouped into clusters of five to be bound to individual instructions. Contrary to some belief, each one is an independent stream processor and CAN operate on data independent of the other four; it's just that, programming-wise, it's harder to keep all five occupied at once. Oblivion's code wasn't designed for that, and as a result a lot of them sit idle. Meanwhile, all of the stream processors in nVidia's GeForce 8 and 9 cards were significantly beefed up, being far more capable than the ALUs found in previous GeForce cards, and in fact much more closely resembling the capability and flexibility of the older Radeon shaders. As such, Oblivion effectively feels more "at home" with them.
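To make the occupancy problem concrete, here's a toy Python sketch (entirely my own illustration, nothing like the actual shader compiler or hardware) that greedily packs instructions into 5-wide bundles, issuing only mutually independent ones together. A long dependent chain of math, the kind of code written for one big flexible ALU, leaves most of the five slots empty, while independent work fills them:

```python
def schedule(instrs, width=5):
    """Greedy toy scheduler: pack up to `width` mutually independent
    instructions per bundle. Each instruction is (dest, sources)."""
    all_dests = {d for d, _ in instrs}   # values produced by this program
    done, bundles = set(), []
    pending = list(instrs)
    while pending:
        bundle = []
        for ins in pending:
            dest, srcs = ins
            # Ready only if every internally-produced source finished in an
            # EARLIER bundle (no same-bundle dependencies allowed).
            if len(bundle) < width and all(s in done for s in srcs if s in all_dests):
                bundle.append(ins)
        for ins in bundle:
            pending.remove(ins)
            done.add(ins[0])
        bundles.append(bundle)
    return bundles

# A long dependent chain: each op needs the previous op's result.
chain = [("t0", ["input"])] + [(f"t{i}", [f"t{i-1}"]) for i in range(1, 10)]
# Ten mutually independent ops: easy to run five at a time.
parallel = [(f"u{i}", ["input"]) for i in range(10)]

for name, prog in (("dependent chain", chain), ("independent ops", parallel)):
    bundles = schedule(prog)
    util = sum(len(b) for b in bundles) / (5 * len(bundles))
    print(f"{name}: {len(bundles)} bundles, {util:.0%} of slots filled")
# dependent chain: 10 bundles, 20% of slots filled
# independent ops: 2 bundles, 100% of slots filled
```

At 20% slot utilization, a cluster of five stripped-down SPs behaves like a single slow one, which is roughly the situation Oblivion's shaders put the HD 2000/3000 cards in.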
Another thing that factors in here is that at higher-end settings, anti-aliasing is more often used. Ever since the original Radeon 9700/9800 generation, ATi had historically been much better than nVidia at handling anti-aliasing, with much better performance with it enabled, and that held true through the era of the Radeon X18xx/X19xx cards against the GeForce 78xx/79xx cards. With G80, nVidia completely cleaned up their act, providing far better anti-aliasing performance than anything previously seen, while ATi, still sticking to their old methods, was outclassed in the Radeon HD 2000 and 3000 series.
Since the introduction of the Radeon HD 4000 series, while the shaders are still lean and hard for Oblivion to use, ATi has given us their first big upgrade in AA performance in around six years, wildly besting even nVidia's improved AA, and in many cases making the performance hit for enabling it startlingly close to zero.
I think the 4850 does need to be split between 512MB and 1GB versions, though, with the 512MB version compared to the 9800GTX+ and the 1GB version kept separate and above. That extra VRAM really lets you pile on the visual mods and/or the resolution and AA without crippling performance (which seems to be the definition of the "egad" category).
I'm not quite so positive that it'll really make all that much of a difference. I'll admit the list was mostly made and envisioned for resolutions no higher than 1680x1050 or 1600x1200. And from what I've seen in benchmarks, even with a massive texture-map buffer, it's not until you roughly hit the 1920x1200 x4AA/x16AF range that you really start to notice a difference between 512MB and 1024MB cards in that performance range. Admittedly, save for the 4850, there are no cards in that range available right now with more than one amount of memory, and no benchmarks appear to have surfaced yet that strictly and effectively set out to see what benefit, if any, the 1GB version brings.
Also, I've heard some rumors that some people have managed to get the VRAM usage in Oblivion to go well past a full gigabyte... I wonder if anyone can demonstrate this? I'd care to see a screenshot of the debug text... One of its pages shows current rendering information from the video driver itself, including the number of triangles in the scene as well as the current VRAM load, with a breakdown between what is used for texture cache and what is used for raster buffers. While texture packs can drive the texture cache usage way up, I'd note that a rather sizeable chunk of Oblivion's VRAM usage is due to raster buffers, since with all the shader effects there is a LOT of overdraw.
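To put some rough numbers on that, here's an illustrative back-of-envelope in Python. The buffer formats and the set of targets are my assumptions for the sake of the estimate (one FP16 HDR color target, a 32-bit depth/stencil buffer, and a resolved back buffer), not figures pulled from the game, and extra post-processing targets would only push the total higher:

```python
def buffer_mb(width, height, bytes_per_pixel, samples=1):
    """Size of one render target in MB (multisampled targets store
    `samples` values per pixel before the resolve)."""
    return width * height * bytes_per_pixel * samples / 2**20

w, h, aa = 1920, 1200, 4
color = buffer_mb(w, h, 8, aa)  # 64-bit FP16 HDR color target, 4x multisampled
depth = buffer_mb(w, h, 4, aa)  # 32-bit depth/stencil, 4x multisampled
back  = buffer_mb(w, h, 4)      # resolved 32-bit back buffer
print(f"HDR color: {color:.0f} MB + depth: {depth:.0f} MB + "
      f"back buffer: {back:.0f} MB = {color + depth + back:.0f} MB")
# -> roughly 70 + 35 + 9 = ~114 MB before a single texture is loaded
```

Over 100 MB gone to raster buffers alone at 1920x1200 with 4xAA would go a long way toward explaining both the 512MB-vs-1GB threshold and how heavily-modded installs could climb toward a gigabyte.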
Glad to know you're enjoying your results. You've got a nice rig, from what I can see, and I truly hope you enjoy playing.