Detecting wrong graphics card

Post » Wed Jul 07, 2010 8:22 pm

When I launch the launcher, under the options menu where it says graphics adapter, it reads as GeForce 7900 GS. But the thing is, my graphics card is a GTX 285. What I'm wondering is whether this will affect game performance at all, or is it just relevant to the auto-detect settings it applies?
Rob
 
Posts: 3448
Joined: Fri Jul 13, 2007 12:26 am

Post » Thu Jul 08, 2010 4:13 am

Please divert your attention to http://www.gamesas.com/index.php?/topic/1125539-graphiccard-i-do-not-own-found-wtf/
Nymph
 
Posts: 3487
Joined: Thu Sep 21, 2006 1:17 pm

Post » Thu Jul 08, 2010 5:06 am

I'm not certain that anyone knows what Obsidian did to break the detection routines that Fallout 3, and Oblivion before it, used. It could be the nVIDIA drivers instead; I haven't read a single report of a relatively current Radeon being detected as an X1950 instead of what it actually is.
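For context, the detection the launcher does amounts to asking DirectX 9 for the adapter's identifier and matching it against an internal card list. Here is a minimal sketch of that query using the ordinary D3D9 API; it is illustrative only, not Obsidian's actual code:

```cpp
// Minimal sketch: reading the adapter name a DX9 launcher would see.
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return 1;

    D3DADAPTER_IDENTIFIER9 id = {};
    if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
    {
        // Description is the human-readable name, e.g. "NVIDIA GeForce GTX 285".
        std::printf("Adapter:  %s\n", id.Description);
        // VendorId/DeviceId are the PCI IDs a launcher might match against a table.
        std::printf("VendorId: 0x%04lX  DeviceId: 0x%04lX\n", id.VendorId, id.DeviceId);
    }

    d3d->Release();
    return 0;
}
```

If a launcher matches that Description string (or the VendorId/DeviceId pair) against an outdated internal table, a card it has never heard of can fall back to the nearest entry it does know, which would explain a GTX 285 being reported as a 7900 GS.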

I also don't understand why Obsidian skipped the DX9 installation procedure that Bethesda used for Fallout 3, unless that part of the issue is created by Valve (Steam). Like the majority of multi-platform games available now, there is no DX10 and no DX11 in the game, because neither Microsoft's nor Sony's consoles offer graphics features beyond the DX9 level. Developers have been adding the DX9 runtime to systems that shipped with DX10 only for three or four years now, and to Windows 7 systems for almost a year.
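For anyone unsure what that installation procedure actually does: the DirectX End-User Runtime (DXSETUP.exe) installs the versioned D3DX DLLs a DX9 game links against, which ship separately from the base DirectX on Vista and Windows 7. A rough sketch of why skipping it matters, with the DLL name chosen purely as an example (the exact version varies by game):

```cpp
// Sketch only: checking for a versioned D3DX DLL that a DX9 game might link against.
// "d3dx9_38.dll" is an example file name, not a confirmed requirement of New Vegas.
#include <windows.h>
#include <cstdio>

int main()
{
    HMODULE d3dx = LoadLibraryA("d3dx9_38.dll");
    if (!d3dx)
    {
        // This is the situation a skipped redist step leaves you in:
        // the base DirectX is present, but the D3DX helper DLL is not.
        std::printf("D3DX runtime missing - run DXSETUP.exe from the DirectX redist.\n");
        return 1;
    }
    std::printf("D3DX runtime present.\n");
    FreeLibrary(d3dx);
    return 0;
}
```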
Trey Johnson
 
Posts: 3295
Joined: Thu Oct 11, 2007 7:00 pm

