- Intel Core 2 Quad Q6600 G0 @ 2.4GHz
- Asus Rampage Extreme X48
- G.Skill Ripjaws-X Series 4GB (2x2GB) DDR3 1600MHz CL6 RAM @ 1066MHz CL6
- Nvidia GeForce GTX 285 1GB GDDR3 x2 in SLI @ 16x/16x (using SLIPatch 1.5b)
- Gigabyte Odin Pro 800W PSU
* OS: Windows 7 Ultimate 32-bit
* Nvidia driver: 280.26 WHQL
** The 285.xx drivers don't work! I can't enable SLI in the Nvidia Control Panel and can't even run RAGE on a single card!

RAGE patched and updated, plus my rageconfig.cfg settings:
// raise the virtual-texture page image sizes from the stock 4096 to 8192
vt_pageimagesizeuniquediffuseonly2 8192
vt_pageimagesizeuniquediffuseonly 8192
vt_pageimagesizeunique 8192
vt_pageimagesizevmtr 8192
vt_restart            // restart the virtual texture system so the new sizes take effect
vt_maxaniso 4         // anisotropic filtering level for virtual textures
image_anisotropy 4    // anisotropic filtering for regular (non-VT) images
vt_useCudaTranscode 1 // let the Nvidia GPUs transcode texture pages via CUDA

In-game settings:
- Resolution: 1024x768
- 4xAA
- vSync: off
- Texture Cache: Large
- AF: High
This pic shows RAGE stressing both GPUs:
[img]http://img6.imageshack.us/img6/451/ragesligpu.jpg[/img]
Using 8K textures loads more than 800MB on each card (more than 1.6GB in total)!
The game runs at a smooth 60fps, dropping to 44fps with 16xAA; overclocking my Q6600 to 3.2GHz and the RAM to 1600MHz brings that back up to 60fps, with drops to 55fps... damn CPU bottleneck!

I've noticed that sometimes the screen flickers, like SLI is going out of sync or something, and I'm forced to exit and relaunch the game.
Forcing 16K textures through rageconfig.cfg turns the screen blue and only loads about 300MB on each card (600MB total)?! So how much VRAM do I need to run 16K?
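
Here's a rough back-of-envelope I did for the page-image memory. Big assumption on my part: each of the four vt_pageimagesize* cvars backs one square page image, and the texel cost sits somewhere between DXT1 compression (0.5 bytes) and raw RGBA8 (4 bytes). I don't know RAGE's actual internal formats, so treat this as a sketch, not gospel:
[code]
# Back-of-envelope VRAM estimate for RAGE's virtual-texture page images.
# Assumptions (mine, not id's): the four vt_pageimagesize* cvars each back
# one square page image, stored at somewhere between DXT1 (0.5 B/texel)
# and uncompressed RGBA8 (4 B/texel).

NUM_PAGE_IMAGES = 4  # the four vt_pageimagesize* cvars set above

def page_image_mb(size, bytes_per_texel):
    """Memory of one size x size page image, in MB."""
    return size * size * bytes_per_texel / (1024 * 1024)

for size in (8192, 16384):
    for fmt, bpt in (("DXT1", 0.5), ("DXT5", 1.0), ("RGBA8", 4.0)):
        total = NUM_PAGE_IMAGES * page_image_mb(size, bpt)
        print(f"{size} / {fmt}: ~{total:,.0f} MB total")
[/code]
By that math, 16K lands anywhere from ~512MB up to ~4GB, and anything toward the uncompressed end blows straight past the GTX 285's 1GB, which would fit the blue screen. My guess is the ~300MB reading just means the big allocation failed. The 800MB+ I see at 8K also sits near the upper end of the 8192 numbers once you add framebuffers and regular textures.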

I want more "eye candy". Can I tweak it any further?

BTW, I've just finished "Dead City". Damn, this is an awesome game!
