I found out with MSI Afterburner that the GPU clock isn't maxed out even in Boston. All my 3DMark versions, Unigine benchmarks, etc. max out the GPU clock with ease. Is the game really this unoptimized?
Yes, R9 390X. I installed the hotfix 16.1 yesterday, it indeed solved the compass flickering issue as per the changelog but I haven't noticed any improvement regarding GPU usage.
I noticed with my last driver update, also using MSI Afterburner, that my 280X was only running at half its core and VRAM clocks. Uninstalling and then reinstalling MSI Afterburner on top of the new display driver fixed it.
This may not be your issue but I thought I'd mention it.
Thanks. I guess there's something wrong with my driver installation: Radeon Settings shows the new 16.1 release as expected, but Windows Device Manager still shows 15.something
I used the http://support.amd.com/en-us/kb-articles/Pages/AMD-Clean-Uninstall-Utility.aspx tool for a clean driver reinstall. It did uninstall the drivers [currently running at 1024x768 with the default Windows drivers], but Radeon Settings [ex CCC] is still there and I'm not sure how I'm supposed to completely remove it from the system. I'm going to investigate this before attempting a complete driver reinstallation.
edit: it appears Revo Uninstaller managed to remove every trace of AMD software from the system. I'm going to reinstall the 16.1 package and see how it goes
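For reference, a quick way to double-check which display driver version Windows actually has loaded (i.e. what Device Manager reports, independently of what Radeon Settings claims) is to query WMI. A minimal Python sketch, assuming a Windows machine where the standard wmic tool is available:

    import subprocess

    # Ask WMI for the display adapter name and the driver version Windows has
    # loaded -- the same value Device Manager shows, as opposed to the release
    # number Radeon Settings displays in its own UI.
    result = subprocess.run(
        ["wmic", "path", "win32_videocontroller", "get", "name,driverversion"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)

If the version printed there still says 15.x after installing 16.1, the new driver never actually replaced the old one.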
It's called a CPU bottleneck and I don't know of any setup that can brute force it yet.
Chances are I'm actually hitting a CPU bottleneck, as my CPU isn't exactly recent [i7 960], and yet that doesn't explain why the GPU runs at full speed in every other 3D app/game I own. The Witcher 3, Need for Speed Rivals, 3DMark 2015: according to Afterburner they are all capable of pushing the GPU to its limits.
edit: a clean driver install didn't change one iota
edit 2: same goes for Skyrim
Here you go. FO4 pic taken in Boston at Ultra settings, TW3 pic taken in the field at Ultra settings. If it's not a case of GPU underclocking I don't know what it is.. I'm aware that comparing two games this different from each other isn't fair, and yet FO4 appears to be the only game in my collection besides Skyrim that doesn't use the GPU's full potential (a rough way to compare the monitoring logs is sketched below, after the screenshots). If it's supposed to be a very CPU-intensive game that might cause CPU bottlenecks, then why is the CPU usage that low?
http://i.imgur.com/D6jHupf.jpg
http://i.imgur.com/BA6IL9d.jpg
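Screenshots only capture a single instant, of course; to make the comparison a bit more solid one could log the hardware monitor to a file while playing each game and average the readings. A minimal sketch, assuming hypothetical per-game CSV exports named fo4_boston.csv and tw3_field.csv with gpu_usage and gpu_clock columns (a real MSI Afterburner log would need its own parsing):

    import csv

    def summarize(path):
        # Average GPU usage (%) and peak core clock (MHz) from an assumed CSV
        # log with 'gpu_usage' and 'gpu_clock' columns.
        usage, clock = [], []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                usage.append(float(row["gpu_usage"]))
                clock.append(float(row["gpu_clock"]))
        return sum(usage) / len(usage), max(clock)

    for log in ("fo4_boston.csv", "tw3_field.csv"):  # hypothetical log files
        avg_usage, max_clock = summarize(log)
        print(f"{log}: avg GPU usage {avg_usage:.0f}%, max core clock {max_clock:.0f} MHz")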
Perhaps not the game itself, that wouldn't make much sense. But I'm told that different engines tend to interact differently with different video card brands. Every Unreal Engine developer I know goes with nVidia, to give one example. As for FO4 I'm not quite sure whether it's an engine issue or a driver issue, that is, not sure who is supposed to fix what.. but this isn't normal, that's for sure.
All benchmark results I found are consistent with my system. Not many people run both an i7 960 and a 390X though.. most have a 290X, which is basically the same card but with a slightly slower core clock.
Those AMD chips have power management in the BIOS. It will underclock the card to save power in certain situations.
What frame rates are you getting? Does this happen everywhere in Fallout 4? What frame rates are you getting in Skyrim?
Jesus F Christ, what the heck is wrong with the forum? I can't submit long posts anymore, I'm always redirected to a stupid captcha.
GPU: http://www.techpowerup.com/gpudb/b3400/asus-r9-390x-directcu-ii.html
dxdiag: http://pastebin.com/gyTJpeYJ
It happens everywhere, both in FO4 and in Skyrim: GPU usage rarely goes to 100% and the core clock never reaches 1050 MHz, and both fluctuate constantly. In FO4 with everything at Ultra I get a steady 60 FPS in the wasteland and around 30-40 in Boston. Skyrim at Ultra runs at a steady 60 FPS everywhere.
Those frame rates are not abnormal. The Boston area is notorious for frame drops.
Skyrim is normal. Fluctuation isn't necessarily an indicator of a problem. Curious what happens with Vsync off? What resolution are you playing at? Ultra?
I know they do. I know how to get more FPS from the game, I was just wondering if it's normal or not.. I believe either the engine or the drivers or a combination of both are causing this but I can live with it.
Those errors in the dxdiag are all due to my having disabled Windows Defender.
1080p, Ultra settings. Vsync off doesn't improve things one iota, and as expected the engine goes crazy when it goes beyond 60 FPS.
Again, I'm not trying to get more FPS from the game. The framerate itself is pretty normal, I'm aware of that. What is not normal is the GPU usage, which according to MSI Afterburner is very low compared to other CPU/GPU-intensive games, e.g. TW3. But, again, I can live with that.
If you aren't getting stuttering or any other performance problems then it may very well be "normal", even if it's not typical of your other games. Each game is different.
If that's your enquiry then I'd say it's normal, and wouldn't worry about it. If you can drop some settings and get higher framerates then that's fine.
It's if dropping settings didn't improve performance that you'd need to investigate.
Given the issues specifically related to that generation of AMD cards with this game, who's to know where the cause lies.
The problem is there are areas, especially Boston, that drop frames no matter what settings you use.
It's actually a CPU and/or engine limitation. There isn't enough strain on the GPU for its BIOS power management feature to automatically increase the clock speed.
Depending on settings (and/or system) you can get the same behaviour on nVidia GPUs. I haven't been using AMD GPUs recently, but I'm assuming they let you disable power management (or set it to maximum performance) in their per-game settings, just like with nVidia?
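For what it's worth, this clock-vs-load behaviour is easy to watch directly on the nVidia side. A minimal sketch using the pynvml bindings (NVIDIA-only, so purely an illustration of the power-management behaviour described above; it won't run against the 390X, where a tool like Afterburner or GPU-Z does the equivalent job):

    import time
    from pynvml import (
        nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
        nvmlDeviceGetClockInfo, nvmlDeviceGetUtilizationRates, NVML_CLOCK_GRAPHICS,
    )

    # Poll the first GPU once a second: under light load the reported core clock
    # sits well below its maximum and only ramps up when utilization rises.
    nvmlInit()
    try:
        handle = nvmlDeviceGetHandleByIndex(0)
        for _ in range(10):
            clock = nvmlDeviceGetClockInfo(handle, NVML_CLOCK_GRAPHICS)  # MHz
            util = nvmlDeviceGetUtilizationRates(handle).gpu             # %
            print(f"core clock {clock} MHz, GPU usage {util}%")
            time.sleep(1)
    finally:
        nvmlShutdown()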