Do you want to win the BEST RPG 2015 and you don't even...

Post » Thu Nov 26, 2015 7:57 am

...Do you want to win the BEST RPG 2015 and you don't even... support SLI?

The latest beta patch brings no performance improvements on my GTX 980 Ti SLI setup, and SLI doesn't work at all.

Campbell
 
Posts: 3262
Joined: Tue Jun 05, 2007 8:54 am

Post » Thu Nov 26, 2015 9:19 am

:lol:

The merit of an RPG is judged by its ability to support SLI!

One sort of quick and dirty fix is forcing SLI by going into the Nvidia Control Panel and changing the SLI rendering mode option to "Force Alternate Frame Rendering 2." It's been known to cause crashes on some systems, though.

Dalia
 
Posts: 3488
Joined: Mon Oct 23, 2006 12:29 pm

Post » Thu Nov 26, 2015 3:35 am

I hate to tell you, but Fallout 4 is hardly an RPG compared to 1/2/NV.

Baby K(:
 
Posts: 3395
Joined: Thu Nov 09, 2006 9:07 pm

Post » Thu Nov 26, 2015 10:37 am

Let me guess... SLI support? :laugh:

Bird
 
Posts: 3492
Joined: Fri Nov 30, 2007 12:45 am

Post » Thu Nov 26, 2015 5:13 am


The merit of a game is based on everything.
First of all, it should at least run well on an enthusiast rig.

I have a GTX 980 Ti SLI setup and I'm not able to max it out.
The second card is pointless; it isn't used at all.

The framerate is very inconsistent. How can you judge a game positively when it doesn't run smoothly?
Scared humanity
 
Posts: 3470
Joined: Tue Oct 16, 2007 3:41 am

Post » Thu Nov 26, 2015 12:09 pm

I totally agree with you that the merit of a PC game is based on everything. But the merit of an RPG has nothing to do with hardware support.

Also.. did you have any luck with what I suggested?

QuinDINGDONGcey
 
Posts: 3369
Joined: Mon Jul 23, 2007 4:11 pm

Post » Thu Nov 26, 2015 5:59 am


With your suggestion I get the same GPU utilisation on both cards, but performance doesn't improve.
I can't get a stable 60 fps.
cassy
 
Posts: 3368
Joined: Mon Mar 05, 2007 12:57 am

Post » Thu Nov 26, 2015 1:44 pm

Godrays and shadows destroy performance. Take them both down a notch from ultra if you haven't already. I get some dips into the mid 40s at 1080p with my 970 but for the most part have solid 60 fps.
Suzie Dalziel
 
Posts: 3443
Joined: Thu Jun 15, 2006 8:19 pm

Post » Thu Nov 26, 2015 11:42 am

SLI support isn't fully managed by Bethesda. It's something that Nvidia and Bethesda have to work on together, and then Nvidia releases the driver profile update.

If your system doesn't run perfectly with a single 980 Ti, then you have other issues.
Emmie Cate
 
Posts: 3372
Joined: Sun Mar 11, 2007 12:01 am

Post » Thu Nov 26, 2015 3:15 pm

Eh, I wouldn't use that last phrase per se. There ARE massive issues even on very high-end systems. Some of the later CPU testing has shown i7s with 980 Tis averaging [relatively] low fps. Frankly, even with a 980 Ti in this game, if your CPU is an i5, even heavily OC'd, you're probably averaging 55 fps and hitting lows in the high 30s to mid 40s around Good Neighbor and the coast near there. Check GPU clock speed and utilization. A lot of the time in those areas you'll see your GPU clock down 200 to 300 MHz and drop utilization. It's just not getting anything to render while it waits for the CPU.
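One way to spot that pattern is to log clocks and utilization over time with `nvidia-smi` and flag the moments where the card downclocks while sitting underutilized. A minimal sketch; the thresholds and the sample readings below are illustrative, not measured:

```python
import csv
import io

# Sample output in the shape produced by the real command:
#   nvidia-smi --query-gpu=clocks.sm,utilization.gpu --format=csv,noheader,nounits -l 1
# (columns: SM clock in MHz, GPU utilization in %; values here are made up)
SAMPLE = """\
1190, 97
1215, 98
980, 61
885, 44
"""

def flag_downclocks(csv_text, clock_drop_mhz=200, util_floor=70):
    """Return row indices where the GPU both downclocked and went underutilized."""
    rows = [(int(clock), int(util)) for clock, util in csv.reader(io.StringIO(csv_text))]
    peak = max(clock for clock, _ in rows)
    return [i for i, (clock, util) in enumerate(rows)
            if peak - clock >= clock_drop_mhz and util < util_floor]

print(flag_downclocks(SAMPLE))  # → [2, 3]: samples where the card idled, likely waiting on the CPU
```

Feed it the CSV that `nvidia-smi` prints while you walk through a problem area; flagged rows are the moments the GPU had nothing to render.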

Worst part is, on the CPU no single thread is utilizing the equivalent of one whole core. I have no idea why, but it's like being CPU-bound while each thread's usage only hovers around 60 to 70%, occasionally spiking to 80%. It's almost like the main thread is waiting on another thread, which is also waiting on the main thread, for a significant portion of a single frame. Insane... I know. The state of the engine is, no offense Bethesda, absolute... um, bad news.
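That handoff pattern can be put into simple arithmetic: if two threads serialize on each other within a frame, neither can reach 100% even though the frame as a whole is CPU-bound. A toy model, with made-up millisecond figures:

```python
def thread_utilization(main_ms, render_ms, overlap=0.0):
    """Per-frame utilization of two threads that hand work back and forth.

    overlap is the fraction of the smaller workload that can run concurrently
    with the other thread (0.0 = fully serialized handoff, 1.0 = fully parallel).
    Returns (frame time in ms, main-thread utilization, render-thread utilization).
    """
    concurrent = overlap * min(main_ms, render_ms)
    frame_ms = main_ms + render_ms - concurrent
    return frame_ms, main_ms / frame_ms, render_ms / frame_ms

# Fully serialized: an 11 ms main thread plus a 6 ms render thread
# gives a 17 ms frame (under 60 fps), yet the busier thread shows only ~65%.
frame, main_util, render_util = thread_utilization(11.0, 6.0, overlap=0.0)
print(frame, round(main_util, 2), round(render_util, 2))  # → 17.0 0.65 0.35
```

Under this assumption, a CPU-bound frame with every thread hovering at 60-70% is exactly what mutual waiting would look like.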

Sadly for us SLI users, but understandably, most of this will need to be worked out before they can even THINK about adding SLI. Adding it now would most likely drop performance... hence why even compatibility hacks that "work" often show the same average FPS in problem areas, while spending more total GPU cycles to do it.

It's truly sad. If this kind of code went out in a release rollout in a corporate environment [not a complete analogy, but pertinent], people would be fired. Do it directly to paying customers and it's almost expected, even defended.

jason worrell
 
Posts: 3345
Joined: Sat May 19, 2007 12:26 am

Post » Thu Nov 26, 2015 4:23 pm

Honestly, SLI/Crossfire systems are still a tiny enough minority of gaming rigs that even if it were up to Bethesda to sort it all out, it wouldn't really be worth their time. The whole multi-GPU thing just never took off like AMD/ATI and Nvidia expected it to.

Celestine Stardust
 
Posts: 3390
Joined: Fri Dec 01, 2006 11:22 pm

Post » Thu Nov 26, 2015 4:55 am

Funny: people used to complain that games only used one core. Now that games use multiple cores, just not at 100%, it's a problem. The only reason the CPU would be at 100% is if it were the bottleneck. Most games stress the GPU, not the CPU. My CPU stays at 30-40% (overall), as it's not the bottleneck. My GPU is at 60%, also not the bottleneck. My 'bottleneck' is the v-sync limiter at 60 fps; if I turned it off (or used a higher resolution), the GPU load would probably increase.
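The arithmetic behind that reasoning is straightforward: v-sync at 60 fps gives each frame a ~16.7 ms budget, and a GPU that is only 60% busy inside that budget could render noticeably more frames if the cap came off. A quick sketch:

```python
def frame_budget_ms(fps_cap=60):
    """Frame-time budget imposed by a v-sync cap, in milliseconds."""
    return 1000.0 / fps_cap

def uncapped_fps_estimate(gpu_util, fps_cap=60):
    """If the GPU is busy only gpu_util of each capped frame, removing the
    cap would let it render roughly fps_cap / gpu_util frames per second."""
    return fps_cap / gpu_util

print(round(frame_budget_ms(), 2))         # → 16.67 ms per frame at 60 fps
print(round(uncapped_fps_estimate(0.60)))  # → 100
```

So a 60%-utilized GPU under a 60 fps cap has headroom for roughly 100 fps; whatever limits a system below 60 in that state, it isn't the GPU.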

I am not having an issue at Good Neighbor; it's still 59.5-60 fps (v-sync). If your GPU is dropping to 2D clocks, then it's a driver issue. RAM use is high (>6 GB total system use), so it's possible that a RAM bottleneck and page file use are causing the slowdown.

Reply with a COC location and I'll test it with my setup.

Harinder Ghag
 
Posts: 3405
Joined: Wed Jan 17, 2007 11:26 am

Post » Thu Nov 26, 2015 10:35 am

I can confirm on my system (i7 5970k, EVGA GTX 980 4GB non-SLI, 16GB DDR4 RAM): I will sometimes crash or get a "has stopped working" message if I even overclock a petty 2GHz.

Assumptah George
 
Posts: 3373
Joined: Wed Sep 13, 2006 9:43 am

Post » Thu Nov 26, 2015 9:24 am

My system: i7 5970k, EVGA GTX 980 4GB non-SLI, 16GB DDR4 RAM.

I've tried every version of v-sync through the Nvidia Control Panel and nothing seems to work. I even tried Medium and Low settings, and I still get dips from 60 fps to 40-50 in downtown Boston and other dense areas like towns. I've tried everything, even switched from my monitor to my Philips 43in 122 Hz plasma TV, and still nothing works. I'm really sensitive to frame rates to the point of irritation, but I know a lot of folks don't mind it.

Miranda Taylor
 
Posts: 3406
Joined: Sat Feb 24, 2007 3:39 pm

Post » Thu Nov 26, 2015 8:28 am

First let me say I am extremely happy for you that your experience has been so great. I'm serious, I'm jealous.

Second, I understand all of that.

But seriously...

Are you claiming that Vsync is what's keeping the GPU and CPU from even hitting 60? I'm not talking about 59 or 57 or some small frame drops.

I'm saying there are places where a large number of players [I have no idea of the exact number, but it's been reported everywhere] have issues: regardless of a fresh OS install, regardless of DDU'ing various driver versions, and regardless of v-sync on or off, vastly overpowered machines [relative to an efficient engine] are barely utilizing their hardware. That's not an issue with v-sync, that's an engine issue. If your game can't feed the GPU enough data fast enough to cap 60 with v-sync on a 970 or better, yes, there is a problem.

It's called the gamebryo abomination they're currently using....

Look, again, great for you. I am really happy and jealous. I have no doubt there are times with certain settings that people can get 60 fps near Good Neighbor. Hell, I get 60 all around Diamond City and Corvega, both problem areas. But turn 3 degrees to the left, where there is no noticeable difference in obvious geometry or mesh data [save a really [censored] occlusion engine? idk], and I drop to 37; THEN the GPU downclocks. It's not v-sync.

Please tell me how low CPU usage, low GPU usage, and v-sync with triple buffering suddenly get you 37 fps without insanely erratic frame times. If it were just a few frame drops, you'd get a small dip, and unlike v-sync double buffering or blitting, you'd barely notice. If you know anything about actual GPU render pipelines, you'll understand that when the CPU all of a sudden decides to dump an additional 10k draw calls for objects that aren't even visible, that's an engine issue. Get out your DX debugger or whatever suite you use and look at the actual info the threads are exchanging, and when. I think you'll be surprised even if you get perfect performance.
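For scale, here is a back-of-the-envelope model of what a sudden pile of draw calls does to frame rate. The per-call CPU cost (1 µs) is an assumed ballpark, not a measured figure:

```python
def fps_with_extra_calls(base_fps, extra_calls, us_per_call=1.0):
    """Toy model: each extra draw call adds ~us_per_call microseconds of CPU
    submission cost (assumed figure) on top of the baseline frame time."""
    base_frame_ms = 1000.0 / base_fps
    new_frame_ms = base_frame_ms + extra_calls * us_per_call / 1000.0
    return 1000.0 / new_frame_ms

# 10,000 surprise draw calls on top of a 60 fps frame:
print(round(fps_with_extra_calls(60, 10_000), 1))  # → 37.5
```

Under that assumption, 10,000 extra calls turn a 16.7 ms frame into a 26.7 ms one, a drop from 60 to roughly 37 fps, which is the same order of magnitude as the drop described above.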

Lastly, I've used multiple drivers and even forced constant 3D performance mode; all that happens is the GPU drops to a lower utilization percentage where it would normally clock down.

TLDR: low CPU plus low GPU =/= v-sync... it's the engine. Nvidia drivers are irrelevant right now. The Game Ready Driver has almost no optimizations. But the drivers are all doing what they're supposed to do: clocking the GPU boost [keyword: boost] down when the program isn't even STRESSING IT!

Robert Jackson
 
Posts: 3385
Joined: Tue Nov 20, 2007 12:39 am

Post » Thu Nov 26, 2015 4:25 am

I'm of the opinion that until they fix their crappy CPU utilization, SLI won't do you much good. Perhaps if you're running a couple of older cards it'd help in certain scenes, but most of this game's framerate issues are CPU-based.

Holli Dillon
 
Posts: 3397
Joined: Wed Jun 21, 2006 4:54 am

