4K: 120 fps drops to 40 fps. What the heck?

Post » Sun Nov 29, 2015 8:06 pm

It's actually because they're $1,000 cards that are quickly superseded, the same way my 980 Ti performs as well as a Titan X in games at a $400 discount. Titans are said to be a waste of money because they're FLOPS (compute) cards, not gaming cards. If you're mining digital currency or running a little amateur workstation, they're great, but they're not worth the cost for gaming, because they'll be replaced so quickly.

User avatar
Harry Hearing
 
Posts: 3366
Joined: Sun Jul 22, 2007 6:19 am

Post » Sun Nov 29, 2015 3:44 pm

Hello everybody! Just like many other players, I was wondering what the hell is going on indoors, and why the hell fps drops from 60 to 30 or even below. I tried editing ini files etc., and nothing worked. At first, just like everybody else, I thought it was one of the effects causing this, but then I decided to check whether there was something else, something outside the visible area, causing it. I thought that maybe it was a culling bug. I activated noclip, moved outside of the map, and placed myself exactly above the point where I was having fps issues.
Obviously at this point the game started to render almost the entire level, and the funny thing is, my fps dropped to the exact same numbers.
Conclusion: I think it actually is a culling bug. Maybe in certain places indoors the engine is rendering more than it should, almost the entire level, if not all of it.
User avatar
Motionsharp
 
Posts: 3437
Joined: Sun Aug 06, 2006 1:33 am

Post » Sun Nov 29, 2015 8:15 am

All of that has nothing to do with the optimization that sorely needs to be done. If I'm looking at a forest, chances are many of those trees do not need to be calculated as part of the godrays pass. The issue in FO4, I believe, is that it just renders them all anyway, whether or not there is a significant change to the volumetric lighting. It's just like the basics of culling triangles before calculating textures, lighting, shading, etc. to improve performance.
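
To illustrate the basic idea (a toy sketch of generic frustum culling, nothing from Bethesda's actual engine): you test each object's bounds against the view volume before spending any time shading it, something like this.

    #include <cstdio>
    #include <vector>

    // Rough sketch: approximate each object with a bounding sphere and test
    // it against the view frustum's planes before ever drawing it.
    struct Vec3   { float x, y, z; };
    struct Plane  { Vec3 n; float d; };            // dot(n, p) + d >= 0 means "inside"
    struct Object { Vec3 center; float radius; };

    float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    bool insideFrustum(const Object& o, const std::vector<Plane>& frustum) {
        for (const Plane& p : frustum)
            if (dot(p.n, o.center) + p.d < -o.radius)
                return false;                      // fully outside one plane: cull it
        return true;
    }

    int main() {
        // One plane facing +x: anything entirely behind x = 0 gets culled.
        std::vector<Plane> frustum  = { { {1, 0, 0}, 0.0f } };
        std::vector<Object> scene   = { { {5, 0, 0}, 1.0f }, { {-5, 0, 0}, 1.0f } };
        for (const Object& o : scene)
            if (insideFrustum(o, frustum))
                std::printf("drawing object at x=%.0f\n", o.center.x);
        return 0;
    }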

User avatar
CSar L
 
Posts: 3404
Joined: Fri Nov 09, 2007 9:36 pm

Post » Sun Nov 29, 2015 9:32 pm

You're not understanding the whole point behind deferred rendering. The idea is to significantly reduce the number of fragments in the framebuffer that need to be processed for lighting calculations; it does this by storing the relevant data for per-pixel lighting in separate textures. The volumetric lighting itself in this case wasn't written by Bethesda... it was implemented by them and written by NVIDIA. More to the point, it's only processed on the visible pixels in a scene... not the invisible ones hidden by trees/walls in the foreground.
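
To put that in rough code terms (a simplified CPU-side sketch of the general technique, not NVIDIA's or Bethesda's actual implementation): the lighting pass walks the per-pixel surface data the geometry pass left behind, and only does work where a visible surface landed.

    #include <algorithm>
    #include <array>
    #include <cstdio>

    // Simplified model of deferred shading on a tiny 4-pixel framebuffer:
    // the geometry pass has already written surface data per pixel (the
    // "G-buffer"); the lighting pass then only touches visible pixels.
    struct Vec3 { float x, y, z; };
    struct GBufferTexel {
        Vec3 albedo, normal;
        bool covered;            // false = nothing visible here (e.g. sky)
    };

    float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    int main() {
        std::array<GBufferTexel, 4> gbuffer = {{
            { {1, 0, 0}, {0, 0, 1}, true  },
            { {0, 1, 0}, {0, 0, 1}, true  },
            { {0, 0, 0}, {0, 0, 0}, false },   // empty pixel: skipped entirely
            { {0, 0, 1}, {0, 0, 1}, true  },
        }};
        Vec3 lightDir = {0, 0, 1};

        // Lighting cost scales with pixels on screen, not scene complexity.
        for (std::size_t i = 0; i < gbuffer.size(); ++i) {
            if (!gbuffer[i].covered) continue;
            float ndl = std::max(0.0f, dot(gbuffer[i].normal, lightDir));
            std::printf("pixel %zu lit at intensity %.2f\n", i, ndl);
        }
        return 0;
    }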

As gigertag just mentioned, it may very well be an occlusion/culling problem. It doesn't seem to be an issue for me, though I find that AMD cards are far superior in raw geometry-crunching power. NVIDIA excels at its single-pass stuff, though.

User avatar
Charlotte Lloyd-Jones
 
Posts: 3345
Joined: Fri Jun 30, 2006 4:53 pm

Post » Sun Nov 29, 2015 2:40 pm

Please read before you post.
I'm sorry, but I've tried to convey that resolution doesn't change the performance drop. 1080p changes nothing. The Witcher was inaccessible and far from a Bethesda game in regards to adjusting the ini. Bethesda was smart to update the Skyrim engine. You think Microsoft makes a brand-new engine every time they release a new operating system? Heck no. Say what you want about Titan Xs, but they are awesome. Belittling what I use to enjoy Fallout is one thing, but I'll ask politely that you don't assume what I do or don't know about PCs. I don't think bottlenecking is an issue, but rather the handling of the game, possibly poor DirectX optimization. The Witcher sucked in 4K because of the scaling of tessellation, or so I've heard. Honestly, I think they want to make PC gaming irrelevant and let Sony and Xbox decide what frame rate is good for me. But if I had access to more adjustments, I likely wouldn't be experiencing crazy drops. By the way, get some etiquette and don't insult people.
User avatar
Bird
 
Posts: 3492
Joined: Fri Nov 30, 2007 12:45 am

Post » Sun Nov 29, 2015 5:39 pm

Which is an optimization issue, like I've been saying. I don't know if you're choosing to be so obtuse here because you made a mistake, or what...

You stated, "I don't think God Rays really has the performance impact that people this is does."

I proved that it does have a very significant performance impact -- up to a 57% decrease in frame rate.

You also stated, "Shadow distance absolutely murders performance."

I proved that shadows have a relatively low performance impact compared to godrays -- only up to a 34% decrease in frame rate.

I stated that "It would be nice if Bethesda could optimize these areas a little better than what they are."

To this you responded, argumentatively, that I'm not "understanding the whole point of deferred rendering," and then finished by agreeing with me when you said that it may very well be an "occlusion/culling problem that is occurring."

Your statements are very confusing. It seems you're mistaken on both counts: you claim your system doesn't have the same issue, you launch into a lot of technobabble that doesn't further the conversation at all, and then you argue with me and agree with me in the same post. Is keeping up the appearance that you're not wrong more important to you here than actually furthering the discussion?

User avatar
El Goose
 
Posts: 3368
Joined: Sun Dec 02, 2007 12:02 am

Post » Sun Nov 29, 2015 5:43 pm

I find it hard to believe that dropping the resolution doesn't change anything. If I drop from 1440p to 720p, my performance increases significantly. Oh, by the way, I'd still like to know what bits you're using to get SLI to work.

User avatar
Devils Cheek
 
Posts: 3561
Joined: Sun Aug 13, 2006 10:24 pm

Post » Sun Nov 29, 2015 10:33 pm

I'm not looking for a PowerPoint on why my hardware is garbage or WHY there's a weird drop. I'm looking for solutions.
User avatar
Sierra Ritsuka
 
Posts: 3506
Joined: Mon Dec 11, 2006 7:56 am

Post » Sun Nov 29, 2015 9:35 pm

Lower your settings. That's the only solution. I would also suspect that SLI is gaining you no benefit, despite what you claim. Try disabling SLI.

User avatar
Cesar Gomez
 
Posts: 3344
Joined: Thu Aug 02, 2007 11:06 am

Post » Sun Nov 29, 2015 7:04 am


I would love to know why as well. I'm not in the industry, but setting the NVIDIA control panel to alternate between cards per frame is not technically using both cards on the same frame (it's in a league of its own); it's a different method, and it consistently opens up a very large performance boost, typically doubling my fps across many games, new and old.
User avatar
Rusty Billiot
 
Posts: 3431
Joined: Sat Sep 22, 2007 10:22 pm

Post » Sun Nov 29, 2015 7:38 pm

First... if you quote someone, please quote them letter-for-letter...

People seem to think that the God Rays effect is causing their games to perform at sub-30 frames per second, which isn't the case for the large majority of the user base within the required and recommended settings bracket for Fallout 4. Most of these people haven't disabled V-Sync, or have meddled with things they shouldn't have, or just haven't tweaked the game settings to fit their system. A lot of people are expecting miracles and claiming them to be perfectly acceptable expectations without any actual knowledge of how a lot of this stuff works internally.

Listing a vague percentage value in bullet form without providing video/picture evidence isn't proving me wrong or proving you right. I accepted that those were real data that you'd acquired yourself, and that they apply to your system.

In all testing cases with myself and several of my friends, when I helped them lower their shadow distance from 10,000+ down to 5,000, they all saw a visible increase in frames per second. Further tweaking shadows down to 2048x2048 texture maps and reducing the quality filter to High or Medium improved this again. The range is the kicker, though, because at 10,000+ the game is rendering a massive area of the game world onto a texture map... for each shadow-casting light source in the world... so it is essentially re-rendering the entire portion of the game world that is visible in the projection of the light source. This has a huge impact on performance, and it's why a lot of games hard-limit dynamic shadow-casting lights to a handful, usually countable on one hand. Remember that each time it renders the world, the Bethesda Creation Engine is re-rendering even tiny items and characters as well, not just world-space geometry.
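
To give a feel for why the range is the kicker (back-of-the-envelope only; the object density and light count below are made up, not measured from the game): the area each shadow map has to cover grows with the square of the shadow distance.

    #include <cstdio>
    #include <initializer_list>

    // Back-of-the-envelope: assume objects are spread uniformly (invented
    // density) and each shadow-casting light re-renders everything within
    // the shadow distance into its depth map.
    int main() {
        const double objectsPerSqUnit = 0.00005;       // invented density
        const int    shadowLights     = 4;             // lights in view (assumed)
        for (double dist : {5000.0, 10000.0, 20000.0}) {
            double area     = 3.14159 * dist * dist;   // grows with distance squared
            double perLight = objectsPerSqUnit * area;
            std::printf("distance %6.0f -> ~%6.0f redraws per light, ~%7.0f total\n",
                        dist, perLight, perLight * shadowLights);
        }
        return 0;
    }

Doubling the distance roughly quadruples the redraw work per light, which is why dropping from 10,000+ to 5,000 is such a visible win.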

You proved it was a problem on your end, not that it is a problem for everyone. I proved that myself when I just opened the game and tweaked the God Rays settings, and was only able to lose up to 5 fps between low- and ultra-quality God Rays. I managed to break performance when I changed the default God Rays rendering resolution from grid 32 up to grid 4096, and the game simply crawled at 9 fps; at no other time could I get the game to dip below 42 fps with God Rays at Ultra and the grid setting on the default 32. And this was in a very tree/plant-heavy area in North Boston with plenty of water and buildings all in view. For reference, I was achieving 54 fps in that area with God Rays off.

I was suggesting there may be merit to what the user "gigertag" had said about culling. I know the game uses spatial partitioning, and it might also use hardware-dependent occlusion query tests, so I was offering that his statement may be correct and that there may be a very big bug in regard to those technologies in the game engine.

An occlusion/culling bug is not an issue with optimisation; it is a bug, an error, something that isn't functioning correctly.
Optimising something means reducing, say, the amount of program code that is being processed while not removing any of the functionality said code was producing. In the case of code, you're optimising the written source so that the compiled assembly/machine code is smaller and the CPU has fewer instructions to perform, which increases performance even if only on a micro scale. A trivial example is below.
Thus I wasn't agreeing with you that optimisation is a problem, not in the way you are making it out that I am.
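
Here's the kind of trivial example I mean (nothing to do with Fallout's code, obviously):

    #include <cstdio>

    // Same output either way; the second loop just executes fewer
    // instructions per iteration because the invariant divide is hoisted
    // out. (A modern compiler will usually do this for you, but it
    // illustrates the principle.)
    int main() {
        const int n = 1000;

        double a = 0.0;
        for (int i = 0; i < n; ++i)
            a += i * (3.14159 / 180.0);    // recomputes the same divide n times

        double degToRad = 3.14159 / 180.0; // computed once, outside the loop
        double b = 0.0;
        for (int i = 0; i < n; ++i)
            b += i * degToRad;

        std::printf("%f %f\n", a, b);      // identical results
        return 0;
    }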

The reason I have gotten so technical is that a lot of people, seemingly yourself included, don't seem to understand just how complex and power-hungry some of these technologies are. Without understanding how the technology works under the hood, you can easily assume that it's an optimisation problem, or that the programmers are useless and can't program very well. It's better to provide the information so people understand and can re-evaluate their views, or discard it if they choose.

User avatar
Penny Flame
 
Posts: 3336
Joined: Sat Aug 12, 2006 1:53 am

Post » Sun Nov 29, 2015 10:27 pm

And this game definitely renders in less than 16 ms. My background was AV, and I'm very attuned to response times. It sounds silly, but 16 ms would be unplayable, except maybe with a regular controller.
User avatar
Andrew
 
Posts: 3521
Joined: Tue May 08, 2007 1:44 am

Post » Sun Nov 29, 2015 8:51 pm

If you are using Alternate Frame Rendering in DX11, effects like motion blur and other visual effects that rely on the previous frame may not work correctly (this is addressed by AMD and NVIDIA in public statements somewhere), because of the way the rendering pipeline now works: the previous frame may not be available to the second, third, or fourth GPU, for many different reasons. AFR is an old technology that was really only meant for use before DirectX 10/11 and OpenGL 4.0 came out.
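
A crude way to picture it (my own simplification, not how the drivers actually schedule work): with AFR, consecutive frames live on different GPUs, so any effect that reads the previous frame's output needs data that sits on the other card, forcing a transfer, a stall, or a visual glitch.

    #include <cstdio>

    // Crude AFR model: frame N renders on GPU (N % 2), so an effect like
    // motion blur that reads frame N-1's output always wants a buffer
    // that lives on the other GPU.
    int main() {
        const int gpuCount = 2;
        for (int frame = 1; frame <= 4; ++frame) {
            int gpu     = frame % gpuCount;
            int prevGpu = (frame - 1) % gpuCount;
            if (gpu != prevGpu)
                std::printf("frame %d (GPU %d) wants frame %d's buffer from GPU %d\n",
                            frame, gpu, frame - 1, prevGpu);
        }
        return 0;
    }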

If you are getting a boost, lucky you, but be aware that right now Bethesda isn't supporting multi-GPU setups, and you are in uncharted waters.

For me, CrossFireX didn't garner any kind of benefit and actually caused bugs with textures (human heads were entirely black and unlit) and some lighting that flickered sporadically. I also had to disable anti-aliasing, as it flickered so badly that it gave me a headache. In the end, the extra 50 degrees of heat output just wasn't worth it, and I gave up testing whether it improved my frame rate anywhere.

16ms is 16 milliseconds. To achieve 60 frames per second, a single rendered frame must take no longer than 16.6ms of real time, and that includes any behind-the-scenes AI and script calculation. If a frame plus game logic takes less than 16.6ms, then you are getting over 60 frames per second. 8.3ms corresponds to 120 frames per second.
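
In code terms it's just 1000 ms divided by the target frame rate:

    #include <cstdio>
    #include <initializer_list>

    // Frame-time budget: rendering, AI, and scripts all have to fit in it.
    int main() {
        for (int fps : {30, 60, 120}) {
            double budgetMs = 1000.0 / fps;
            std::printf("%3d fps -> %5.2f ms per frame\n", fps, budgetMs);
        }
        return 0;
    }

which prints 33.33 ms, 16.67 ms, and 8.33 ms respectively.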

User avatar
Brandi Norton
 
Posts: 3334
Joined: Fri Feb 09, 2007 9:24 pm

Post » Sun Nov 29, 2015 10:19 pm

Double post.

User avatar
JeSsy ArEllano
 
Posts: 3369
Joined: Fri Oct 20, 2006 10:51 am

Post » Sun Nov 29, 2015 9:08 pm

I did not modify anything in those quotes. They are letter-for-letter.

The godrays effect is patently the go-to setting when people are complaining about frame rates oddly dropping in certain areas. It's also clear that the optimization needed to make it a viable setting isn't there. When it brings hardware like Titan Xs and 980 Tis down to a crushing ~35 fps, there's a problem inherent in that setting. It's like a developer shipping a setting for ray-traced rendering and then you blaming a person's setup for not being able to run it, rather than the developer for enabling an option that no hardware can feasibly run.

I didn't list anything vague, let alone percentages in bullet form. I don't know where you're getting that from.

Now, what you proved wasn't that it was my system, but that it doesn't happen at every location, which was never in question. Go where I told you to go, and you will experience the exact same issues I presented to you.

User avatar
Paula Ramos
 
Posts: 3384
Joined: Sun Jul 16, 2006 5:43 am

Post » Sun Nov 29, 2015 8:02 pm

The programmers for this game are amazing in what they have created. This to me seems like a QA issue. Indoor vs. outdoor is weirdly unbalanced, like some older titles. We... all of us non-Bethesda-programmer employees... are all speculating here. Maybe it's a CPU timing issue on my end causing some arbitrary process to hang while the GPU is left waiting to render, or an overlooked distance effect despite dropping all settings to low. I don't know why. I do know that overcompensating by lowering a zillion settings doesn't fix the drop. It's probably not my hardware. And when inconsistent frame drops occur, it makes me scratch my head. As a side note, there are probably endless libraries of tricks to make the indoor engine work better without sacrificing perceived quality. Other than the weird, arbitrary frame rates that originated in cinema years ago and stuck, this beautiful, immersive, fun world Bethesda made really should be on par with our eyes' ability to perceive it, at the very least. Comparing acceptable gaming fps to cinema seems wrong to me.
User avatar
vanuza
 
Posts: 3522
Joined: Fri Sep 22, 2006 11:14 pm

Post » Sun Nov 29, 2015 6:29 am

Issues with AFR typically cause blinking every other frame, but in most games it's actually really rare. Like I said about the milliseconds, I get around 120 most of the time on my piece-of-crap hardware.
User avatar
Ally Chimienti
 
Posts: 3409
Joined: Fri Jan 19, 2007 6:53 am

Post » Sun Nov 29, 2015 1:02 pm

Our interaction with a virtual world needs a consistent time code. Not to write off the brilliant minds at Bethesda, they are awesome, but if something takes too much time to render on a relatively okay computer, it's time to trim some resource fat.
User avatar
Rob Smith
 
Posts: 3424
Joined: Wed Oct 03, 2007 5:30 pm

Post » Sun Nov 29, 2015 9:17 am

Given the benchmarks I've seen for 4K this past year, 40 fps at 4K sounds about right. Games just don't scale indefinitely.
User avatar
ZzZz
 
Posts: 3396
Joined: Sat Jul 08, 2006 9:56 pm

Post » Sun Nov 29, 2015 6:43 am

I think we can't ignore the root problem here: turning off everything just to get usable FPS in some parts of this game? I think we need to fix the issue, not band-aid it.

User avatar
Pawel Platek
 
Posts: 3489
Joined: Sat May 26, 2007 2:08 pm

Post » Sun Nov 29, 2015 9:45 am

There is a root issue, and it's not a pissing contest. If your 4K videos average 40, I must be doing something right :) Plus, I wouldn't play baseball or basketball in real life and have my frame rate drop right before I shoot, so why do game producers think it's okay?
User avatar
+++CAZZY
 
Posts: 3403
Joined: Wed Sep 13, 2006 1:04 pm

Post » Sun Nov 29, 2015 8:04 pm

The problem is down to too many draw calls, which DirectX 11 can't handle. DirectX 12 is meant to fix this by making the API more efficient. The solution is to wait until either Bethesda fixes the problem areas (which in Skyrim they didn't) or, hopefully, the community fixes them. Even if you are running Win10, the game isn't DirectX 12 (even though the XB1 is).

On a side note, I get ~2.4 million draw calls/s on 3DMark's API overhead test (DX11). If you run the same test, I bet you get roughly the same.
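
For some perspective on what that number buys you (rough arithmetic; a synthetic overhead result isn't exactly the in-game ceiling):

    #include <cstdio>
    #include <initializer_list>

    // If the API tops out around 2.4 million draw calls per second (my DX11
    // overhead result), the per-frame budget shrinks fast as frame rate rises.
    int main() {
        const double callsPerSecond = 2.4e6;
        for (int fps : {30, 60, 120}) {
            std::printf("%3d fps -> ~%6.0f draw calls per frame\n",
                        fps, callsPerSecond / fps);
        }
        return 0;
    }

That's roughly 80,000 calls per frame at 30 fps, but only 20,000 at 120 fps, which is why a draw-call-heavy scene falls apart first at high frame rates.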

User avatar
Felix Walde
 
Posts: 3333
Joined: Sat Jun 02, 2007 4:50 pm
