First... please, if you quote someone, quote them letter-for-letter...
People seem to think the God Rays effect is causing their games to run at sub-thirty frames per second, which isn't the case for the large majority of users within Fallout 4's minimum and recommended requirements bracket. Most of these people haven't disabled V-Sync, have meddled with things they shouldn't have, or simply haven't tweaked the game's settings to fit their system. A lot of people are expecting miracles and claiming those are perfectly acceptable expectations without any actual knowledge of how a lot of this stuff works internally.
Listing a vague percentage in bullet form, without video or picture evidence, doesn't prove me wrong or prove you right. I accepted that those were real data you'd gathered yourself, and that they apply to you.
In every test case with myself and several of my friends, when I helped them lower their shadow distance from 10,000+ down to 5,000, they all saw a visible increase in frames per second. Tweaking shadows further, down to 2048x2048 shadow-map textures and reducing the quality filter to High or Medium, improved things again. The range is the kicker, though: at 10,000+ the game is rendering a massive area of the game world onto a texture map... for each shadow-casting light source in the world... so it is essentially re-rendering everything visible within the light source's projection. That is a huge performance impact, and it's why a lot of games hard-limit dynamic shadow-casting lights to a handful, usually countable on one hand. Remember that each time it renders the world, the Bethesda Creation Engine re-renders even tiny items and characters, not just world-space geometry.
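To make that cost concrete, here's a minimal sketch of the standard shadow-mapping pattern. This is NOT Creation Engine code; Light, Scene and renderDepthOnly() are hypothetical names standing in for the general technique:

```cpp
#include <vector>

struct Light {
    bool  castsShadows;
    float shadowDistance;   // e.g. 5,000 vs 10,000+
    int   shadowMapSize;    // e.g. 2048 vs 4096 texels per side
};

struct Scene {
    // Draws every shadow-casting object inside the light's projection
    // into a shadowMapSize x shadowMapSize depth texture.
    void renderDepthOnly(const Light& light) { (void)light; /* GPU depth-only pass */ }
};

void shadowPass(Scene& scene, const std::vector<Light>& lights) {
    for (const Light& light : lights) {
        if (!light.castsShadows) continue;
        // The visible world inside this light's frustum is drawn AGAIN
        // here -- tiny clutter items and characters included. Cost grows
        // with shadowDistance (more objects submitted) and with
        // shadowMapSize squared (more texels filled), which is why both
        // tweaks above help, and why engines hard-limit these lights.
        scene.renderDepthOnly(light);
    }
}
```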
You proved it was a problem on your end, not that it's a problem for everyone. I proved that myself when I opened the game and tweaked the God Rays settings, and could only lose up to 5 fps between Low and Ultra quality God Rays. I only managed to break performance when I changed the default God Rays rendering resolution from grid 32 up to grid 4096, at which point the game crawled at 9 fps; at no other time could I get the game to dip below 42 fps with God Rays on Ultra and the grid setting at the default 32. And this was in a very tree- and plant-heavy area in North Boston with plenty of water and buildings in view. For reference, I was getting 54 fps in that area with God Rays off.
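As a rough illustration of why that grid setting dwarfs everything else: volumetric effects like God Rays are typically computed by marching samples through the scene, so the work scales directly with the sampling resolution. A toy sketch of my own (I can't say exactly what the grid parameter maps to inside NVIDIA's actual implementation):

```cpp
// Toy model of per-pixel ray marching for volumetric light scattering.
// Hypothetical and simplified -- the real implementation differs, but
// the scaling argument is the same.
float marchGodRay(int gridResolution /* e.g. 32 vs 4096 */) {
    float scattered = 0.0f;
    for (int step = 0; step < gridResolution; ++step) {
        // Each step samples the shadow map / participating medium once.
        // Going from 32 to 4096 is at least a ~128x jump in per-ray work
        // (more if the setting is the side length of a 2D grid), and this
        // runs for every affected pixel on screen -- hence 42 fps vs 9 fps.
        scattered += 0.0f; // placeholder per-sample contribution
    }
    return scattered;
}
```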
I was suggesting there may be merit to what the user "gigertag" said about culling. I know the game uses spatial partitioning and might also use hardware-dependent occlusion query tests, and I was offering that his statement may be correct and that there may be a very big bug in those parts of the game engine.
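For anyone unfamiliar, a hardware occlusion query works roughly like this. A minimal OpenGL sketch, assuming GLEW for the modern entry points; drawBoundingBox() is a hypothetical helper, and I'm not claiming this is how the Creation Engine does it:

```cpp
#include <GL/glew.h>

void drawBoundingBox();  // hypothetical: cheap proxy geometry for an object

bool objectIsVisible(GLuint query) {
    // Disable colour and depth writes; we only want depth-test results.
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);

    // Ask the GPU: do ANY pixels of the bounding box pass the depth test
    // against what has already been drawn (i.e. the occluders)?
    glBeginQuery(GL_ANY_SAMPLES_PASSED, query);
    drawBoundingBox();
    glEndQuery(GL_ANY_SAMPLES_PASSED);

    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_TRUE);

    // Reading GL_QUERY_RESULT immediately stalls the CPU until the GPU
    // catches up; real engines usually read last frame's result instead.
    // Get this wrong (stale results, stalls) and you have exactly the
    // kind of culling bug being discussed: hidden objects still drawn,
    // or visible ones wrongly skipped.
    GLint anySamplesPassed = 0;
    glGetQueryObjectiv(query, GL_QUERY_RESULT, &anySamplesPassed);
    return anySamplesPassed != 0;
}
```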
An occlusion/culling bug is not an issue with optimisation; it is a bug, an error, something that isn't functioning correctly.
Optimising something means reducing, say, the amount of program code being processed without removing any of the functionality that code was producing. In the case of code, you're optimising the written source so that the compiled assembly/machine code output is smaller and the CPU has fewer instructions to perform, which increases performance even if only on a micro scale.
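A trivial illustration of what I mean (my own example, nothing to do with Fallout 4's code):

```cpp
#include <cmath>
#include <cstddef>

// Before: the same loop-invariant work is redone on every iteration.
void scaleAll(float* values, std::size_t count, float base) {
    for (std::size_t i = 0; i < count; ++i)
        values[i] *= std::sqrt(base) * 0.5f;
}

// After: identical output for identical input, but the invariant is
// hoisted, so the compiled loop body is fewer instructions. (A good
// optimising compiler may do this itself, but the principle is what
// matters: same functionality, less work for the CPU.)
void scaleAllOptimised(float* values, std::size_t count, float base) {
    const float factor = std::sqrt(base) * 0.5f;
    for (std::size_t i = 0; i < count; ++i)
        values[i] *= factor;
}
```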
Thus I wasn't agreeing with you that optimisation is the problem, at least not in the way you're making it out that I was.
The reason I have gotten so technical is that a lot of people, seemingly yourself included, don't understand just how complex and power-hungry some of these technologies are. Without understanding how the technology works under the hood, you can easily assume it's an optimisation problem, or that the programmers are useless and can't program very well. It's better to provide the information so people understand and can re-evaluate their views, or discard it if they choose.