My argument is this: many of the features that people claim make all the difference actually don't change the experience much at all. In most games, unless you specifically go out of your way to look for them, they are mostly unnoticeable; it's only when the player stops dead in his/her tracks and deliberately compares settings that a difference shows up.
Under normal conditions and general gameplay, almost everything each fancy setting does only makes a real impact on your experience at a subconscious level rather than a conscious one.
In other words, sure, that rock looks better on top settings when you stop and look at it, but while actually playing and not comparing objects, who really cares?
Now even if we start posting screenshots of rocks, the difference between those rocks is clear at each end of the scale (low vs. very high) but not so much between adjacent steps (low vs. medium, or medium vs. high), despite changes taking place in each case.
Now factor in forward motion, the eye's concentration on targets and objectives, etc., and what were very clear and precise differences become motion blur at best. So in a real-case scenario, rather than at a subconscious level, does a player on very high settings have a better experience than a player on very low settings, when it comes to graphics?
I have several comparison videos and screenshots bookmarked but prefer to withhold them at this stage of the debate; I am curious what types of responses my question is going to get....
Whatever your budget, system, views or thoughts are, please post them up to be reviewed.