I'd love to play at 60fps, but that's a little difficult when the game requires me to disable one of my two GPUs so I'm not staring at a 30" strobe light. Or, alternatively, to play at lower-than-desired or non-native resolutions (native being 2560x1600).
I'm not sure where you're making a distinction here. Do you not want the most incredible experience your equipment can provide? Nobody is saying that unoptimized and slow (but resource-pegging nonetheless) is preferable to optimized and under-utilizing your resources.
What people want is something that really pushes the limits. Something that sets a standard of "this is why we play PC games". Ideally, that means optimized, and therefore capable of achieving great results if you throw more power at it. Crysis 2 may provide amazing graphics, but it also provides ridiculous hurdles. You know, like disabling your entire second GPU to overcome major flickering problems, which reduces overall performance. I don't think asking to be able to utilize your entire card to its limit is asking for too much.
Flickering on SLI systems is a bug. A bug that will be addressed in the next patch. It has nothing to do with asking for DirectX 11, and it has nothing to do with people saying Crysis 2 is a dumbed-down game. Crysis 2 already pushes my GTX 580 to the limit: it plays at 60fps on a 1920x1200 monitor, it is the only game that consumes 380W while I play, and it is the only game with graphics like this. None of my 50+ games (apart from Crysis 1) can compare to it graphically. So the game already delivers everything you're asking for.
No, the lack of DX10/DX11 isn't really relevant to whether or not Crysis 2 is pushing technical boundaries. They might be able to do some cool things with those APIs, but they're not necessary. I frankly don't understand the hang-up people have over "when will it support DX11?!". I'd like these people to name three important DX11 features that Crysis 2 actually needs. I doubt they can think of any.
Also, there are plenty of games out there that will push a modern card. Even at 1920x1200 (I play everything at 2560x1600 unless I absolutely need to lower it). If you crank everything up at 1920x1200, pretty much any modern game will fully use that card just to get you 45-60fps.
It kind of reminds me of when Vanguard: Saga of Heroes came out. It was supposed to support DX10, but it didn't. They said they'd eventually add it (even though, as I recall, it was advertised as supporting it from the start). Around a year later, they finally implemented it. But in the meantime, the game looked fantastic. People just bitched about it not supporting DX10, because they knew that 10 is a bigger number than 9. *eyeroll*
All you have to do is play it instead of wasting time on useless complaints.
Well, personally, I'll be doing just that. As soon as they have fixed it so it can actually be played.
Personally, I think they've done a fine job on the ultimate potential of the product. My concern is not whether it supports DX10 or DX11, but the significant release problems in a game that is supposed to make the PC gaming experience shine above the rest (and for which they're charging console prices). I can only imagine the effect experiences like this have on gamers who are a little less dedicated to the PC. Just another little chink in the armor, so to speak (alongside chinks like Fallout 3 long after release and Fallout: New Vegas even today).