I looked at benchmarks for cards recently, and the DX11 tests grind the FPS into the ground. They may technically be capable, but could a random mid/low-end graphics card that supports DX11 actually handle the bells and whistles people seem to think should be default? And that's ignoring the fact that most gamers don't even use DX11 cards yet anyway.
Sure, why not? The thing about tech demos is that they describe what games could be like in 2, 3, 4 years, not now. As there are very few DX11-capable games, tech demos are all we have to go on. Every DX11-capable game I've tried has run absolutely fine on my GTS 450, which is pretty much as low as you can go while still being a DX11-capable card.
Elite PC Gamer thread here. Tessellation is sweet and all, but I hardly notice it in games. The only ones I've played are Metro 2033 and Call of Pripyat, so I haven't seen much.
And stuff like the Heaven Benchmark is specifically designed to showcase DX11 and nothing else.
From what I see, DX11 is just too small a market for Bethesda to cater to.
That's, uh, that's the point. Not noticing it means it's doing its job. The idea isn't to be flashy and scream "Look at me!"; it's to dynamically scale model complexity with distance, so you can have very complex models near you but much simpler ones as they get further away. You're not *meant* to notice it.
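To make that concrete, here's a rough sketch of the kind of distance-to-detail curve a DX11 hull shader computes per patch. It's plain C++ rather than actual shader code, and the distances and cutoffs are made-up illustrative numbers, not pulled from any real game:

```cpp
#include <algorithm>
#include <cstdio>

// Illustrative sketch only: a distance-based tessellation factor of the
// kind a DX11 hull shader would compute per patch. nearDist, farDist and
// the linear falloff are assumptions for the example, not a real engine's values.
float TessFactorForDistance(float distanceToCamera)
{
    const float nearDist  = 5.0f;   // closer than this: full detail
    const float farDist   = 100.0f; // further than this: minimum detail
    const float maxFactor = 64.0f;  // DX11's hardware tessellation cap
    const float minFactor = 1.0f;   // effectively no subdivision

    // Fade detail out linearly as the patch gets further from the camera.
    float t = (distanceToCamera - nearDist) / (farDist - nearDist);
    t = std::min(std::max(t, 0.0f), 1.0f);
    return maxFactor + t * (minFactor - maxFactor);
}

int main()
{
    // A rock right in front of you gets heavily subdivided;
    // the same rock 80 metres away is barely touched.
    std::printf("at  2m: factor %.1f\n", TessFactorForDistance(2.0f));  // 64.0
    std::printf("at 80m: factor %.1f\n", TessFactorForDistance(80.0f)); // ~14.3
}
```

Which is exactly why you don't notice it: by the time a model is far enough away to be simplified, it's too far away for the simplification to be visible.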
As for DX11 vs OpenGL, it's important to note that these are both abstraction layers: they don't add any capabilities to the GPU, they just expose what's already there. Whether you're using OpenGL or DirectX is meaningless on a console. DX11 is a thing on PC because newer cards share a standardised feature set that DX11 exposes, but consoles are fixed hardware, so the distinction doesn't matter there. They're never going to get any better; you have what you have.