That is not true. Graphics simply are not discussed rationally. The stereotype that computers are far more powerful than consoles, and that it's the consoles holding back computer graphics, gets recited over and over until the argument turns ad hominem. The problem with this is that what is actually holding back computer graphics is computer software.
If you don't believe me here is a tech article discussing it. http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/1
If you swear by PCs, check out the second page of the article and see if it doesn't change your mind about what kind of playing experience you can have on a console.
The problem with that article is that its basis - the words spoken by that guy from AMD - isn't actually true and was taken heavily out of context, something he himself later put right. There is certainly a performance hit from the abstraction layer, but that's inherent.
The thing is, there is nothing technical holding back PC graphics. If there were, something would be bottlenecking, and nothing is - even on the most graphically demanding games, mid-to-high-end cards have room to spare. If we were bottlenecked in either hardware or software, they wouldn't have any spare power. They do. Technologies like Eyefinity or 3D support have had to be invented simply to justify the existence of higher-end cards. There is nothing inside the machine holding it back; it's simply not being used.
While the lovely John Carmack puts the performance gain from writing without abstraction layers - "directly on the metal", so to speak - at about 2x, even conservative estimates put modern PC hardware at 15 or 20x faster than console hardware. As it has always been, the fixed platform of a console is far more efficient, but the moving platform of a PC overtakes it quickly through sheer overwhelming brute force. The idea that PC games don't look 10x better is also somewhat flawed - running at 2.5x the resolution and twice the framerate is already running the game much, much better than a console, even if the game looks the same. On top of that, in most games the console settings correspond to roughly low-to-medium PC settings. 10x better isn't really that outrageous - and the highest-end cards have significant power to spare even pushing things harder than that.
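The resolution/framerate arithmetic above can be sketched in a few lines. The figures here are illustrative assumptions, not measurements from any particular game: a console target of 1280x720 at 30 fps versus a PC running 1920x1080 at 60 fps.

```python
def pixel_throughput(width, height, fps):
    """Pixels rendered per second at a given resolution and framerate."""
    return width * height * fps

# Assumed targets (hypothetical, for illustration only)
console = pixel_throughput(1280, 720, 30)   # 720p at 30 fps
pc = pixel_throughput(1920, 1080, 60)       # 1080p at 60 fps

# 1080p has 2.25x the pixels of 720p; doubling the framerate
# doubles that again, so the PC is pushing 4.5x the pixels/sec.
print(f"Resolution ratio: {(1920 * 1080) / (1280 * 720):.2f}x")  # 2.25x
print(f"Throughput ratio: {pc / console:.1f}x")                  # 4.5x
```

So even before any settings are raised, the same-looking game is already doing several times the raw rendering work on the PC.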
Graphics are discussed quite rationally, whether you agree with the conclusions or not. Even on consoles, there are better-looking open-world games. Contrary to popular opinion, the number of NPCs that aren't actually loaded, or the map that's waiting to be streamed from the disc, has very little bearing on texture quality, animation quality, or shader quality. The GPU doesn't care how much stuff is there for it to do later, and it doesn't care about AI, or streaming the local environment, or whether your cities are open.