Most gamers consider anything less than 30fps unplayable, myself included. The rare dip below 30 is acceptable to me, but the bare minimum framerate I'd play a game at is 24fps. 24fps is the framerate of film, and even that is choppy for a video game, especially because unlike film projection, TVs mostly refresh at multiples of 50 or 60Hz. On 50Hz sets, movies are sped up and played at 25fps; on 60Hz sets they go through 2:3 pulldown, taking advantage of interlacing when duplicating frames. Because of this, 24p mode actually looks choppy to many people when scenes are panning, whereas in a theater there's no refresh rate to worry about and persistence of vision means you don't see the judder.
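To see why 2:3 pulldown judders on pans, here's a little sketch (my own illustration, not from any real video pipeline) of how long each film frame actually sits on a 60Hz screen:

```python
# Sketch: why 24fps film judders on a 60Hz display.
# With 2:3 pulldown, successive film frames are held for alternating
# counts of refresh cycles (2, 3, 2, 3, ...), so the on-screen duration
# of each frame is uneven even though the average works out to 24fps.

REFRESH_HZ = 60
CYCLE_MS = 1000 / REFRESH_HZ  # ~16.67 ms per refresh cycle

def pulldown_durations(n_film_frames):
    """On-screen duration (ms) of each film frame under 2:3 pulldown."""
    durations = []
    for i in range(n_film_frames):
        cycles = 2 if i % 2 == 0 else 3  # alternate 2 and 3 cycles
        durations.append(cycles * CYCLE_MS)
    return durations

print([round(d, 2) for d in pulldown_durations(4)])
# [33.33, 50.0, 33.33, 50.0]
# The average is 41.67ms, i.e. exactly 24fps, but frames alternate
# between ~33ms and ~50ms on screen -- that uneven cadence is the judder.
```

Over any full second (24 film frames) the durations sum back to 1000ms, which is why the fps number looks correct even though the cadence isn't.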
Now on a PC, especially with a great rig, I try to get most of my games to run at 60fps, and even at 50fps you can definitely tell the dip. I still consider that playable, and I'll play a game at 30fps if I must, but 30fps is usually undesirable to PC users unless it's somebody who's just happy to get the game running at all (like people on integrated graphics). 30fps is fine in consoleville, but not so much on PC.
Not only that, but usually it's not a raw fps issue so much as a smoothness and consistent frame delivery issue. Things like turning off vsync help with that, since the graphics hardware can just dump whatever it has and try to draw the next frame as quickly as possible. Personally I live with the tearing to make sure I get fairly consistent frame delivery, as vsync can unduly make gameplay look worse.
Some of this is engine design. If the game engine doesn't handle vsync, buffering, etc. correctly, then you'll get all sorts of strife and triple buffering will make no difference. Vsync and extra buffers can cause more slowdown depending on the game, engine, graphics hardware, CPU, and operating system.
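A rough sketch of the vsync effect I mean (assuming plain double buffering, with made-up render times): a frame that misses the refresh deadline has to wait for the next vsync, so render times just over the ~16.7ms budget get rounded up to a whole extra refresh on screen:

```python
# Sketch: how vsync (double buffering) can make delivery LESS consistent.
# A frame that isn't ready at the vsync waits for the next refresh, so
# its displayed time is quantized up to a multiple of the refresh period.
# Render times here are illustrative, not measured from any real game.
import math

REFRESH_MS = 1000 / 60  # 60Hz display, ~16.67ms per refresh

def displayed_times_vsync(render_times_ms):
    """Round each render time up to the next whole vsync interval."""
    return [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in render_times_ms]

render = [15.0, 18.0, 15.0, 18.0]  # hovering around the 16.7ms budget
print([round(t, 2) for t in displayed_times_vsync(render)])
# [16.67, 33.33, 16.67, 33.33]
# A 3ms difference in render time becomes a full-refresh swing on screen.
```

With vsync off, those same frames would display at roughly 15-18ms each (plus tearing), which is the more consistent delivery I'm talking about. Triple buffering changes the latency trade-off but, as above, only if the engine actually handles it properly.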
So fps isn't really an indication, per se, of how good your gaming experience is going to be if behind the scenes you get 5 or so frames in a row delivered late because of underlying issues. Sure, it'll look like your awesome setup is doing 40fps, but really the hardware could be capable of 45fps, and those 5 frames are the microstutter people are seeing. For example...
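Putting made-up numbers on that example: a rig that can push 45fps worth of frames, but 5 out of every 45 take twice as long. The fps counter averages everything out and happily reports 40fps:

```python
# Sketch with illustrative numbers: average fps hides microstutter.
# 35 frames at the rig's real ~22.2ms cost, plus 5 frames taking
# twice as long, average out to a clean-looking 40fps.

def avg_fps(frame_times_ms):
    """Average fps implied by a list of per-frame times in ms."""
    return 1000 / (sum(frame_times_ms) / len(frame_times_ms))

FAST = 1000 / 45                         # ~22.2ms: what the rig can do
stutter = [FAST] * 35 + [2 * FAST] * 5   # 5 hitched frames take double

print(round(avg_fps(stutter), 1))  # 40.0 -- the counter says all is well
print(round(max(stutter), 1))      # 44.4 -- but the worst frames judder
```

That's why frame-time graphs (minimums and spikes) tell you more about smoothness than the average fps number does.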
My 5970 is only now starting to be used as it should be by this game, after the ATI 10.10c driver upgrade and now this patch. I'll need to pull out GPU-Z to see how much of my GPU cores are being used now; before, it was 99% on one core and lagging badly around NPCs, and then ~30% per core but still lagging.