Dude, the human eye can see way above 24 frames per second. Somewhere around 70fps is where most people stop noticing much of a difference, though I'm not positive on the exact number. Films don't look juddery because you can't see higher. It's because films have motion blur baked in - each frame captures everything that moved while the shutter was open, which smears the frames together and makes the motion look continuous. Most games need to run higher because the vast majority of them don't use motion blur at all, or use per-object blur that doesn't smooth over a low framerate and just produces a nice, cool-looking visual effect, nothing more. Crysis 1 was one of the first games to use motion blur in a way that actually makes a low framerate (around 20fps) still fairly playable, because it's blurring the frames together. If you don't believe me, go download this comparison video:
http://www.mediafire.com/?ybmlymn2nnl
I'd link you to a YouTube version of it, but YouTube caps videos at 30fps, so the comparison wouldn't survive the upload. If you don't want to download a tiny video, there's also this much simpler comparison, though it doesn't illustrate the point as well as the video does:
http://www.boallen.com/fps-compare.html
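If you'd rather see the idea in code, here's a minimal toy sketch (mine, not anything from an actual engine) of what frame-blending motion blur does: render at a high internal rate, then average groups of sub-frames into each displayed frame, so a moving object leaves a smear instead of jumping between positions.

```python
import numpy as np

def blur_frames(subframes, group_size):
    """Average consecutive groups of sub-frames into output frames.

    subframes: array of shape (n, H, W), rendered at a high internal rate
    group_size: how many sub-frames get blended into one displayed frame
    """
    n = (len(subframes) // group_size) * group_size
    groups = subframes[:n].reshape(-1, group_size, *subframes.shape[1:])
    return groups.mean(axis=1)

# Toy scene: a bright dot moving one pixel per sub-frame.
H, W, n = 8, 32, 24
sub = np.zeros((n, H, W))
for t in range(n):
    sub[t, H // 2, t] = 1.0

# 24 sub-frames -> 8 displayed frames, each blending 3 sub-frames.
# Instead of the dot jumping 3 pixels every frame, each frame shows
# a 3-pixel smear, which reads as continuous motion at playback.
out = blur_frames(sub, 3)
print(out.shape)           # (8, 8, 32)
print(out[0, H // 2, :4])  # roughly [0.33, 0.33, 0.33, 0.0] - the smear
```

That smear is the whole trick: at 20fps without blur your brain sees a slideshow of sharp, disconnected positions, but with the positions blended together it fills in the motion for you.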
The reason movies are (or were, at least, before the digital age) shot at 24fps? Because it's about the lowest framerate they could get away with before things started to look terrible. The higher the framerate, the more film stock you have to shoot and print, because there are more frames. You know those classic film reels they mounted onto projectors? A film shot at 60fps would need 2.5 times as much film as one shot at 24fps, because 60/24 = 2.5 times as many frames.
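Back-of-the-envelope numbers, if you want them (assuming standard 4-perf 35mm stock, which runs 16 frames per foot):

```python
# Rough film-stock math for a two-hour feature.
# Assumes standard 4-perf 35mm film: 16 frames per foot.
FRAMES_PER_FOOT = 16
runtime_s = 2 * 60 * 60  # two hours in seconds

for fps in (24, 60):
    frames = fps * runtime_s
    feet = frames / FRAMES_PER_FOOT
    print(f"{fps}fps: {frames:,} frames = {feet:,.0f} feet of film")

# 24fps: 172,800 frames = 10,800 feet of film
# 60fps: 432,000 frames = 27,000 feet of film
# That's 2.5x the stock to buy, develop, print, and ship to every theater.
```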
So basically, it's 24fps because it's cheaper.