The FPS-versus-what-you-actually-see debate is fugazi anyway. Researchers found we can process an image in about 13 milliseconds, which works out to roughly 75 FPS. But who cares.
There's this guy on YouTube who dances at 60 FPS. It takes some getting used to, and it always leans a bit towards the uncanny valley. Dunno how I'd react to 120 FPS. Probably have a seizure or something.
This is probably the case; Skyrim was best capped in ENB at 54 FPS.
People see 30 FPS and freak out, but different games have different requirements for enjoyable gameplay. In a competitive shooter, the higher your FPS the better; a minimum near 60 FPS (assuming a 60Hz monitor) is great and gives no one an advantage over you. But something like a single-player RPG or Total War is just fine as long as your minimum FPS isn't below 30. And that's the other thing: a capped, constant 30 can be a better experience than swinging from 60 down to 30. For example, many people with really good rigs found the 30 FPS cutscenes in DA:I jarring compared to the gameplay.
One, average framerates don't give the full picture if the minimums are significantly lower (which is why tech sites now show frame latencies as well as frame rates in GPU reviews), and two, it's about the gameplay experience, not the numbers. A 120 FPS average with serious dips can feel like stuttering, while a capped 30 or 60 can feel like a smoother experience.
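To make the averages-vs-minimums point concrete, here's a toy Python sketch (made-up frame-time numbers, nothing from any real benchmark): two runs with basically the same average FPS, where one has nasty frame-time spikes the average completely hides.

```python
# Hypothetical frame times in milliseconds (illustrative numbers only).
# "smooth" holds a steady ~60 FPS; "dippy" has the same average but spikes.
smooth = [16.7] * 100                  # every frame takes ~16.7 ms
dippy = [8.0] * 90 + [95.0] * 10       # mostly fast frames, with 95 ms hitches

def avg_fps(frame_times_ms):
    """Average FPS = total frames divided by total time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_percentile_ms(frame_times_ms, pct=99):
    """Frame time at the given percentile -- where the stutter shows up."""
    ordered = sorted(frame_times_ms)
    return ordered[int(len(ordered) * pct / 100) - 1]

print(avg_fps(smooth), worst_percentile_ms(smooth))  # ~59.9 FPS, 16.7 ms
print(avg_fps(dippy), worst_percentile_ms(dippy))    # ~59.9 FPS, 95.0 ms
```

Same average, but the second run spends a tenth of its frames at 95 ms, which is effectively a momentary 10 FPS, and that's exactly what frame-latency charts in GPU reviews are there to catch.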
So... as a first time pc builder and as far as Bethesda games are concerned, it'd be smarter to go for a 60Hz IPS monitor instead of a higher Hz equivalent, correct?
As far as Beth games are concerned, I would say yes; for other games it's a case-by-case situation. The other thing to consider: if your GPU is capable of rendering a game at 100 FPS on max settings and you cap it at 60 (the latest cards from both AMD and Nvidia have better frame-limiting/v-sync options than previous generations), your GPU may run cooler, quieter, and draw less power.
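Driver-level caps do this properly, but the basic idea of a frame limiter is simple enough to sketch. This is just a crude sleep-based illustration in Python (not how Nvidia or AMD actually implement it): finish the frame's work, then idle out the rest of the frame budget, which is exactly the time the GPU isn't burning power rendering frames nobody sees.

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60 FPS

def run_capped(render_frame, num_frames):
    """Crude frame limiter: sleep away whatever is left of each frame's budget."""
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()  # do the actual work for this frame
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            # Idle for the remainder of the budget; this idle time is why
            # a capped GPU runs cooler and draws less power.
            time.sleep(FRAME_BUDGET - elapsed)

# e.g. run_capped(lambda: None, 60) takes roughly one second
```

Real limiters use busy-waits or driver hooks for precision (plain `sleep` can overshoot by a few milliseconds), but the power-saving logic is the same.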