Not sure which article you're referring to - it may be one I've already seen. I did see http://arstechnica.co.uk/gaming/2015/08/directx-12-tested-an-early-win-for-amd-and-disappointment-for-nvidia/ which showed that, in one particular game, Nvidia saw little benefit from DX12 over DX11, but as they pointed out, that could have been because:
- It was just one game (and we have no information on how well the developers optimised that game for the particular Nvidia card the article authors were using);
- Nvidia's architecture and drivers may be highly optimised for the more linear submission model of DX11, and so see less benefit from the switch to DX12.
Also, as AMD had already done the work of optimising their drivers for Mantle, I can imagine they were further along in getting optimised drivers for DX12 than Nvidia were.
Certainly, in that article the Nvidia card only saw a reduction in framerate under DX12 in one test, where the article authors had downgraded the CPU by disabling two cores and hyperthreading - and, frankly, if you're going to hobble a CPU like that I'd be surprised if you didn't get a few peculiar results.
I'd still maintain that, if a developer wants to get the best out of any GPU architecture, then having as thin an API as possible is the way to go (Mantle or DX12 for AMD, DX12 for Nvidia). However, getting closer to the hardware does mean the game developer has to do more of the optimisation themselves - and not all developers will be equally good at that.
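To give a feel for what "more of the optimisation" means in practice, here's a minimal D3D12 sketch (my own illustration, nothing from the article - the file name and structure are just for show) of work the DX11 driver used to do for you behind the immediate context: creating the command queue, recording and submitting a command list, and fencing the CPU against the GPU. Under DX12 the developer manages, and can therefore tune, all of this themselves:

```cpp
// Minimal D3D12 submission + synchronisation sketch (error handling mostly
// omitted for brevity). Build with MSVC: cl /EHsc sketch.cpp d3d12.lib
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a device on the default adapter; no window or swap chain is
    // needed for this sketch.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // Under DX11 the driver owned command submission; under DX12 the
    // application creates and manages the queue, allocator and list itself.
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<ID3D12CommandAllocator> allocator;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                   IID_PPV_ARGS(&allocator));

    ComPtr<ID3D12GraphicsCommandList> cmdList;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              allocator.Get(), nullptr,
                              IID_PPV_ARGS(&cmdList));

    // A real renderer would record draws and barriers here, potentially from
    // several threads at once; we just close the empty list.
    cmdList->Close();

    // Explicit submission to the GPU.
    ID3D12CommandList* lists[] = { cmdList.Get() };
    queue->ExecuteCommandLists(1, lists);

    // CPU/GPU synchronisation is also the application's job now: signal a
    // fence on the queue and block until the GPU has reached it.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    queue->Signal(fence.Get(), 1);
    fence->SetEventOnCompletion(1, done);
    WaitForSingleObject(done, INFINITE);
    CloseHandle(done);

    std::printf("GPU work submitted and completed.\n");
    return 0;
}
```

Multiply that by resource residency, memory barriers and pipeline state objects, and you can see why a studio that's good at this can beat the old driver heuristics - and one that isn't can easily do worse.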
I agree - there's no reason to worry about which graphics hardware we have, whatever middleware or extra features Bethesda use for each brand. As it's a cross-platform game, I reckon Bethesda will only have minor bells and whistles varying from one GPU type to another.