Attached is a screenshot of my spreadsheet results:
http://31.media.tumblr.com/faaaae085466e7e3c2eb6defc55607ea/tumblr_mw08qcyWCX1smcvdbo1_1280.png
1) In my Mod I have a "meter" that begins at 1440
2) Every in-game hour the Meter decreases by 30
3) This means it takes 48 Hours for the Meter to reach zero.
4) 30 is therefore "100%" efficiency.
5) With Perks, the Player can reduce the rate at which the Meter drains; each Perk lowers the drain by 1.5 per in-game hour.
6) With all ten perks, the Meter decreases at a rate of 15 per in-game hour.
7) Therefore, the Meter should last twice as long, since it drains at only 50% of the base rate (see the sketch just below this list).
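To make the reasoning above concrete, here is a minimal sketch of the calculation as I understand it (assuming the Meter drains at a flat rate per in-game hour, as described in the list), which just divides the Meter length by the drain rate:

```python
METER_LENGTH = 1440  # starting value of the Meter

def hours_to_drain(units_per_hour):
    # Hours for the Meter to reach zero at a constant drain rate
    return METER_LENGTH / units_per_hour

print(hours_to_drain(30))  # 48.0 hours at the base rate
print(hours_to_drain(15))  # 96.0 hours with all ten Perks, i.e. twice as long
```

So if that is right, halving the drain rate doubles the drain time, because hours = 1440 / rate.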
Unfortunately I am not great with maths, and I do not quite trust the results I've worked out for how many in-game hours the Meter should last.
Meter Length = how long the Meter is (1440, constant)
Units Per Hour = how much the Meter decreases per in-game hour (variable; drops by 1.5 per Perk across all ten Perks, though the table goes beyond that)
Percentage = how much longer the Meter takes to drain, as a percentage
Raw Hours = hours relative to the percentage efficiency
Total Hours to Drain = Raw Hours * 2 (since, for example, at 100 per hour you drain 200 in 2 hours, but at 50 per hour you only drain 100 in 2 hours, so it would take 4 hours to drain 200)
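And here is a rough cross-check of the table itself. This is my own reconstruction rather than the exact spreadsheet formulas, and the meaning of the Percentage column is an assumption on my part; it just works out the drain time directly as Meter Length / Units Per Hour for each Perk level, which should match the Total Hours to Drain column if the spreadsheet is right:

```python
METER_LENGTH = 1440    # constant Meter length
BASE_RATE = 30.0       # units drained per in-game hour with no Perks
PERK_REDUCTION = 1.5   # drain reduction per Perk

for perks in range(11):  # 0 through 10 Perks
    units_per_hour = BASE_RATE - perks * PERK_REDUCTION
    # Assumption: "Percentage" means drain time relative to the base rate
    percentage = BASE_RATE / units_per_hour * 100
    total_hours = METER_LENGTH / units_per_hour  # hours for the Meter to hit zero
    print(f"{perks:2d} perks: {units_per_hour:4.1f}/hr  {percentage:6.1f}%  {total_hours:5.1f} hours")
```

Whatever the intermediate columns do, the final column should come out to 1440 divided by the drain rate, so 48 hours with no Perks and 96 hours with all ten.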
As I say, I'm not great with maths, and although this looks right to me, I don't fully trust that I've worked it out correctly.
I'd appreciate it if someone better at these things could take a look and tell me whether it seems to follow.