How exactly does assigning a non-integer value to an int variable round? Does it always round down (ignoring all decimal places), round up from .5, or something else?
I'm going to be doing a bunch of math with the output of GetRandomPercent, and I need to figure out where things will round up vs. down in the final int value.
(I am assuming that calculations are done as floats and only rounded at the end of an operation? I.e., something like "set intvariable to (GetRandomPercent * 1.5)" would actually yield a range of 0 - 148 (or 149?), leaving the 1.5 intact through the operation...)
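For concreteness, here's a rough C sketch of the two behaviors I'm asking about -- this is just an illustration, not the engine's actual conversion, and it assumes GetRandomPercent tops out at 99 so the largest intermediate value is 148.5:

    /* Illustration only -- not the game engine's actual conversion.
       Assumes GetRandomPercent returns at most 99, so 99 * 1.5 = 148.5. */
    #include <math.h>
    #include <stdio.h>

    int main(void) {
        float value = 99 * 1.5f;            /* 148.5, top of the range in question */
        int truncated = (int)value;         /* drop the decimals entirely -> 148   */
        int rounded = (int)roundf(value);   /* round .5 and above up      -> 149   */
        printf("value=%.1f truncated=%d rounded=%d\n", value, truncated, rounded);
        return 0;
    }

So the question boils down to: does the int assignment behave like the "truncated" line or the "rounded" line here?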