I am mocking the static random() method of the Math class in Java so that a method which relies on generated random numbers can be tested deterministically. The mock implementation looks like this:
@Mock
public double random() {
    return 1.0 - Double.MIN_VALUE;
}
The intent is to return a value as close to 1.0 as possible without actually equaling it (i.e. 0.999999999999...).
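For completeness, a @Mock method like this lives inside a mock-up class; the full setup looks roughly like the following (this sketch assumes JMockit, since applying @Mock to a method is its convention):

import mockit.Mock;
import mockit.MockUp;

// Assumed JMockit setup: while this MockUp is in effect,
// calls to Math.random() are redirected to the fake below.
new MockUp<Math>() {
    @Mock
    public double random() {
        return 1.0 - Double.MIN_VALUE;
    }
};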
However, when the mocked Math.random() is invoked, it always returns exactly 1.0. It is as if subtracting Double.MIN_VALUE from 1.0 has no effect at all.
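The subtraction behaves the same way in plain Java, with no mocking framework involved; here is a minimal check (the class name is arbitrary):

public class MinValueDemo {
    public static void main(String[] args) {
        // Double.MIN_VALUE is the smallest positive double (~4.9e-324),
        // not the most negative one, and it is far smaller than the gap
        // between 1.0 and the next representable double below it.
        System.out.println(Double.MIN_VALUE);              // 4.9E-324
        System.out.println(1.0 - Double.MIN_VALUE);        // 1.0
        System.out.println(1.0 - Double.MIN_VALUE == 1.0); // true
    }
}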
Why does 1.0 - Double.MIN_VALUE evaluate to 1.0, and how can I simulate the largest value that Math.random() can actually return?