I noticed some surprising behavior in one of my C# functions in Unity: a float passed into the function takes an extraordinary value inside it.
I know floats can lose precision when a value needs more significant digits than the format can hold, but in this case the corruption only happens inside the function, and I get similar results with doubles.
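For reference, ordinary float imprecision looks like this (plain C#, just for illustration; not my actual code):

float sum = 0f;
for (int n = 0; n < 10; n++)
    sum += 0.1f; // 0.1 is not exactly representable in binary
System.Console.WriteLine(sum);       // prints roughly 1.0000001, not exactly 1
System.Console.WriteLine(sum == 1f); // False

That kind of drift is tiny; what I'm seeing is a jump of eighteen orders of magnitude.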
using UnityEngine;

public class Repro : MonoBehaviour // hypothetical name; these fields and methods live in one MonoBehaviour
{
    private float i = 0f;
    private float whatever1;
    private float whatever2;

    private void Awake()
    {
        // Unity disallows Random.Range in instance field initializers, so initialize here.
        whatever1 = Random.Range(1f, 10f); // some value in the 1-10 range
        whatever2 = Random.Range(1f, 10f); // some value in the 1-10 range
    }

    private float MyFunc(float j)
    {
        Debug.Log(j.ToString());
        float max = whatever1 > whatever2 ? whatever1 : whatever2;
        return j - max;
    }

    public void Update()
    {
        i += Random.Range(1/60f, 1/50f);
    }

    public void Test()
    {
        Debug.Log(i);
        MyFunc(i);
        MyFunc(67f);
    }
}
// 10.256463
// 4.622506E+18
// 4.622506E+18
So I tried debugging and logging the values of i and j. The value is correct outside of MyFunc, but becomes a weird 4.622506E+18 as soon as it enters the function.
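One way to confirm this isn't just a display artifact is to dump the raw bit pattern on both sides of the call (sketch; Inspect is a hypothetical helper, not part of my original code):

// Hypothetical helper: log a float's value together with its raw bit pattern.
private static void Inspect(string label, float value)
{
    byte[] bytes = System.BitConverter.GetBytes(value);
    int bits = System.BitConverter.ToInt32(bytes, 0);
    Debug.Log(label + ": " + value + " bits=0x" + bits.ToString("X8"));
}

// Call Inspect("outside", i) right before the call and Inspect("inside", j)
// as the first statement of MyFunc: if the two bit patterns differ, the value
// really is changing at the call boundary rather than being misprinted.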
I've tried several things so far:
- using Mathf.Max instead of the ternary operator (variant shown after this list)
- using doubles instead of floats
- passing a constant to the function instead of a variable (see MyFunc(67f))
- rebuilding the solution/project
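For completeness, the Mathf.Max variant I tried looks like this (same behavior as the ternary version):

private float MyFunc(float j)
{
    Debug.Log(j.ToString());
    float max = Mathf.Max(whatever1, whatever2); // replaces the ternary; j is already wrong by this point
    return j - max;
}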
The result: for a value of i in the 9-15 range, j has a value between 4.0E+18 and 6.0E+18. j takes that weird value before MyFunc performs any operation on the float, so it doesn't seem related to floating-point arithmetic issues.
I ended up inlining the body of the function wherever I needed it, but I still don't understand what is happening. Can you explain?
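For reference, the inlined workaround looks roughly like this (sketch):

public void Test()
{
    Debug.Log(i);
    // MyFunc's body copied to the call site; used this way, the value stays correct.
    float max = whatever1 > whatever2 ? whatever1 : whatever2;
    float result = i - max;
    Debug.Log(result);
}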
