I am going a bit crazy here with the following section of code:
  public static readonly float f = 1.25f;
  public static void init(){
     Debug.Log(f); // output: 1.25f
     FLIPPER_CENTERS = new float[,] { 
           { (20*f), (27*f) },    { FLIPPER_WIDTH - (20*f), (27*f)},
           { (6*f), (25*f) },     { MH_FLIPPER_WIDTH- (6*f), (25*f) },
           { (8), (15)},          { (SMALL_FLIPPER_WIDTH - 8), (15)},
           { (8), (20)},          { (67 - 8), (20)},
     };
     Debug.Log(FLIPPER_CENTERS[0,0]); // output: 0, expected 25;
  }
If I print the values of the first element of that array, I get [0, 0]. The last element is [59, 20], as expected.
The first element is supposed to be [25, 33.75], which I do get if I replace (20*f) with (20*1.25f):
        { (20*1.25f), (27*1.25f) },    { FLIPPER_WIDTH - (20*f), (27*f)},
So here is the problem: if I leave the multiplication by f in the array initialization, the values come out as 0. However, if I replace f with the literal 1.25f, all is good.
I have tried to figure out what is going on, but to no avail. I am certain that the value of f is 1.25f and not 0. Can anyone shed some light on this for me, please?
Edit:
To prove that f is not 0, I've tried 20/f for the first element. That didn't throw an exception and the output was still 0.
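As far as I know, floating-point division by zero in C# never throws; it yields Infinity, so only the printed value really tells you anything. A minimal sanity check of that, outside Unity (the names below are just placeholders):

  using System;

  public class DivisionCheck {
      static readonly float zero = 0f;
      static readonly float oneAndQuarter = 1.25f;

      public static void Main() {
          // Dividing a float by zero does not throw; it yields Infinity.
          Console.WriteLine(20 / zero);           // Infinity, no exception
          // If f really were 1.25f, 20/f would print 16.
          Console.WriteLine(20 / oneAndQuarter);  // 16
      }
  }

Neither case prints 0, which makes the 0 I am seeing in Unity even more puzzling.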
Partial Solution
Changing f from readonly to const solves the problem. However, I would very much like to know why this is happening.
  public const float f = 1.25f;
All this is running in Unity, which may have something to do with it.
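From what I understand, the difference is that a const is a compile-time constant: the compiler folds 20*f into 25f before the array initializer ever runs, whereas a readonly static field is a real field that is read at runtime and therefore depends on static initialization having already happened by then. A minimal sketch of that difference, again outside Unity (names are placeholders):

  using System;

  public class ConstVsReadonly {
      public const float ConstF = 1.25f;              // inlined by the compiler at every use
      public static readonly float ReadonlyF = 1.25f; // a real field, read at runtime

      public static void Main() {
          // 20 * ConstF is constant-folded to 25f, so no field access happens here.
          Console.WriteLine(20 * ConstF);    // 25

          // 20 * ReadonlyF loads the field when this line runs, so the result
          // depends on the field already having been initialized at that point.
          Console.WriteLine(20 * ReadonlyF); // 25
      }
  }

That would explain why const side-steps the problem, but not why the readonly field reads as 0 inside init() while the Debug.Log(f) right above it prints 1.25.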