Mathematically, 1 / f * f should equal 1. I tested this identity with the code below, and it fails for f = 41.000000 and f = 47.000000.
I suspect this is related to floating-point rounding, but I don't understand the cause. What makes this happen?
#include <stdio.h>

int main(void) {
    float f;
    for (f = 1; f < 50; f += 1) {
        /* print every f for which the identity fails */
        if (1 / f * f != 1)
            printf("f=%f\n", f);
    }
    return 0;
}