I have a function that returns an unsigned long whose bits actually hold a float.
We'll call that function unsigned long foo()
When I make a call to printf like this:
printf("%f", foo());
it always prints out 0.000000
but when I cast it like this:
unsigned long bar = foo();
printf("%f", *((float *)(&bar)));
it outputs the float correctly.
Printing with "%x", I can see that the binary representations are different. Surprisingly, the unsigned long version looks more like an actual floating-point representation (41ba019a vs. 40000000).
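Roughly, that check looked like this (reconstructed from memory; the comments note which line showed which value):

unsigned long bar = foo();
printf("%x\n", bar);                  /* shows 41ba019a */
printf("%x\n", *((float *)(&bar)));   /* shows 40000000 */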
I double-checked that on my system unsigned long and float are the same size.
So my question is:
How can casting the pointer like this change the value of what is being pointed to?
Edit: The relevant part of foo() is essentially
unsigned long foo()
{
  float a = 22.4;
  return *(unsigned long *)(&a);  /* reinterpret the float's bits as an unsigned long */
}
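If it helps, here is a minimal complete program along the lines of what I'm doing, on a system where unsigned long and float are both 4 bytes (22.4 is just a stand-in value; my real foo() computes something else):

#include <stdio.h>

unsigned long foo()
{
  float a = 22.4;
  return *(unsigned long *)(&a);  /* reinterpret the float's bits as an unsigned long */
}

int main(void)
{
  unsigned long bar = foo();

  printf("%f\n", foo());               /* prints 0.000000 for me */
  printf("%f\n", *((float *)(&bar)));  /* prints 22.400000 */
  return 0;
}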