I was helping someone with their homework and ran into a strange issue. The problem is to write a function that reverses the byte order of a signed integer (that's how the function was specified, anyway), and this is the solution I came up with:
int reverse(int x)
{
    int reversed = 0;
    reversed = (x & (0xFF << 24)) >> 24;  /* top byte down to the bottom */
    reversed |= (x & (0xFF << 16)) >> 8;  /* second byte down to the third */
    reversed |= (x & (0xFF << 8)) << 8;   /* third byte up to the second */
    reversed |= (x & 0xFF) << 24;         /* bottom byte up to the top */
    return reversed;
}
If you pass 0xFF000000 to this function, the first assignment results in 0xFFFFFFFF. I don't really understand what's going on, but I know it has something to do with conversions back and forth between signed and unsigned, or something along those lines.
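Here's a minimal standalone example that reproduces it (I'm assuming a typical two's-complement machine where right-shifting a negative int is an arithmetic shift, since the standard leaves that implementation-defined):

#include <stdio.h>

int main(void)
{
    int x = 0xFF000000;                  /* converts to a negative int when int is 32 bits */
    int top = (x & (0xFF << 24)) >> 24;  /* the masked value is negative, so >> sign-extends */
    printf("%08X\n", (unsigned)top);                      /* FFFFFFFF here, not 000000FF */
    printf("%08X\n", ((unsigned)x & 0xFF000000u) >> 24);  /* 000000FF */
    return 0;
}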
If I append ul to 0xFF, it works fine, which I assume is because the constant is forced to unsigned and the rest of the expression follows, or something in that direction. The resulting code also changes: without the ul suffix the compiler emits sar (shift arithmetic right), but with it, it uses shr as intended.
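For reference, this is the version with the suffix appended, which compiles to shr for me:

int reverse(int x)
{
    int reversed = 0;
    /* with the ul suffix the masks are unsigned long, so x is converted
       to unsigned before the &, and >> becomes a logical shift (shr) */
    reversed = (x & (0xFFul << 24)) >> 24;
    reversed |= (x & (0xFFul << 16)) >> 8;
    reversed |= (x & (0xFFul << 8)) << 8;
    reversed |= (x & 0xFFul) << 24;
    return reversed;
}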
I would really appreciate it if someone could shed some light on this for me. I'm supposed to know this stuff, and I thought I did, but I'm really not sure what's going on here.
Thanks in advance!