int f(int n)
{
    int i, c = 0;
    for (i=0; i < sizeof(int)*8; i++, n >>= 1)
        c = (n & 0x01)? c+1: c;
    return c;
}
It's an exercise I found in my book, but I really don't get it!
It counts the number of bits set in the passed-in parameter n (assuming your machine has 8-bit bytes).  I'll comment inline with your code (and fix the terrible formatting):
int f(int n)
{
    int i;     // loop counter
    int c = 0; // initial count of set bits is 0
    // loop once per bit in an int (sizeof(int) * 8, typically 32),
    // shifting n right by one bit each time through the loop
    for (i = 0; i < sizeof(int) * 8; i++, n >>= 1) 
    {
        // if the current LSB of 'n' is set, increment the counter 'c',
        // otherwise leave it the same
        c = (n & 0x01) ? (c + 1) : c;  
    }
    return c;  // return total number of set bits in parameter 'n'
}
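For example, if you drop the function above into a small test program (the main below is just an illustration), you can check it against values whose binary form you already know:

#include <stdio.h>

int f(int n);  /* the function shown above */

int main(void)
{
    printf("%d\n", f(7));    /* 7 is binary 111, so this prints 3          */
    printf("%d\n", f(255));  /* 255 is binary 1111 1111, so this prints 8  */
    printf("%d\n", f(0));    /* no bits set, so this prints 0              */
    return 0;
}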
The n & 0x01 is a bitwise AND that masks off every bit of n except the lowest one, so the result is 1 when that bit is set and 0 when it is clear; it doesn't turn any bits of n on or off.
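For instance, tracing two loop iterations by hand with n = 6 (binary 110, just an example value) shows how the mask and the shift work together:

int n = 6;            /* binary 110                         */
int bit = n & 0x01;   /* 0: the lowest bit of 110 is clear  */
n >>= 1;              /* n is now 3, binary 011             */
bit = n & 0x01;       /* 1: the lowest bit of 011 is set    */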
