This question is not a duplicate of *Count the number of set bits in a 32-bit integer*. See the comment by Daniel S. below.
--
Let's say there is a variable `int x;`. Its size is 4 bytes, i.e. 32 bits.
Then I assign a value to this variable, `x = 4567` (binary `1000111010111`), so in memory it looks like this:
00000000 00000000 000**10001 11010111**
Is there a way to get the length of the part that actually matters? In my example that length is 13 bits (marked in bold above).
`sizeof(x)` returns 4, i.e. 4 bytes, which is the size of the whole int. How do I get the minimum number of bits required to represent the value, without the leading zeros?
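
The obvious fallback I can think of is to shift the value right until it becomes 0 and count the iterations. A minimal sketch of that idea (assuming the value is treated as a 32-bit unsigned quantity, and that a bit length of 0 for the value 0 is acceptable):

```c
#include <stdio.h>

/* Bit length of an unsigned value: position of the highest set
 * bit plus one, i.e. the minimum number of bits needed to
 * represent the value. Returns 0 for an input of 0. */
static unsigned bit_length(unsigned v)
{
    unsigned n = 0;
    while (v != 0) {
        v >>= 1;  /* drop one bit per iteration */
        n++;
    }
    return n;
}

int main(void)
{
    printf("%u\n", bit_length(4567u));  /* prints 13 */
    return 0;
}
```

On GCC and Clang I know there is `__builtin_clz` (count leading zeros), so `32 - __builtin_clz(x)` should give the same answer for nonzero `x` on a 32-bit int, but that is compiler-specific. Is there a portable or standard way to do this?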