It is difficult to believe when you think about it, but the C programming language had no boolean type until C99 added _Bool (and the stdbool.h header that defines bool). Before that, programmers came up with their own solutions, and over time it became convention to use an int, with 0 for false and any nonzero value for true. When a boolean was eventually added, it was a true first-class boolean type: it can only ever hold the values 0 and 1.
You can test this for yourself in Objective-C with something like this:
    bool foo = 55;
    NSLog((foo) ? @"foo, it is true" : @"no, foo is false"); // will log true
    NSLog(@"foo's int value: %d", (int)foo);                 // will log 1
Now, bool does use a full 8 bits of storage; it's just that the compiler normalizes any value you store into it, so a bool only ever holds 0 or 1 no matter what you assign.
Now, Objective-C has been around since the early 1980s, long before C99. So BOOL is actually older than bool!
This is why it was historically defined as a signed char, which can store values from -128 to 127. Again, you can test this in Xcode:
    BOOL bar = 55;
    NSLog((bar) ? @"bar, it is true too!" : @"no, bar neither"); // will again log true
    NSLog(@"bar's int value: %d", (int)bar); // logs 55 on 32-bit builds but 1 on 64-bit iOS builds
Yes, but not always, as you can see. In 64-bit iOS builds, BOOL is defined as bool!
From objc.h:
    #if !defined(OBJC_HIDE_64) && TARGET_OS_IPHONE && __LP64__
    typedef bool BOOL;
    #else
    typedef signed char BOOL;
    // BOOL is explicitly signed so @encode(BOOL) == "c" rather than "C"
    // even if -funsigned-char is used.
    #endif
Because of bool's and BOOL's messy legacy in C and Objective-C, the truth is that testing for == YES or == true is not always 100% reliable. Instead I recommend you test for != false or != NO, or just test the value directly. Sounds ridiculous, right?
Here's a little blog post about it I found on Big Nerd Ranch.
PS: I totally understand that you're talking about compiler conditionals, but you did tag it Objective-C :)