I have an OpenSSL server and an Objective-C client. I send a message like this:
uint32_t testD = 161;
err = SSL_write(ssl_, &testD, sizeof(uint32_t));
and read it with an NSInputStream like this:
case NSStreamEventHasBytesAvailable:
        {
                uint8_t buffer[4];
                int len;
                while ([inStream hasBytesAvailable])
                {
                    len = [inStream read:buffer maxLength:sizeof(buffer)];
                    if (len > 0)
                    {
                        NSString *output = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
                        NSData *theData = [[NSData alloc] initWithBytes:buffer length:len];
                        if (nil != output)
                        {
                            char buff;
                            [theData getBytes:&buff length:1];
                            uint32_t temp = (uint32_t)buffer;
                        }
        ...
So in output I get "¡" (character 161), in buff I get '\xa1', and in temp a very big number, but what I actually need in temp is 161.
I've read that '\xa1' is also 161, but I can't cast it to uint32_t. What is the problem?
ANSWER:
The problem was in the casting: (uint32_t)buffer converts the buffer's address into an integer, not the byte it contains, and a plain char may be signed, so the byte has to be read as an unsigned value. This works fine for me:
unsigned char buff;
int temp = buff;
or
char buff;
int b = (unsigned char) buff;
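
Also keep in mind that SSL_write on the server sends all four bytes of the uint32_t, so reading just the first byte drops the other three. Below is a minimal sketch of the stream handler that copies the whole value out of the buffer; it assumes exactly four bytes arrive in a single read and that both ends use the same byte order (names like inStream are taken from the question):

case NSStreamEventHasBytesAvailable:
{
    uint8_t buffer[4];
    // read:maxLength: returns NSInteger
    NSInteger len = [inStream read:buffer maxLength:sizeof(buffer)];
    if (len == sizeof(uint32_t))
    {
        uint32_t value;
        // copy the bytes out of the buffer instead of casting the pointer
        memcpy(&value, buffer, sizeof(value));
        // if the server sent the value with htonl(), convert it back here:
        // value = ntohl(value);   // declared in <arpa/inet.h>
        NSLog(@"received %u", value);   // prints 161 for the example above
    }
    break;
}

In practice the four bytes can also arrive split across several reads, so a real client should accumulate incoming bytes in an NSMutableData and only convert once four bytes are available.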