I am writing some C code in which I am given the following typedefs:
typedef uint8_t BYTE;
typedef int8_t SBYTE;
typedef uint16_t WORD;
typedef uint32_t DWORD;
typedef uint64_t QWORD;
I have a function whose prototype is QWORD exp_d(const DWORD *in).
I also have a pointer to a QWORD, declared as QWORD *input. Since it points to a QWORD, the pointed-to value is 64 bits wide. Now I want to pass the least-significant 32 bits of *input to exp_d. What I am doing is:
QWORD expansion = exp_d(((DWORD *)input) + 1);
My reasoning is: input is a QWORD *, so first casting it to a DWORD * and then incrementing it by 1 (to get to the next DWORD, i.e. what I take to be the least-significant 32 bits of the QWORD) should do the trick. However, when I pass that pointer to exp_d, the function receives the most-significant 32 bits of *input rather than the least-significant 32 bits I expected.
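In case it helps, here is a minimal, self-contained sketch of what I am seeing. The exp_d body below is only a stand-in that returns the DWORD it receives (my real exp_d does more), and the test value 0x1122334455667788 is made up for illustration:

#include <stdint.h>
#include <stdio.h>
#include <inttypes.h>

typedef uint8_t  BYTE;
typedef int8_t   SBYTE;
typedef uint16_t WORD;
typedef uint32_t DWORD;
typedef uint64_t QWORD;

/* Stand-in for my real exp_d: just hands back the DWORD it was given. */
QWORD exp_d(const DWORD *in)
{
    return (QWORD)*in;
}

int main(void)
{
    QWORD value = 0x1122334455667788ULL;   /* made-up test value */
    QWORD *input = &value;

    /* Cast to DWORD*, step forward one DWORD, and call exp_d on that. */
    QWORD expansion = exp_d(((DWORD *)input) + 1);

    printf("expansion = 0x%08" PRIX64 "\n", expansion);
    /* I expected to see 0x55667788 (the low half), but on my machine
       I get 0x11223344 (the high half) instead. */
    return 0;
}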
Where am I going wrong?