I am able to load a certain DLL and call a certain function in that DLL using some basic Win32 C/C++ code.
The function I call expects the address of a caller-allocated buffer, which the call fills; at a given offset it writes a 16-bit integer.
Using a debugger and a breakpoint, I can inspect the memory after the call. At the given offset, the 16-bit integer is correct: little-endian, so LSB first.
Now I want to make the same call from Node.js:
const ffi = require('ffi-napi');
const ref = require('ref-napi');
const StructType = require('ref-struct-napi');
const ArrayType = require('ref-array-napi');
const hllapi = ffi.Library('WHLAPI32', {
  'WinHLLAPI': [ref.types.void, [ref.refType(ref.types.uint16),
                                 ref.refType(ref.types.void),
                                 ref.refType(ref.types.uint16),
                                 ref.refType(ref.types.uint16)]]
});
var QuerySessionStatusStruct = StructType({
    ShortName: ref.types.uchar,         //  1
    LongName: ArrayType('uchar', 8),    //  9
    Type: ref.types.uchar,              // 10
    Characteristics: ref.types.uchar,   // 11
    Rows: ref.types.uint16,             // 13
    Cols: ref.types.uint16,             // 15
    CodePage: ref.types.uint16,         // 17
    Reserved: ref.types.uchar           // 18
});

const function_number = ref.alloc(ref.types.uint16, 22);
const QuerySessionStatusStructInstance = new QuerySessionStatusStruct();
QuerySessionStatusStructInstance.ShortName='B';
const length = ref.alloc(ref.types.uint16, 18);
const ps_position = ref.alloc(ref.types.uint16, 0);
hllapi.WinHLLAPI(function_number, QuerySessionStatusStructInstance.ref(), length, ps_position );
console.log( "RC is: " + ps_position.deref());
console.log( 'Long Name is: '+String.fromCharCode.apply(null, QuerySessionStatusStructInstance.LongName));
console.log( 'Rows is: '+QuerySessionStatusStructInstance.Rows);
Result:
RC is: 0
Long Name is: MAINFRAM
Rows is: 20480
Rows should be 24! (In C++ I can see the bytes 0x18 0x00 at that offset.) And 20480 is 0x5000... What am I doing wrong?
EDIT: since I was seeing the correct LSB in C++, I had the odd idea of changing the structure declaration from:
Rows: ref.types.uint16,             // 13
to:
Rows: ref.types.uchar,              // 12
Padding: ref.types.uchar,           // 13
and now I have the correct output:
RC is: 0
Long Name is: MAINFRAM
Rows is: 24
But if the uchar at that offset is 0x18, how can the uint16 read as 0x5000? I understand nothing here! It is very frustrating...
