echo hello | iconv -f ascii -t utf-16le | od -x

produces what seems to me like a big-endian result

0068    0065    006c    006c    006f    000a

whereas the same command without the le suffix produces, on a little-endian system (OS X),

echo hello | iconv -f ascii -t utf-16 | od -x

fffe 6800 6500 6c00 6c00 6f00 0a00

Does od -x change the endianness?

phuclv

1 Answer

$ echo hello | iconv -f ascii -t utf-16le | hexdump -C
00000000  68 00 65 00 6c 00 6c 00  6f 00 0a 00              |h.e.l.l.o...|
$ echo hello | iconv -f ascii -t utf-16le | od -t x1
0000000 68 00 65 00 6c 00 6c 00 6f 00 0a 00

The question is really about how 'od' handles endianness. When you ask it to display units larger than a single byte (-x displays 16-bit words), it defaults to whatever byte order is native to the system it's running on.
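
If you want the dump to come out in a fixed order regardless of the host, GNU od (coreutils 8.23 or later) has an --endian option; the BSD od shipped with macOS may not support it, in which case dumping single bytes with -t x1 as above is the portable fallback. A sketch of what that would look like on any host:

$ echo hello | iconv -f ascii -t utf-16le | od --endian=little -x
0000000 0068 0065 006c 006c 006f 000a
$ echo hello | iconv -f ascii -t utf-16le | od --endian=big -x
0000000 6800 6500 6c00 6c00 6f00 0a00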

Your macOS machine is probably running on an Intel x86_64 CPU, which is little-endian, so the bytes {0x68, 0x00} do indeed represent the 16-bit number 0x0068 when 'od' decodes them.
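
A quick way to convince yourself that it's od doing the grouping, and not iconv changing the bytes, is to ask iconv for big-endian output on the same machine; -x then shows the halfwords flipped, which is exactly what host-order pairing of the byte stream 00 68 00 65 ... produces (the output below is what you would expect on a little-endian host):

$ echo hello | iconv -f ascii -t utf-16be | od -x
0000000 6800 6500 6c00 6c00 6f00 0a00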

grawity