This also applies to char32_t and the intXX_t types. The specification states:
2.14.3.2:
The value of a char16_t literal containing a single c-char is equal to its ISO 10646 code point value, provided that the code point is representable with a single 16-bit code unit.
5.3.3.1:
[..] in particular [..] sizeof(char16_t), sizeof(char32_t), and sizeof(wchar_t) are implementation-defined
I cannot see anything about the intXX_t types, apart from the note that they are "optional" (18.4.1).
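(For the optional exact-width types, the detection I have in mind looks roughly like the sketch below; this is my own example, not text from the standard. As far as I understand it, <cstdint> only provides the limit macros such as INT16_MAX when the corresponding type actually exists, so their absence can be tested at compile time.)

    #include <cstdint>   // optional exact-width types and their limit macros
    #include <climits>   // CHAR_BIT

    #ifdef INT16_MAX
    // std::int16_t exists: an exact-width type has exactly 16 bits and no
    // padding, so its object size in bits should be exactly 16.
    static_assert(sizeof(std::int16_t) * CHAR_BIT == 16,
                  "int16_t occupies exactly 16 bits");
    #else
    // The implementation provides no 16-bit exact-width type at all.
    #endif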
If a char16_t isn't guaranteed to be 2 bytes, is it guaranteed to be 16 bits wide (even on architectures where 1 byte != 8 bits)?
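To make the question concrete, this is the kind of compile-time check I would like to rely on (again my own sketch, assuming CHAR_BIT from <climits> gives the bits per byte):

    #include <climits>   // CHAR_BIT: number of bits in a byte

    // What I hope is portable:
    static_assert(sizeof(char16_t) * CHAR_BIT >= 16,
                  "char16_t provides at least 16 bits");

    // What I suspect is NOT guaranteed, e.g. on a platform with
    // CHAR_BIT == 16, where sizeof(char16_t) could legitimately be 1:
    // static_assert(sizeof(char16_t) == 2, "char16_t is exactly 2 bytes");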