I found these excerpts in the C++ standard (quotations taken from N4687, but it will likely have been there since forever):
> **[char.traits.typedefs]**
>
> For a certain character container type `char_type`, a related container type `INT_T` shall be a type or class which can represent all of the valid characters converted from the corresponding `char_type` values, as well as an end-of-file value, `eof()`.
> **[char.traits.require]**
>
> *Expression:* `X::eof()` — *Type:* `X::int_type` — *Returns:* a value `e` such that `X::eq_int_type(e, X::to_int_type(c))` is `false` for all values `c`.
>
> *Expression:* `X::eq_int_type(e,f)` — *Type:* `bool` — *Returns:* for all `c` and `d`, `X::eq(c,d)` is equal to `X::eq_int_type(X::to_int_type(c), X::to_int_type(d))` (...)
>
> `c` and `d` denote values of type `CharT`; (...); `e` and `f` denote values of type `X::int_type`
> **[char.traits.specializations.char]**
>
> `using char_type = char;` `using int_type = int;`
> **[basic.fundamental]**
>
> Plain `char`, `signed char`, and `unsigned char` are three distinct types, collectively called narrow character types. (...) A `char`, a `signed char`, and an `unsigned char` occupy the same amount of storage and have the same alignment requirements (...) For narrow character types, all bits of the object representation participate in the value representation. (...) For unsigned narrow character types, each possible bit pattern of the value representation represents a distinct number.
>
> There are five standard signed integer types: "`signed char`", "`short int`", "`int`", "`long int`", and "`long long int`". In this list, each type provides at least as much storage as those preceding it in the list.
I haven't found anything in the surrounding text that prevents `sizeof(int) == 1`. This is obviously not the case on most modern platforms, where `sizeof(int)` is 4 or 8, but the possibility is explicitly used as an example, e.g. on cppreference:

> Note: this allows the extreme case in which bytes are sized 64 bits, all types (including `char`) are 64 bits wide, and `sizeof` returns 1 for every type.
## The question
If `int` were only as large as `char`, the standard would leave almost no room for an object representation of the former that compares unequal (via `to_int_type`) to every value of the latter — just some corner cases (such as a negative zero existing in `signed char` but mapping to `INT_MIN` in `int`) that are unlikely to be implemented efficiently in hardware. Moreover, with P0907 it seems even `signed char` will not allow two different bit patterns to represent equal values, forcing it to have 2^N distinct values for an N-bit representation — and `int` as well — closing every possible loophole.
How, on such a platform, would one conform to the requirements of `std::char_traits<char>`? Do we have a real-world example of such a platform and the corresponding implementation?