I don't understand the relationship between UTF-8 and the other Unicode encodings, and I am getting anomalous results at the terminal. For example, the rightwards arrow is:
0xE2 0x86 0x92 in UTF-8
but it is
0x2192 in UTF-16 (code point U+2192 in Unicode)
I don't understand how E2 86 92 is equivalent to 2192.
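From what I can piece together, a three-byte UTF-8 sequence spreads the 16 bits of the code point across the pattern 1110xxxx 10xxxxxx 10xxxxxx. The bash sketch below (plain shell arithmetic; the variable name cp is just mine) does reproduce the bytes above, but I would like to confirm I am reading the encoding rules correctly:

# byte 1: 1110 + top 4 bits; bytes 2-3: 10 + next 6 bits / last 6 bits
cp=0x2192
printf '%02x %02x %02x\n' \
  $(( 0xE0 |  (cp >> 12)        )) \
  $(( 0x80 | ((cp >> 6) & 0x3F) )) \
  $(( 0x80 |  (cp & 0x3F)       ))
# prints: e2 86 92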
Also, the UTF-8 version does not seem to work in my Linux terminal, which is set to UTF-8 encoding with a DejaVu font that supports Unicode. For example, if I enter
echo -e "\u2192"
Then I get an arrow, great, correct, it works. But, if I enter
echo -e "\xe2\x86\x92" or
echo -e "\x00\x00\x21\x92"
Then I get garbled output instead of an arrow. Why are my hex sequences wrong?
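In case it is relevant, one way I can think of to check which bytes each command actually emits is to pipe into od, which takes the terminal's rendering out of the picture (od -An -tx1 prints the raw bytes as hex; nothing here is specific to my setup):

# dump the bytes each echo produces, bypassing the terminal display
echo -e "\u2192"            | od -An -tx1
echo -e "\xe2\x86\x92"      | od -An -tx1
echo -e "\x00\x00\x21\x92"  | od -An -tx1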