I have come across a legacy piece of code that encodes a byte array to a hex string. It is in production and has never caused an issue.
This code is used as follows:
- We encrypt a user password. The encryptor returns a byte[].
- We convert the byte[] to a hex String using this encoder and then use that String representation in our properties file and so on, roughly as sketched below.
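For context, here is a minimal sketch of that flow; the byte values, class name, and property key are hypothetical stand-ins, and only the call to our encoder is real:

import java.util.Properties;

public class StoreEncryptedPassword {
    public static void main(String[] args) {
        // Hypothetical stand-in for the encryptor's output; ours returns a byte[].
        byte[] encrypted = {0x12, 0x34, (byte) 0xab, (byte) 0xcd};
        // Encode to hex text so it can be stored in a properties file.
        String hex = ByteArrayToHexEncoder.encodeHexBytesWithPadding(encrypted);
        Properties props = new Properties();
        props.setProperty("user.password.encrypted", hex); // hypothetical key
        System.out.println(props);
    }
}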
However, yesterday we hit a password whose encrypted byte[] is being encoded incorrectly.
import java.math.BigInteger;
import java.util.HashMap;
import org.apache.commons.codec.DecoderException;
import org.apache.commons.codec.binary.Hex;
public class ByteArrayToHexEncoder {
    public static void main(String[] args) throws DecoderException {
        String hexString = "a0d21588c0a2c2fc68dc859197fc78cd"; // correct hex representation
        // equivalent byte array: this is the byte array returned by the encryptor
        byte[] byteArray = Hex.decodeHex(hexString.toCharArray());
        // legacy encoder
        System.out.println("Legacy code encodes as: " + encodeHexBytesWithPadding(byteArray));
        // commons-codec encoder
        System.out.println("Commons codec encode as: " + new String(Hex.encodeHex(byteArray)));
    }
    private static final String PADDING_ZEROS =
            "0000000000000000000000000000000000000000000000000000000000000";
    private static final HashMap<Integer, Character> MAP_OF_HEX = new HashMap<>();
    static {
        MAP_OF_HEX.put(0, '0');
        MAP_OF_HEX.put(1, '1');
        MAP_OF_HEX.put(2, '2');
        MAP_OF_HEX.put(3, '3');
        MAP_OF_HEX.put(4, '4');
        MAP_OF_HEX.put(5, '5');
        MAP_OF_HEX.put(6, '6');
        MAP_OF_HEX.put(7, '7');
        MAP_OF_HEX.put(8, '8');
        MAP_OF_HEX.put(9, '9');
        MAP_OF_HEX.put(10, 'a');
        MAP_OF_HEX.put(11, 'b');
        MAP_OF_HEX.put(12, 'c');
        MAP_OF_HEX.put(13, 'd');
        MAP_OF_HEX.put(14, 'e');
        MAP_OF_HEX.put(15, 'f');
    }
    public static String encodeHexBytesWithPadding(byte[] inputByteArray) {
        String encodedValue = encodeHexBytes(inputByteArray);
        int expectedSize = inputByteArray.length * 2;
        if (encodedValue.length() < expectedSize) {
            int zerosToPad = expectedSize - encodedValue.length();
            encodedValue = PADDING_ZEROS.substring(0, zerosToPad) + encodedValue;
        }
        return encodedValue;
    }
    public static String encodeHexBytes(byte[] inputByteArray) {
        String encodedValue;
        if (inputByteArray[0] < 0) {
            // Something is wrong here! Don't know what!
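            // (As far as I can tell, the intent is to stop BigInteger from
            // interpreting the array as a negative two's-complement number:
            // clear the high nibble of the first byte, encode the rest, and
            // prepend the cleared nibble manually.)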
            byte oldValue = inputByteArray[0];
            inputByteArray[0] = (byte) (oldValue & 0x0F);
            int nibble = (oldValue >> 4) & 0x0F;
            encodedValue = new BigInteger(inputByteArray).toString(16);
            inputByteArray[0] = oldValue;
            encodedValue = MAP_OF_HEX.get(nibble) + encodedValue;
        } else {
            encodedValue = new BigInteger(inputByteArray).toString(16);
        }
        return encodedValue;
    }
}
The legacy code encodes it as 0ad21588c0a2c2fc68dc859197fc78cd, while the correct expected value is a0d21588c0a2c2fc68dc859197fc78cd: the first two hex digits come out swapped.
I am trying to understand what's wrong with the encoder and would appreciate some help.
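One thing I noticed while digging, though I'm not sure it's the cause: BigInteger.toString(16) returns the minimal representation of the number, so leading zero nibbles are dropped. A standalone snippet demonstrating this (the class name is just for the demo):

import java.math.BigInteger;

public class LeadingZeroDemo {
    public static void main(String[] args) {
        // This is what the start of my array looks like after the legacy code
        // masks the first byte 0xa0 down to 0x00: {0x00, 0xd2, 0x15, 0x88, ...}.
        byte[] masked = {0x00, (byte) 0xd2, 0x15, (byte) 0x88};
        // Prints "d21588", not "0d21588": the zero nibble from the masked
        // first byte disappears.
        System.out.println(new BigInteger(masked).toString(16));
    }
}

If that is what's happening, the encoded string comes out one character short, and the padding in encodeHexBytesWithPadding then prepends the missing 0 at the front instead of after the first nibble, which would explain the swapped digits. Can someone confirm whether that is the right explanation?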