I have the string "abcdefghij" and I want to get this string as bits. I tried it like this:
byte[] K = System.Text.Encoding.UTF8.GetBytes(args[1]); // GetBytes returns a new array, no need to pre-allocate one
var d = new BitArray(K); // needs using System.Collections; (note the capital K, C# is case-sensitive)
In K I get [0x61, 0x62, ..., 0x6A], which is fine. But in d I get [1000 0110, 0100 0110, ..., 0101 0110] (not literally, it is just an array of true/false values). Within each byte, BitArray maps index 0...7 from the least significant to the most significant bit. That's not what I want.
I want the bits stored from most significant to least significant: [0110 0001, 0110 0010, ..., 0110 1010].
How can I deal with this?
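The only workaround I have come up with so far is to reverse the bits inside each byte before constructing the BitArray, so that its LSB-first indexing ends up presenting each byte MSB-first. A rough sketch (ReverseBits is my own helper, not something from the framework):

using System;
using System.Collections;
using System.Text;

class Program
{
    // My own helper (not a framework method): reverse the bit order inside one byte.
    static byte ReverseBits(byte b)
    {
        byte r = 0;
        for (int i = 0; i < 8; i++)
        {
            r <<= 1;            // shift what we have so far toward the MSB
            r |= (byte)(b & 1); // append the current lowest bit of b
            b >>= 1;            // move on to the next bit
        }
        return r;
    }

    static void Main(string[] args)
    {
        byte[] K = Encoding.UTF8.GetBytes(args[1]);
        for (int i = 0; i < K.Length; i++)
            K[i] = ReverseBits(K[i]); // flip each byte so BitArray reads it MSB-first
        var d = new BitArray(K);      // d[0] is now the most significant bit of the first byte
    }
}

Is there a cleaner or built-in way to do this?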