I've tried to implement the WebSocket frame-unmasking algorithm on the server side (based on this: How can I send and receive WebSocket messages on the server side?). Here's what I've got:
def decode(data):
    frame = bytearray(data)
    length = frame[1] & 127  # payload length with the mask bit stripped
    indexFirstMask = 2
    if length == 126:
        indexFirstMask = 4   # skip 16-bit extended payload length
    elif length == 127:
        indexFirstMask = 10  # skip 64-bit extended payload length
    indexFirstDataByte = indexFirstMask + 4
    mask = frame[indexFirstMask:indexFirstDataByte]
    i = indexFirstDataByte
    j = 0
    decoded = []
    while i < len(frame):
        decoded.append(frame[i] ^ mask[j % 4])
        i += 1
        j += 1
    print(decoded)
    return "".join(chr(byte) for byte in decoded)
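For what it's worth, when I build a masked text frame for "test" by hand (the mask bytes below are arbitrary, just for the sanity check) and apply the same unmasking logic, I do get "test" back, so I'd expect the decoding itself to be correct:

```python
# Sanity check: construct a masked text frame for "test" by hand.
# 0x81 = FIN + text opcode; 0x80 | len sets the mask bit on the length byte.
payload = b"test"
mask = bytes([0x12, 0x34, 0x56, 0x78])  # arbitrary masking key
masked = bytes(b ^ mask[i % 4] for i, b in enumerate(payload))
frame = bytes([0x81, 0x80 | len(payload)]) + mask + masked

# Unmask the same way decode() does (mask at bytes 2..5, data from byte 6)
length = frame[1] & 127
mask_bytes = frame[2:6]
decoded = "".join(chr(frame[i] ^ mask_bytes[(i - 6) % 4])
                  for i in range(6, len(frame)))
print(decoded)  # → test
```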
However, I get very strange results. On the JavaScript side:
w = new WebSocket("ws://localhost:2000");
w.send("test");
w.send("test");
w.send("test");
w.send("test");
produces on the server side:
[42, 73, 45, 46, 1, 0]
[42, 1, 98, 0, 0]
[2, 97, 0, 0]
[2, 97, 0, 0]
Further calls to w.send("test"); produce [2, 97, 0, 0]. Also, the first two arrays have length > 4 (the number of characters in the word test), and none of them decodes to the word test. It seems I must be doing something wrong in my decoding code. What's causing that? Any help?
EDIT: Here are the raw frames:
[193, 134, 48, 166, 232, 11, 26, 239, 197, 37, 49, 166]
[193, 133, 57, 161, 169, 218, 19, 160, 203, 218, 57]
[193, 132, 150, 97, 124, 54, 148, 0, 124, 54]
[193, 132, 163, 26, 102, 249, 161, 123, 102, 249]
[193, 132, 238, 212, 210, 156, 236, 181, 210, 156]
Of course these numbers look a bit random (due to masking), but note that the second byte (which is supposed to encode the payload length) is 134, then 133, and then always 132. Also, the first two frames are longer than the other frames. What's going on here?
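If I strip the top bit of that second byte (which should just be the mask flag), I read payload lengths of 6, 5, 4, 4, 4, matching the lengths of the decoded arrays above but not the four characters of "test" for the first two frames. (I also notice the first byte is 193 = 0xC1 rather than the 0x81 I'd expect for an unfragmented text frame.)

```python
# Reading the second byte of each raw frame: top bit = mask flag,
# low 7 bits = payload length (no extended length since it's < 126)
for b2 in (134, 133, 132, 132, 132):
    print(bool(b2 & 0x80), b2 & 127)  # → True 6 / True 5 / True 4 / True 4 / True 4
```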