In the console of any web page, I run:
console.log(11111111111111111);
The result is:
11111111111111112
So I am really confused. I tried adding more 1s to see what happens:
111111111111111111 -> 111111111111111100
1111111111111111111 -> 1111111111111111200
11111111111111111111 -> 11111111111111110000
111111111111111111111 -> 111111111111111110000
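The experiment above can be reproduced in one loop (a sketch, using Number() on repeated-digit strings rather than typing each literal by hand):

```javascript
// Log what JavaScript actually stores and prints for n repeated 1s.
for (let n = 17; n <= 22; n++) {
  const literal = '1'.repeat(n);      // e.g. "11111111111111111"
  console.log(`${literal} -> ${Number(literal)}`);
}
```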
Am I going crazy? What causes this to happen?
Is this intended behavior, or is it a bug in JavaScript?
Note: I am fairly sure it is not integer overflow or anything like that, because adding one more 1 gives 1.1111111111111111e+21, which is correct.
This question differs from this one, because I am not asking about the limit of numbers that can be represented accurately. I am asking more like:
Why does 11111111111111111 become 11111111111111112, instead of being shown as 1.1111111111111111e+16 (scientific notation)?
