I know that floating point values in JavaScript are stored in the binary (base-2) double-precision format specified by IEEE 754. To me, this means that when I assign the literal value .1 to a variable, the value actually stored will be 0.1000000000000000055511151231257827021181583404541015625 (the nearest representable double, if my arithmetic is right).
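For what it's worth, I can coax a longer value out of the number myself (run in the Chrome console; the commented output is what I see there, and I'm assuming toPrecision isn't doing anything unusual), so the extra digits don't appear to be thrown away:

// Asking for 21 significant digits exposes a longer binary approximation.
console.log((0.1).toPrecision(21)); // 0.100000000000000005551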
But counter to that assumption, logging a stored value with console.log does not reflect this. The following code:
var a = 0.1;
console.log(a);
...when executed in Chrome, and probably other browsers, will output:
0.1
I would have expected it to be:
0.1000000000000000055511151231257827021181583404541015625
Does a at this point hold 0.1 exactly, or the longer 0.1000000... value? If the latter, by what means does console.log() show 0.1? I'm interested in what's going on under the hood here. (E.g., is JavaScript storing a text representation of the number in the variable?)
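In case it helps frame the question, here's a small probe I ran (again in the Chrome console; the commented outputs are what I see, and I'd assume other engines behave similarly) showing that the precision hasn't simply been discarded, even though the plain log shows 0.1:

var a = 0.1;
console.log(a);                 // 0.1
// Asking for 20 fixed decimal places reveals a longer value again.
console.log(a.toFixed(20));     // 0.10000000000000000555
// The rounding error also leaks out through ordinary arithmetic.
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false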
For you diligent admins who might be a little quick to "mark as duplicate", please note that I am asking the opposite of the more common question (and its variations), "Why do I suddenly see these wacky high-precision numbers?"