Why does JavaScript compute this incorrectly?

console.log(3 * 9.7);     // 29.099999999999998 (why?)
console.log(3 * 97 / 10); // 29.1
This is how JavaScript's floating-point arithmetic works. Numbers are stored as IEEE 754 double-precision binary floats, and 9.7 has no exact binary representation, so the tiny representation error gets magnified by the multiplication. In 3 * 97 / 10 the intermediate values are small integers, which are represented exactly, so the rounding error only appears in the final division. If you would like to know how to avoid this, I recommend checking out this link: How to deal with floating point number precision in JavaScript?
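A minimal sketch of the usual workarounds, using only built-in Number and Math methods:

```javascript
// 9.7 cannot be represented exactly in IEEE 754 binary floating point,
// so the error in its stored value is magnified by the multiplication.
console.log(3 * 9.7);              // 29.099999999999998

// Workaround 1: round the result back to the desired precision.
const rounded = Math.round(3 * 9.7 * 10) / 10;
console.log(rounded);              // 29.1

// Workaround 2: format for display with toFixed (note: returns a string).
console.log((3 * 9.7).toFixed(1)); // "29.1"

// Workaround 3: scale to integers first, as in 3 * 97 / 10.
// Small integers are represented exactly, so only the final
// division rounds, and it rounds to the expected value here.
console.log((3 * 97) / 10);        // 29.1
```

For money or other values where exactness matters, the common advice is to avoid binary floats entirely and work in the smallest integer unit (e.g. cents instead of dollars).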