I found this strange behavior in JavaScript:
var v1 = [1];
var v2 = [1];
v1 == v2 // false
v1 == 1 //true
[1] == [1] // false
1 == [1] // true
Why is it that [1] == [1] returns false and [1] == 1 returns true?
The spec says that if the two operands of == have the same type (as in the [1] == [1] case, where both are of type Object), then == behaves exactly like ===. The two arrays are distinct objects, not the exact same object, so false is returned. Notice that:
var v1 = [1];
var v2 = v1;
v1 == v2; // true
When the operands have different types, they are coerced. In the case of 1 == [1], Rule 10 from the link above applies first and the array is converted to a primitive by its toString(), which returns the string '1'. Then Rule 6 applies, converting the string '1' to the number 1. The comparison becomes 1 == 1; now the operands have the same type and are compared with ===, and 1 === 1 obviously evaluates to true.
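You can trace that coercion chain by hand. This sketch just replays the steps the engine takes for 1 == [1] (the intermediate variable names are mine, not spec internals):

```javascript
var arr = [1];

// Step 1: the array is converted to a primitive via toString()
var prim = arr.toString();
console.log(prim);          // '1' (a string)

// Step 2: the string is converted to a number
var num = Number(prim);
console.log(num);           // 1

// Step 3: both sides are now numbers, so strict comparison applies
console.log(num === 1);     // true
console.log(1 == [1]);      // true, matching the chain above
console.log([1] == [1]);    // false: two distinct objects
```

The same chain explains why [1] == '1' is also true: the array becomes the string '1', and the two strings compare equal.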