Underscore's _.indexOf method looks like this:
var nativeIndexOf = Array.prototype.indexOf;

_.indexOf = function(array, item, isSorted) {
  if (array == null) return -1;
  var i, l;
  if (isSorted) {
    i = _.sortedIndex(array, item);
    return array[i] === item ? i : -1;
  }
  if (nativeIndexOf && array.indexOf === nativeIndexOf) return array.indexOf(item);
  for (i = 0, l = array.length; i < l; i++) if (i in array && array[i] === item) return i;
  return -1;
};
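For reference, the isSorted fast path can be sketched as a standalone binary search (a minimal stand-in for _.sortedIndex, not Underscore's actual source):

```javascript
// Binary-search for the insertion point of value in a sorted array,
// then check whether the element at that point is an exact match.
function sortedIndex(array, value) {
  var low = 0, high = array.length;
  while (low < high) {
    var mid = (low + high) >>> 1; // unsigned shift avoids overflow
    if (array[mid] < value) low = mid + 1;
    else high = mid;
  }
  return low;
}

var sorted = [1, 3, 5, 7, 9];
var i = sortedIndex(sorted, 7);
console.log(sorted[i] === 7 ? i : -1); // 3
console.log(sortedIndex(sorted, 4));   // 2: insertion point for a missing value
```

This is why passing isSorted gives O(log n) lookups instead of the O(n) scan.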
Toward the bottom of that method, it delegates to the native indexOf implementation when one is available; otherwise it falls back to a loop that compares each element with the strict equality (identity) operator ===.
Strict equality works as expected for primitive values like these:
console.log("asdf" === "asdf"); // true
console.log(1 === 1); // true
console.log(1 === 1.0); // true
console.log(null === null); // true
console.log(undefined === undefined); // true
But it obviously doesn't work for two distinct Object instances, even if they have the same properties and values:
console.log({} === {}); // false
console.log({} == {}); // false
var o = {};
console.log(o === o); // true
This makes the indexOf method useless when the items in an array are Object instances and the object you are searching for is structurally equal but not the same reference.
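To make the failure concrete, here is the same problem with the native indexOf directly (no Underscore needed):

```javascript
// Two structurally identical objects are still distinct references,
// so indexOf (which compares with ===) never matches them.
var needle = { id: 1 };
var haystack = [{ id: 1 }, { id: 2 }];

console.log(haystack.indexOf(needle));      // -1: equal shape, different reference
console.log(haystack.indexOf(haystack[0])); // 0: found, because it is the same reference
```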
The question is: what is the most efficient way to find the index of an Object in an array in JavaScript? Some options:

- JSON.stringify(o) every item and compare the strings, which doesn't make sense performance-wise.
- Call toString() on every item; this only helps when toString() returns something more distinctive than [object Object], as ObjectID does in the node-native-mongodb module for node.js (it returns an id string).
- Iterate over each key in each object and compare values; probably the most common approach.

None of these are really ideal.
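For illustration, the key-iteration approach can be factored into a generic scan that takes an equality predicate (indexOfBy and shallowEqual are hypothetical names, just a sketch):

```javascript
// Scan the array with a caller-supplied equality predicate instead of ===.
function indexOfBy(array, item, equals) {
  for (var i = 0, l = array.length; i < l; i++) {
    if (equals(array[i], item)) return i;
  }
  return -1;
}

// Shallow key/value comparison: same keys, strictly equal values.
function shallowEqual(a, b) {
  var ka = Object.keys(a), kb = Object.keys(b);
  if (ka.length !== kb.length) return false;
  return ka.every(function (k) { return a[k] === b[k]; });
}

var arr = [{ id: 1 }, { id: 2 }];
console.log(indexOfBy(arr, { id: 2 }, shallowEqual)); // 1
console.log(indexOfBy(arr, { id: 3 }, shallowEqual)); // -1
```

This stays O(n·k) for n items with k keys each, and only compares one level deep; nested objects would still fail the === check inside shallowEqual.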
What is the recommended solution to this problem?