The two are not equivalent tests because of JavaScript's rather convoluted handling of special values. Specifically,
undefined == null
is true, but typeof undefined is "undefined" while typeof null is "object".
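You can verify this directly in a console or in Node; this just demonstrates the behavior described above:

```javascript
// Loose equality treats undefined and null as equal...
console.log(undefined == null);   // true
// ...but strict equality does not:
console.log(undefined === null);  // false

// typeof distinguishes them ("object" for null is a
// well-known historical quirk of the language):
console.log(typeof undefined);    // "undefined"
console.log(typeof null);         // "object"
```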
The rules for those special values are quite complex and, IMO, illogical, so I don't think there's a "general rule". What you will find are common idioms, for example
var value = obj.x || default_value;
which can be used if you're sure that obj itself will never be undefined or null (because in that case an exception would be thrown), and if 0, NaN, or an empty string should be treated as if no value was provided (they're all "logically false" values). An empty array or an empty object, on the other hand, is considered "logically true".
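A quick sketch of how the || fallback behaves for the values just mentioned (the variable names here are only illustrative):

```javascript
var default_value = "fallback";

// Falsy property values all trigger the fallback, even when
// a "real" value like 0 or "" was intentionally provided:
console.log(({x: 0}).x || default_value);    // "fallback"
console.log(({x: ""}).x || default_value);   // "fallback"
console.log(({x: NaN}).x || default_value);  // "fallback"
console.log(({}).x || default_value);        // "fallback" (x is undefined)

// Empty arrays and empty objects are truthy, so they survive:
console.log(({x: []}).x || default_value);   // []
console.log(({x: {}}).x || default_value);   // {}
```

This is exactly why the idiom is only safe when the falsy values 0, NaN, and "" are acceptable as "no value".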
Why is it that way? Why does (null).x throw an exception, when null is apparently an "object" according to typeof, and accessing a non-existent property of an object normally returns undefined?
I've no idea. I never tried to find the logic behind all those strange rules, and I'm not even 100% sure there is one.
My suggestion is just to study and experiment with them.
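If you want to experiment with that specific case, a minimal test looks like this:

```javascript
// Missing property on a real object: returns undefined, no error.
console.log(({}).x);  // undefined

// Property access on null throws a TypeError at runtime,
// even though typeof null is "object":
try {
  (null).x;
} catch (e) {
  console.log(e instanceof TypeError);  // true
}
```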