I have something like the following:
var val = "string";
var testVal = val && val.length;
I would expect testVal to be either true or false, but instead it is the length of the string. Why is that?
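For illustration, logging a couple of variations (a minimal sketch, assuming it runs in a browser console or Node) shows that the expression produces values rather than booleans:

var filled = "string";
var empty = "";
console.log(filled && filled.length); // 6, not true
console.log(empty && empty.length);   // "", not false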
