In JavaScript, division by zero with "integer" arguments behaves as floating-point division does:
 1/0;    // Infinity
-1/0;    // -Infinity
 0/0;    // NaN
The asm.js spec says that division with integer arguments returns intish, which must be immediately coerced to signed or unsigned. If we do this in JavaScript, division by zero with "integer" arguments always returns zero after coercion:
(1/0)|0;    // == 0, signed case.
(1/0) >>> 0; // == 0, unsigned case.
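For context, here is a minimal sketch (the names DivModule and intDiv are mine, purely for illustration) of how such a division looks inside an asm.js module. The comments only describe what plain-JavaScript semantics give when the module is run without AOT compilation; what a validating asm.js compiler does is exactly what I am asking about.

function DivModule(stdlib) {
    "use asm";
    function intDiv(x, y) {
        // parameter type annotations: both x and y are int
        x = x | 0;
        y = y | 0;
        // (x|0)/(y|0) has type intish per the spec and must be coerced
        // back to signed with |0 before it can be returned
        return ((x | 0) / (y | 0)) | 0;
    }
    return { intDiv: intDiv };
}

// Linked and run as ordinary JavaScript, the mandatory coercion
// turns the double results Infinity and NaN into 0:
var m = DivModule(this);
m.intDiv(1, 0);  // 0
m.intDiv(0, 0);  // 0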
However, in languages with actual integer types, such as Java and C, dividing an integer by zero is an error and execution halts in some way (e.g., Java throws an ArithmeticException; in C it is undefined behavior and typically traps).
This also seems to violate the type signature that asm.js specifies for division. The type of Infinity and NaN is double, while the type of / is supposedly (from the spec):
(signed, signed) → intish ∧ (unsigned, unsigned) → intish ∧ (double?, double?) → double ∧ (float?, float?) → floatish
However, if the denominator is zero, the result (Infinity, -Infinity, or NaN) is a double, so it seems the type can only be:
(double?, double?) → double
What is expected to happen in asm.js code? Does it follow JavaScript and return 0, or does divide-by-zero produce a runtime error? If it follows JavaScript, why is it acceptable that the typing is wrong? If it produces a runtime error, why doesn't the spec mention it?