I'm no expert on floating-point numbers, but Wikipedia says that doubles have a 52-bit mantissa, which gives 53 bits of precision once you count the implicit leading bit. Since every 32-bit integer is exactly representable in a double, 53 bits should be enough to reliably approximate integer division of 32-bit integers: the operands carry no error, so the only error in the quotient is a rounding error far smaller than 1.
Dividing the minimum 32-bit signed int by the maximum, -2147483648 / 2147483647, produces -1.0000000004656613, which still has a healthy number of significant digits. The same goes for its inverse, 2147483647 / -2147483648, which produces -0.9999999995343387.
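To see this in action, here's a quick check at the 32-bit extremes (same values as above):

```js
// Integer division via |0, checked at the 32-bit extremes.
const min = -2147483648; // -2^31
const max = 2147483647;  //  2^31 - 1

console.log((min / max) | 0); // -1 (from -1.0000000004656613)
console.log((max / min) | 0); //  0 (from -0.9999999995343387)
console.log((7 / 2) | 0);     //  3 (truncates toward zero)
console.log((-7 / 2) | 0);    // -3 (not -4; |0 is not a floor)
```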
An exception is division by zero, which I mentioned in a comment. As the linked SO question states, integer division by zero normally throws some sort of error, whereas floating-point division by zero yields Infinity, which |0 coerces to 0, so (1 / 0) | 0 evaluates to 0.
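For completeness, the other zero-divisor cases coerce the same way (0 / 0 is NaN, and NaN also converts to 0):

```js
// All of the "error" cases of integer division silently become 0.
console.log((1 / 0) | 0);  // 0 (Infinity coerces to 0)
console.log((-1 / 0) | 0); // 0 (-Infinity coerces to 0)
console.log((0 / 0) | 0);  // 0 (NaN coerces to 0)
```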
Update: According to another SO answer, integer division in C truncates towards zero, which is exactly what |0 does in JavaScript. In addition, division by 0 is undefined behavior in C, so JavaScript is technically not incorrect in returning zero. Unless I've missed anything else, the answer to the original question should be yes.
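If you'd rather have the throwing behavior of most languages than a silent 0, the guard is trivial. Here's a sketch; intDiv is just a name I made up:

```js
// Hypothetical helper: 32-bit integer division that rejects a zero
// divisor instead of silently returning 0.
function intDiv(a, b) {
  if ((b | 0) === 0) throw new RangeError("Division by zero");
  return (a / b) | 0; // truncates toward zero, like C
}

console.log(intDiv(-7, 2)); // -3
// intDiv(1, 0);            // throws RangeError
```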
Update 2: Relevant sections of the ECMAScript 6 spec: "Applying the / Operator", which defines how numbers are divided, and the ToInt32 abstract operation, which is what |0 performs.
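For illustration, here's a rough JavaScript sketch of what ToInt32 does; the spec operates on the exact mathematical value of the number, so this is only an approximation of the same steps:

```js
// A sketch of the ToInt32 abstract operation, for illustration only.
function toInt32(x) {
  if (!Number.isFinite(x) || x === 0) return 0; // NaN, ±Infinity, ±0 map to 0
  const n = Math.trunc(x) % 2 ** 32;            // truncate, then wrap mod 2^32
  const u = n < 0 ? n + 2 ** 32 : n;            // normalize into [0, 2^32)
  return u >= 2 ** 31 ? u - 2 ** 32 : u;        // map [2^31, 2^32) to negatives
}

console.log(toInt32(1 / 0));      // 0, matching (1 / 0) | 0
console.log(toInt32(2147483648)); // -2147483648 (wraps around)
console.log(toInt32(-1.5));       // -1, same as -1.5 | 0
```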