I am trying to generate a hash code from two integer inputs. The approach outlined in
Combining Java hashcodes into a "master" hashcode
seems to work well for many input values. However, when one of the input integers is `int.MinValue`, the behavior seems less than ideal. Specifically, I observe:
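For reference, here is a minimal sketch of the kind of combiner I mean. The exact formula in the linked answer may be arranged differently; the multipliers 1013 and 1009 below are simply the ones whose products I examined:

```csharp
static class HashHelper
{
    // Sketch of a prime-multiplier hash combiner in the style of the linked
    // answer. The constants 1013 and 1009 are illustrative and match the
    // multiplications tested below; the linked answer's exact formula may differ.
    public static int CombineHashCodes(int a, int b)
    {
        unchecked
        {
            return a * 1013 + b * 1009;
        }
    }
}
```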
```
int.MinValue * 1013 == int.MinValue
int.MinValue * 1009 == int.MinValue
```

but

```
int.MinValue * 2 == 0
int.MinValue * 20 == 0
```

All of this is in an `unchecked` context.
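Here is a self-contained repro of the observations above:

```csharp
using System;

class Program
{
    static void Main()
    {
        unchecked
        {
            // Odd multipliers leave int.MinValue unchanged:
            Console.WriteLine(int.MinValue * 1013); // -2147483648 (int.MinValue)
            Console.WriteLine(int.MinValue * 1009); // -2147483648 (int.MinValue)

            // Even multipliers collapse it to zero:
            Console.WriteLine(int.MinValue * 2);    // 0
            Console.WriteLine(int.MinValue * 20);   // 0
        }
    }
}
```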
I would naively (and wrongly) have assumed that `int.MinValue * k`, for any `k` other than 0 or 1, would yield a bit pattern different from both `int.MinValue` and 0.
Questions
- Why does multiplying `int.MinValue` by these constants yield `int.MinValue` (2 cases) or `0` (2 cases)?
- Does the behavior of `int.MinValue` indicate a flaw in the hash algorithm?