This is just a question to advance my own knowledge, because I found something not behaving as expected. Consider the following code:
decimal d0 = 0;
decimal d1 = 0M;
decimal d2 = 0.0M;
string s0 = d0.ToString();
string s1 = d1.ToString();
string s2 = d2.ToString();
The debugger will show d0 and d1 as 0 in a watch window, but it will show d2 as 0.0. Similarly, the strings s0 and s1 will contain "0", while s2 will contain "0.0".
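
For completeness, here is the small console program I put together to check this outside the debugger (the class and file names are just mine); the values in the comments are the output I see:

using System;

class DecimalToStringRepro
{
    static void Main()
    {
        decimal d0 = 0;
        decimal d1 = 0M;
        decimal d2 = 0.0M;

        // ToString reflects the literal each decimal was created from.
        Console.WriteLine(d0.ToString()); // prints "0"
        Console.WriteLine(d1.ToString()); // prints "0"
        Console.WriteLine(d2.ToString()); // prints "0.0"
    }
}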
Why is this? Of course 0 == 0.0 as a value; in fact (d1 == d2) returns true. So why does C# treat these values differently internally?
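
Out of curiosity I also poked at the raw representation with decimal.GetBits. My reading of the docs is that the fourth element is a flags word whose bits 16-23 hold the scale (the number of digits after the decimal point), and that field does differ between the two zeros, which only makes the question more interesting to me:

using System;

class DecimalScaleProbe
{
    static void Main()
    {
        decimal d1 = 0M;
        decimal d2 = 0.0M;

        // GetBits returns four ints: lo/mid/hi of the 96-bit mantissa,
        // plus a flags word carrying the sign and the scale.
        int[] bits1 = decimal.GetBits(d1);
        int[] bits2 = decimal.GetBits(d2);

        Console.WriteLine(string.Join(", ", bits1)); // 0, 0, 0, 0
        Console.WriteLine(string.Join(", ", bits2)); // 0, 0, 0, 65536

        // Extract the scale from bits 16-23 of the flags word.
        int scale1 = (bits1[3] >> 16) & 0xFF; // 0
        int scale2 = (bits2[3] >> 16) & 0xFF; // 1

        Console.WriteLine($"scale of d1 = {scale1}, scale of d2 = {scale2}");
    }
}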
