Running an empty for-loop with a large number of iterations, I'm getting wildly different timings depending on the exact iteration count:
using System;
using System.Diagnostics;

public static class Program
{
    static void Main()
    {
        var sw = new Stopwatch();
        sw.Start();
        // Time one billion empty iterations.
        for (var i = 0; i < 1000000000; ++i)
        {
        }
        sw.Stop();
        Console.WriteLine(sw.ElapsedMilliseconds);
    }
}
The above runs in around 200ms on my machine, but if I increase the bound to 1000000001 it takes 4x as long! And if I make it 1000000002, it's back down to 200ms again!
The fast time seems to happen whenever the loop has an even number of iterations. If I write for (var i = 1; i < 1000000001; ++i) (note starting at 1 instead of 0), it's 200ms. The same goes for i <= 1000000001 (note less than or equal), and for (var i = 0; i < 2000000000; i += 2).
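To make the variants concrete, here is a summary of the loop headers I compared (just a sketch; the timings are the ones quoted above, from my machine):

// All of these finish in roughly 200ms (even iteration counts):
for (var i = 0; i < 1000000000; ++i) { }     // 1,000,000,000 iterations
for (var i = 1; i < 1000000001; ++i) { }     // 1,000,000,000 iterations
for (var i = 0; i <= 1000000001; ++i) { }    // 1,000,000,002 iterations
for (var i = 0; i < 2000000000; i += 2) { }  // 1,000,000,000 iterations

// This one takes roughly 4x as long (odd iteration count):
for (var i = 0; i < 1000000001; ++i) { }     // 1,000,000,001 iterations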
This appears only on x64, but on all .NET versions up to (at least) 4.0, and only in release mode with the debugger detached.
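For anyone trying to reproduce this, a minimal sanity check (assuming .NET 4.0, where Environment.Is64BitProcess is available) to confirm the process really is 64-bit and no debugger is attached:

using System;
using System.Diagnostics;

public static class SanityCheck
{
    static void Main()
    {
        // Expect True and False respectively before trusting any timings.
        Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
        Console.WriteLine("Debugger attached: " + Debugger.IsAttached);
    }
}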
UPDATE: I was thinking this was likely due to some clever bit shifting in the JIT, but the following seems to disprove that. If you do something like create an object inside the loop, it also takes about 4x as long:
using System;
using System.Diagnostics;

public static class Program
{
    static void Main()
    {
        var sw = new Stopwatch();
        sw.Start();
        object o = null;
        for (var i = 0; i < 1000000000; i++)
        {
            o = new object();
        }
        sw.Stop();
        Console.WriteLine(o); // use o so the compiler won't optimize it out
        Console.WriteLine(sw.ElapsedMilliseconds);
    }
}
This takes around 1 second on my machine, but increasing the bound by 1 to 1000000001 makes it take 4 seconds. That's an extra 3000ms, so it can't really be due to bit shifting: the same extra per-iteration work would have shown up as a 3000ms difference in the original problem too, yet there the difference was only around 600ms (200ms vs. ~800ms).
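Here is a rough way to compare the two bounds side by side (just a sketch of my own; each bound is kept as a literal constant in its own non-inlined method so the JIT compiles the two loops separately):

using System;
using System.Diagnostics;
using System.Runtime.CompilerServices;

public static class LoopTiming
{
    // The even-count bound, hard-coded as a constant.
    [MethodImpl(MethodImplOptions.NoInlining)]
    static long TimeEvenBound()
    {
        var sw = Stopwatch.StartNew();
        for (var i = 0; i < 1000000000; ++i)
        {
        }
        sw.Stop();
        return sw.ElapsedMilliseconds;
    }

    // The odd-count bound, hard-coded as a constant.
    [MethodImpl(MethodImplOptions.NoInlining)]
    static long TimeOddBound()
    {
        var sw = Stopwatch.StartNew();
        for (var i = 0; i < 1000000001; ++i)
        {
        }
        sw.Stop();
        return sw.ElapsedMilliseconds;
    }

    static void Main()
    {
        Console.WriteLine("i < 1000000000: {0} ms", TimeEvenBound());
        Console.WriteLine("i < 1000000001: {0} ms", TimeOddBound());
    }
}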