Possible Duplicate:
Are doubles faster than floats in c#?
I wrote a simple benchmark to check how much performance I could gain by changing the double datatype to float in my application. Here is my code:
    // my form: 
    // one textbox: textBox1 (MultiLine property set to true)
    // one button: button1 with event button1_Click
    private void button1_Click(object sender, EventArgs e)
    {
        int num = 10000000;
        float[] floats1 = new float[num];
        float[] floats2 = new float[num];
        float[] floatsr = new float[num];  // array for results
        double[] doubles1 = new double[num];
        double[] doubles2 = new double[num];
        double[] doublesr = new double[num]; // array for results
        Stopwatch stw = new Stopwatch();
        log("Preparing data");
        Random rnd = new Random();
        stw.Start();
        for (int i = 0; i < num; i++)
        {
            floats1[i] = NextFloat(rnd);
            floats2[i] = NextFloat(rnd);
            doubles1[i] = rnd.NextDouble();
            doubles2[i] = rnd.NextDouble();
        }
        stw.Stop();
        log(stw.Elapsed.TotalMilliseconds.ToString()+"ms");
        stw.Reset();
        log("");
        stw.Start();
        for (int i = 0; i < num; i++)
        {
            floatsr[i] = floats1[i] * floats2[i];
        }
        stw.Stop();
        log("Multiplying floats: " + stw.Elapsed.TotalMilliseconds.ToString() + "ms");
        stw.Reset();
        stw.Start();
        for (int i = 0; i < num; i++)
        {
            doublesr[i] = doubles1[i] * doubles2[i];
        }
        stw.Stop();
        log("Multiplying doubles: " + stw.Elapsed.TotalMilliseconds.ToString() + "ms");
        stw.Reset();
        stw.Start();
        for (int i = 0; i < num; i++)
        {
            floatsr[i] = floats1[i] / floats2[i];
        }
        stw.Stop();
        log("Dividing floats: " + stw.Elapsed.TotalMilliseconds.ToString() + "ms");
        stw.Reset();
        stw.Start();
        for (int i = 0; i < num; i++)
        {
            doublesr[i] = doubles1[i] / doubles2[i];
        }
        stw.Stop();
        log("Dividing doubles: " + stw.Elapsed.TotalMilliseconds.ToString() + "ms");
        stw.Reset();
    }
    private void log(string text)
    {
        textBox1.Text = textBox1.Text + text + Environment.NewLine;
    }
    // I found this function somewhere on Stack Overflow
    static float NextFloat(Random random)
    {
        double mantissa = (random.NextDouble() * 2.0) - 1.0;
        double exponent = Math.Pow(2.0, random.Next(-126, 128));
        return (float)(mantissa * exponent);
    }
I got results like this (Release build, run without the debugger, on an Intel Core Duo T2500 mobile CPU, 2.0 GHz, 2 MB cache):
Preparing data 5275,6862ms
Multiplying floats: 442,7865ms 
Multiplying doubles: 169,4028ms
Dividing floats: 550,7052ms 
Dividing doubles: 164,1607ms
I was surprised that operations on doubles were almost 3 times faster than operations on floats. I searched for "double float" here, and I found this:
Is using double faster than float?
The best answer focuses on CPU architecture, but I can't agree with it.
I suspect something else is causing the low float performance, because my CPU with Intel SSE should be able to multiply or divide 4 floats at once (packed floating-point instructions), but only 2 doubles at once. So floats should be faster.
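One thing worth checking before blaming the architecture is the data itself. This is a diagnostic sketch (my addition, reusing the NextFloat and log helpers from the code above; the constant 1.17549435e-38f is 2^-126, the smallest normal float) that counts how often the generated values and their products leave the normal float range:

```csharp
// NextFloat spreads exponents over [-126, 127], so some outputs are
// already subnormal (magnitude below 2^-126), and products of two
// outputs can underflow to subnormals or overflow to infinity.
// Subnormal operands/results take slow microcode paths on many x86 CPUs.
static bool IsSub(float x)
{
    float m = Math.Abs(x);
    return m > 0f && m < 1.17549435e-38f;  // below 2^-126 but not zero
}

void CountPathologicalValues()
{
    Random rnd = new Random(0);
    int subnormalIn = 0, subnormalProd = 0, overflowProd = 0;
    for (int i = 0; i < 1000000; i++)
    {
        float a = NextFloat(rnd), b = NextFloat(rnd);
        if (IsSub(a) || IsSub(b)) subnormalIn++;
        float p = a * b;
        if (IsSub(p)) subnormalProd++;
        if (float.IsInfinity(p)) overflowProd++;
    }
    log("Subnormal inputs: " + subnormalIn
        + ", subnormal products: " + subnormalProd
        + ", overflowed products: " + overflowProd);
}
```

If those counters come out non-zero, the benchmark is partly timing the CPU's slow handling of subnormal and overflowed values rather than ordinary float arithmetic.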
Maybe the compiler (or the CLR in .NET) is optimizing memory usage somehow?
Is there any way to optimize this and make floats faster?
Please don't flag this as a duplicate; I've seen the other questions and they don't satisfy me.
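For reference, a generator along these lines avoids the extreme exponents entirely (just a sketch of mine; I don't know the exact method Servy suggested):

```csharp
// Hypothetical replacement for NextFloat: values stay in [0, 1),
// well inside the normal float range, so float arithmetic stays
// on the fast hardware path.
static float NextFloatUniform(Random random)
{
    return (float)random.NextDouble();
}
```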
After changing the method for generating floats (as suggested by Servy), my results now look fine:
Preparing data 1367,0678ms
Multiplying floats: 109,8742ms 
Multiplying doubles: 149,9555ms
Dividing floats: 167,0079ms 
Dividing doubles: 168,6821ms