I am getting strange output when I add doubles together. Can someone tell me why I'm getting repeating decimals when I am adding 0.1 every time?
I have worked out the formula for adding these numbers together, and I have checked it by hand up to 3.3. The sum of all the numbers from 3.3 down to 1, decreasing by one tenth each step, equals 51.6:
3.3
3.2
3.1
3.0
...
1.0 +
____
51.6
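Here is a quick sketch of that hand calculation as a Java loop (24 terms, 1.0 through 3.3, stepping by 0.1); the class and method names are just mine:

```java
public class SumCheck {
    // Sum 1.0 + 1.1 + ... + 3.3 (24 terms), building each term by adding 0.1
    static double sumByTenths() {
        double term = 1.0;
        double total = 0.0;
        for (int i = 0; i < 24; i++) {
            total += term;
            term += 0.1;
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sumByTenths()); // on paper this is exactly 51.6
    }
}
```

Even this small case already shows output that is not exactly 51.6 when I print it.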
There is an easier way to calculate this using two formulas:
The linear formula for the value on day X: Y = 0.1(X - 1) + 1
And the arithmetic series sum formula: [X * (1 + Y)]/2 = total
First solve for Y using the number of days (in this case 100):
10.9 = 0.1(100 - 1) + 1
Then solve for the total using X and Y:
[100 * (1 + 10.9)]/2 = 595
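As a double check, here is the closed form in Java, written the way the loop below actually generates terms (day X's value is 1 + 0.1·(X − 1)); `closedFormTotal` is just my name for it:

```java
public class FormulaCheck {
    // Closed-form total for `days` terms: first term 1, increasing by 0.1 per day
    static double closedFormTotal(int days) {
        double last = 0.1 * (days - 1) + 1;   // value on the final day (10.9 for 100 days)
        return days * (1 + last) / 2;         // arithmetic series: X * (first + last) / 2
    }

    public static void main(String[] args) {
        System.out.println(closedFormTotal(100)); // on paper: 595
    }
}
```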
The output of the following code should be 595, I believe; there is no reason it should have a repeating decimal. What am I doing wrong here? There must be something I missed.
public static void main(String[] args) {
    int days = 100;
    double inc = 0.1;    // daily increase
    double init = 1;     // first day's value
    double total = 0;
    for (int i = 1; i <= days; i++) {
        if (i == 1) {
            total = total + init;   // day 1: just add the starting value
        } else {
            init = init + inc;      // later days: grow by 0.1, then add
            total = total + init;
        }
    }
    System.out.println("Total: " + total);
    System.out.println("Daily: " + init);
}
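For reference, the exact value a double stores for the literal 0.1 can be printed with BigDecimal (the BigDecimal(double) constructor keeps the exact binary value rather than rounding it):

```java
import java.math.BigDecimal;

public class ExactTenth {
    public static void main(String[] args) {
        // new BigDecimal(double) shows the exact value the double holds
        System.out.println(new BigDecimal(0.1));
        // prints 0.1000000000000000055511151231257827021181583404541015625
    }
}
```

So the stored value is slightly more than 0.1, which seems related to the output I am seeing.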