I have the following array:
var arr = [8, 3, 2, 1];
What I need to do is compute what percentage of the total each value represents, but first I have to add them all up, which I do in the following way:
var total = 0;
var fractionSum = 0;
var fractionArray = [];

// add up every value in the array
for (var x = 0; x < arr.length; x++) {
   total += arr[x];
}
Then I compute each value's percentage and keep a running sum:
for (var y = 0; y < arr.length; y++) {
   // percentage that this value represents of the total
   var average = (arr[y] / total) * 100;
   var averageFixed = average.toFixed(2);
   // toFixed returns a string, so convert it back to a number before pushing
   fractionArray.push(Number(averageFixed));
   fractionSum += fractionArray[y];
}
The problem I'm having is that the values in fractionArray are [57.14, 21.43, 14.29, 7.14], and when you add them up outside of a JavaScript interpreter you get 100 (which is the desired result in this context), but the value I end up with in fractionSum is 99.99999999999999.
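Adding the same rounded numbers directly in the console is a minimal reproduction of what the loop above does and shows the same drift:

console.log(57.14 + 21.43 + 14.29 + 7.14); // prints 99.99999999999999 instead of 100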
How can I fix this so that I get the "true result" of 100? I'm not interested in knowing why I'm getting 99.99999999999999 (that is already answered in Is floating point math broken?). Rather, I want to know what else I need to add to the code so that instead of 78.57 + 14.29 equaling 92.85999999999999 (which happens on the third iteration of the loop, on the line fractionSum += fractionArray[y];), I get 92.86, and so on for each subsequent addition.
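What I have in mind is something along these lines, where the running sum is rounded back to two decimal places after every addition (reusing the same arr, total, fractionArray and fractionSum variables as above), though I'm not sure whether re-rounding on every iteration is the right approach:

for (var y = 0; y < arr.length; y++) {
   var average = (arr[y] / total) * 100;
   fractionArray.push(Number(average.toFixed(2)));
   // round the running sum back to 2 decimal places after each addition,
   // so 78.57 + 14.29 becomes 92.86 instead of 92.85999999999999
   fractionSum = Number((fractionSum + fractionArray[y]).toFixed(2));
}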