While trying to devise an algorithm, I stumbled upon this question. It's not homework.
Let P_i be an array containing the first i primes. Now I need the smallest i such that
Sum(n = 0 .. i-1) 1 / (P_i[n] * P_i[n]) >= 1
(if such an i exists).
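To make concrete what I'm computing, here is a small sketch with the first ten primes hard-coded (the list and the class name are just for illustration, not part of the actual problem):

// Sketch: partial sums of 1/p^2 over a hard-coded list of the first ten primes.
public class PrimeSquareSum {
    public static void main(String[] args) {
        int[] primes = {2, 3, 5, 7, 11, 13, 17, 19, 23, 29};
        double sum = 0.0;
        for (int n = 0; n < primes.length; n++) {
            sum += 1.0 / ((double) primes[n] * primes[n]);
            System.out.println((n + 1) + " primes: " + sum);
        }
    }
}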
The i-th prime is approximately i*log(i), so I tried this in Java:
public static void main(String[] args) {
    double sum = 0.0;
    long i = 2;
    while (sum < 1.0) {
        // approximate the i-th prime by i * log(i)
        sum += 1.0 / (i * Math.log(i) * i * Math.log(i));
        i++;
    }
    System.out.println(i + ": " + sum);
}
However, the above doesn't finish because the sum converges to about 0.7: 1/100000000^2 rounds to 0.0 in Java, so that's why it doesn't work. For the same reason it doesn't even work if you replace the summation line with
sum += 1.0 / (i * i);
while that should reach 1 if I'm not mistaken, because the sum should increase faster than the sum of 1/2^i, and the latter converges to 1. In other words, this shows that Java's rounding causes the sum to not reach 1. I think the minimum i of my problem should exist.
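To see where the 1/(i*i) variant levels off, I could cap the loop and print the running total. This is just a diagnostic sketch, and the cap of 100 million terms is an arbitrary choice:

// Sketch: partial sum of 1/(i*i) starting at i = 2, stopped after an
// arbitrary number of terms instead of waiting for sum >= 1.
public class PartialSumCheck {
    public static void main(String[] args) {
        double sum = 0.0;
        long limit = 100_000_000L; // arbitrary cap, only for this experiment
        for (long i = 2; i <= limit; i++) {
            sum += 1.0 / ((double) i * i); // promote to double before multiplying, so i*i can't overflow long
        }
        System.out.println("sum of 1/(i*i) for i = 2.." + limit + ": " + sum);
    }
}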