I'm a beginner in Java. Could anyone explain why Java gives me these answers?
I have a very simple class I'm using to learn how to round a number. I want 2 decimals, so...
public static void main(String[] args) {
    double pi1 = Math.PI;
    System.out.println("pi1 = " + pi1);
    double p2;
    p2 = Math.round(pi1 * 100) / 100;
    // p2 = Math.round(pi1 * 100);
    // p2 = p2 / 100;
    System.out.println("p2 = " + p2);
}
If I run this, the result is:
p2 = 3.0
Then I change it to:
    // p2 = Math.round(pi1 * 100) / 100;
    p2 = Math.round(pi1 * 100);
    p2 = p2 / 100;
Now the result is:
p2 = 3.14
as I wanted
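To compare the two versions side by side, I also ran this small standalone experiment printing the intermediate values (the class name is mine):

```java
public class RoundDemo {
    public static void main(String[] args) {
        double pi1 = Math.PI;

        // the rounded value itself
        System.out.println(Math.round(pi1 * 100));        // prints 314

        // rounding and dividing in one expression
        System.out.println(Math.round(pi1 * 100) / 100);  // prints 3

        // storing the rounded value in a double first, then dividing
        double p2 = Math.round(pi1 * 100);
        System.out.println(p2 / 100);                     // prints 3.14
    }
}
```

So the division itself behaves differently depending on whether the rounded value was stored in a double first.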
Why the difference? Why doesn't the first option give me 3.14? I think the code in the first option is correct.
Could anyone tell me why? Things like this make it hard for me to trust Java.
Thank you.