Why does my square root function return inaccurate decimal values? For perfect squares I expect the result to be exactly 10, 5, or 3, with all zeros after the decimal point.
print('Enter your input:')
n = input()

def squareroot(n):
    i = 0.01
    while i*i < float(n):
        i += 0.01
    return i

print(squareroot(n))
Output:
Enter your input:
100
10.009999999999831
Enter your input:
25
5.009999999999938
Enter your input:
9
3.00999999999998
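The value just before the final increment is what I would expect to be exactly 10.0, but it apparently isn't. Here is a minimal check of what I suspect is happening, assuming the drift comes from 0.01 not being exactly representable in binary floating point:

# Minimal check (assumption: the error comes from repeatedly adding 0.01,
# which has no exact binary floating-point representation)
total = 0.0
for _ in range(1000):       # 1000 * 0.01 should nominally be exactly 10.0
    total += 0.01

print(total)                # a value just below 10.0, not exactly 10.0
print(total * total < 100)  # True, so the while loop takes one more 0.01 step

That extra step would explain why the function returns roughly 10.01 instead of 10.00, but I don't understand why the accumulated sum falls short in the first place.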