This is a simple exercise from a book, which asks us to determine how long it takes for an amount to double at a given interest rate. Here is my code:
def main():
  x = eval(raw_input("Initial Principal: "))
  y = eval(raw_input("Interest rate: "))
  count = 0
  while x < 2*x:
      x = x * (1 + y)
      count = count + 1
  print (x)
  print (count)
main()
What it returns is:
Initial Principal: 1000
Interest rate: 0.5
inf
1734
What's wrong with my code? I also wonder whether the code would work when the amount and interest rate are small, e.g. amount = 1 and interest rate = 0.05, since that would involve some floating-point arithmetic, I guess.
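For what it's worth, here is what I suspect the loop was meant to do, rewritten as a function so I could test it (the function name and the `target` variable are my own, and I'm not sure this is the right fix):

```python
def years_to_double(principal, rate):
    # Fix the target BEFORE the loop: comparing x < 2*x is always
    # true for positive x, which I think is why x grew to inf.
    target = 2 * principal
    amount = principal
    count = 0
    while amount < target:
        amount = amount * (1 + rate)
        count = count + 1
    return amount, count

# With my original inputs this stops after 2 periods:
# 1000 -> 1500 -> 2250
print(years_to_double(1000, 0.5))
```

With amount = 1 and rate = 0.05 this version seems to terminate as well (after 15 periods), though I'd still like to know whether floating-point rounding could ever make the comparison misbehave.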
Thank you!