For representing money, I know it's best to use C#'s decimal data type rather than double or float. However, if you are working with amounts of money below the millions, wouldn't int be a better option?
Does decimal have perfect calculation precision, i.e. does it avoid the issues described in "Why computers are bad at numbers"? If there is still some possibility of a calculation error, wouldn't it be better to use int and just display the value with a decimal separator?
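
To make it concrete, here is a rough sketch (names and amounts are just made up) of what I mean by storing the value as an int and only adding the decimal separator for display:

```csharp
using System;
using System.Globalization;

class MoneyAsIntSketch
{
    static void Main()
    {
        // Store the amount in minor units (cents) as an int, so all
        // arithmetic is exact integer arithmetic.
        int priceInCents = 1999;   // i.e. 19.99
        int quantity = 3;

        int totalInCents = priceInCents * quantity;

        // Only introduce a fractional representation when displaying.
        decimal totalForDisplay = totalInCents / 100m;
        Console.WriteLine(totalForDisplay.ToString("C", CultureInfo.GetCultureInfo("en-US")));
        // Prints: $59.97
    }
}
```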