Problem:
Due to floating-point precision errors I get a number such as 0.004999999999. Mathematically speaking, 0.00499(9) (with the 9 repeating) equals 0.005 exactly, but not in computers.
That level of precision is still fine internally, but the problem appears at display time: the user expects to see 0.01, i.e. the value rounded to two decimal places. Yet any sane rounding algorithm returns 0.00 instead, since 0.004999999999 is definitely closer to 0.00 than to 0.01. The user still understandably expects to see 0.01.
Solution(?):
It can be done with "multistage rounding", e.g. 0.004999999999.round(10).round(2), given that we internally calculate everything to a precision of 10 decimal places: the first round strips the accumulated noise, the second rounds to the displayed precision.
This seems like a very common problem, yet surprisingly I couldn't find any conventional solution to it.
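A minimal sketch of the two-stage rounding idea in Ruby, using only the standard `Float#round`. The constant `INTERNAL_PRECISION` and the helper `display_round` are illustrative names chosen here, not part of any library, and the choice of 10 internal decimal places is the assumption stated above:

```ruby
# Illustrative constant: we assume all internal math is carried to
# 10 decimal places, so noise only appears beyond that position.
INTERNAL_PRECISION = 10

# Hypothetical helper: round away floating-point noise at the internal
# precision first, then round to the precision shown to the user.
def display_round(value, places = 2)
  value.round(INTERNAL_PRECISION).round(places)
end

puts display_round(0.004999999999)  # two-stage: 0.0049999999990 -> 0.005 -> 0.01
puts 0.004999999999.round(2)        # naive single-stage rounding gives 0.0
```

Note that this only works because the intermediate result of `round(10)` is 0.005, whose nearest double is slightly above the true midpoint, so the second `round(2)` rounds it up. The approach therefore depends on the noise being strictly beyond the internal precision.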