I know the basic definitions of base-2 and base-10, but I wonder: what is the difference between them in terms of the performance and speed of a program?
For example:
In C#, the double type is a base-2 (binary) floating-point type, and the decimal type is a base-10 (decimal) floating-point type,
so double is very fast in calculations, while decimal can be up to 10x slower than double.
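Here is a rough benchmark sketch showing what I mean (just a Stopwatch timing of the same multiply-add loop in both types; the exact ratio will of course vary by machine and runtime):

```csharp
using System;
using System.Diagnostics;

class Program
{
    static void Main()
    {
        const int iterations = 10_000_000;

        // Time a simple multiply-add loop using double (base-2).
        double d = 1.0;
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            d = d * 1.0000001 + 0.0000001;
        }
        sw.Stop();
        Console.WriteLine($"double:  {sw.ElapsedMilliseconds} ms (result {d})");

        // Time the same loop using decimal (base-10).
        decimal m = 1.0m;
        sw.Restart();
        for (int i = 0; i < iterations; i++)
        {
            m = m * 1.0000001m + 0.0000001m;
        }
        sw.Stop();
        Console.WriteLine($"decimal: {sw.ElapsedMilliseconds} ms (result {m})");
    }
}
```

The decimal loop consistently takes several times longer than the double loop for me.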
I don't understand why this is, so could someone please explain it to me? Thanks in advance :)