Computing platforms, combinations of hardware and software working together, define how arguments should be passed to functions. Those rules are part of a specification typically called the application binary interface (ABI).
The details about how an argument should be passed may depend on several factors, including:
- its size,
- its alignment requirement,
- its fundamental type (integer, floating-point, pointer, et cetera),
- whether it is an aggregate of some sort, such as a structure or union, and
- whether all the details about the argument are known at compile time.
The resulting decisions about how an argument is passed include:
- whether it is passed in a register, on the stack, or in memory,
- how it is aligned in memory, and
- which set of processor registers it is passed in.
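For instance, under the x86-64 System V ABI used on most Linux and macOS systems, the first several integer and pointer arguments travel in general-purpose registers, while the first several floating-point arguments travel in XMM registers. A minimal sketch (the register assignments are specific to that ABI):

```c
/* Under the x86-64 System V ABI, a call to this function places
   i in edi  (the first integer-class argument register),
   d in xmm0 (the first floating-point argument register), and
   p in rsi  (the second integer-class argument register). */
void f(int i, double d, char *p);
```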
The expression 3 / 2 has type int, and, because both operands are integers, the division is integer division, so its value is 1. It will be passed in the way the ABI specifies for an int argument. When you specify %f in a printf format, printf expects a double and looks for the argument in the place the ABI specifies for a double.
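For illustration, here are calls in which the argument's type matches the type the format specifier tells printf to expect, so printf looks in the right place:

```c
#include <stdio.h>

int main(void)
{
    printf("%d\n", 3 / 2);           /* int where %d expects int: prints 1 */
    printf("%f\n", 3.0 / 2.0);       /* double where %f expects double: prints 1.500000 */
    printf("%f\n", (double) 3 / 2);  /* convert before dividing: prints 1.500000 */
    return 0;
}
```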
By contrast, in printf("%f", 3 / 2);, there is no guarantee that printf even sees the int value that was passed. It may read from entirely unrelated memory or an unrelated register.
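You can see the mechanism in any variadic function: va_arg must be told the expected type, and it applies the ABI's rule for that type to decide where to read. A minimal sketch, with a made-up helper name:

```c
#include <stdarg.h>
#include <stdio.h>

/* Hypothetical helper: fetches one double from its variadic arguments.
   va_arg(ap, double) reads from wherever the ABI says a double
   argument lives, often a floating-point register rather than the
   integer register an int would have been placed in. */
static void print_one_double(const char *label, ...)
{
    va_list ap;
    va_start(ap, label);
    double d = va_arg(ap, double);  /* look where a double would be */
    va_end(ap);
    printf("%s: %f\n", label, d);
}

int main(void)
{
    print_one_double("matched", 1.5);  /* a double, passed where doubles go */
    /* print_one_double("mismatched", 3 / 2); would place an int where ints
       go, and va_arg(ap, double) would read some unrelated location. */
    return 0;
}
```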
If printf does get the data for the int value, it will interpret those bytes as if they encoded a double. Bytes mean different things when they encode an int than when they encode a double, so the bit pattern that encodes the int value 1 does not encode 1 as a double. Even if printf, while formatting a double for %f, receives the bytes of the int value 1, the characters it produces are unlikely to be “1”.
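As a concrete illustration, assuming the common case of an IEEE-754 64-bit double: the bit pattern of the integer 1, widened to eight bytes and reinterpreted as a double, is the smallest subnormal value, roughly 4.9e-324, nowhere near 1:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    uint64_t bits = 1;            /* the bytes of integer 1, widened to 8 bytes */
    double d;
    memcpy(&d, &bits, sizeof d);  /* reinterpret the bytes; no conversion */
    printf("%g\n", d);            /* prints about 4.94066e-324 with IEEE-754 doubles */
    return 0;
}
```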
The C standard defines how you should use printf and its format specifiers so that the ABI can work. When you violate the C rules about matching argument types with format specifiers, the C standard does not define what behavior results. It leaves you at the mercy of the C implementation. Historically, this meant you were violating the ABI, and the program would break for the reasons I describe above. Over time, compilers have become more aggressive about optimization and other program transformations, with the result that violating the rules of the C standard can transform program behavior in more surprising ways.
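As a practical matter, modern compilers can diagnose most of these mismatches at compile time: with GCC or Clang, -Wall enables format checking, and printf("%f", 3 / 2) draws a warning along the lines of format '%f' expects argument of type 'double', but argument 2 has type 'int'.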