Consider this code:
#include <stdio.h>
int main(void) 
{
    /* TEST 1 */
    double d = 128;
    char ch = (char)d;
    printf("%d\n", ch);
    /* TEST 2 */
    printf("%d\n", (char)128.0);
    /* TEST 3 */
    char ch1 = (char)128.0;
    printf("%d\n", ch1);
    return 0;
}
Results:
        gcc*  clang*  cl*
TEST 1  -128  -128    -128
TEST 2  127   0       -128
TEST 3  127   -2      -128
* latest version
Questions:
- Why do the results differ between the tests (excluding cl)?
- Why do the results differ between compilers (excluding TEST 1)?
- In the case of UB/IB, where exactly is the UB/IB? What does the standard say?
- [Extra question] Why does clang show such different behavior? Where do the 0 and -2 come from?