Because '1' and 1 are not the same thing. '1' is a character literal that stands for the digit character 1; the underlying numeric value of that character depends on the character set in use.
For example, in ASCII the character '1' is represented by the decimal value 49 (hexadecimal 0x31). So, assuming ASCII, c = c << 2 assigns the value 196 (it multiplies by 4) to c. If char is signed and 8 bits wide (which is often the case), 196 does not fit, and converting it back to char gives an implementation-defined result (on typical two's-complement platforms it wraps around to -60).
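A quick way to see this for yourself (a minimal sketch, assuming an ASCII platform) is to print the values as plain integers instead of as characters:

#include <stdio.h>

int main(void)
{
    char c = '1';
    /* The character '1' is the number 49 in ASCII, not the number 1 */
    printf("%d\n", c);        /* prints 49 */
    /* The shift is done on int after promotion, so no truncation happens here */
    printf("%d\n", c << 2);   /* prints 196 */
    return 0;
}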
If you want it to print 1 and 2 you should a) multiply by 2 instead of 4 (that is, shift by 1 instead of by 2), and b) store the numeric value 1 instead of the character that represents it (but then, of course, you need to adjust it when printing):
#include <stdio.h>

int main(void)
{
    char c = 1;                 /* the numeric value 1, not the character '1' */
    printf("%c\n", c + '0');    /* convert to the digit character for printing */
    c = c << 1;                 /* multiply by 2 */
    printf("%c\n", c + '0');
    return 0;
}
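Note that the c + '0' adjustment only yields the right digit character while c stays in the range 0 through 9; for larger values, print the number itself with %d instead.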