I know this has been discussed before, but I want to make sure I understand correctly what is happening in this program, and why. On page 20 of Kernighan and Ritchie's textbook, The C Programming Language, we see this program:
#include <stdio.h>
int main()
{
    int c;

    c = getchar();
    while (c != EOF) {
        putchar(c);
        c = getchar();
    }
    return 0;
}
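
For reference, I have also seen the same loop written in the condensed form below. This is my own restatement rather than a quote from the book, but as far as I can tell it behaves identically, with the assignment and the EOF test combined into one expression:

#include <stdio.h>

int main(void)
{
    int c;

    /* read a character and compare it with EOF in the same expression;
       the body runs only for characters that are not EOF */
    while ((c = getchar()) != EOF) {
        putchar(c);
    }
    return 0;
}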
When executed, the program reads each character keyed in and prints it back out in the same order after the user hits Enter. This process repeats indefinitely unless the user manually exits the console. As I understand it, the sequence of events is as follows:
- The `getchar()` function reads the first character keyed in and assigns its value to `c`.
- Because `c` is an integer type, the character value that `getchar()` passed to `c` is promoted to its corresponding ASCII integer value.
- Now that `c` has been initialized to some integer value, the while loop can test whether that value equals the End-Of-File character. Because the `EOF` character has a macro value of `-1`, and because none of the characters that are possible to key in have a negative decimal ASCII value, the condition of the while loop will always be true (I tried to check the values involved with the small test after this list).
- Once the program verifies that `c != EOF` is true, the `putchar()` function is called, which outputs the character value contained in `c`.
- `getchar()` is called again so that it reads the next input character and passes its value back to the start of the while loop. If the user keys in only one character before hitting Enter, then the program reads the `<return>` value as the next character, prints a new line, and waits for the next input to be keyed in.
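
As a sanity check on the `int` and `EOF` points above, I wrote the small test below. It is my own sketch, not from the book, and it assumes a typical ASCII terminal; it prints the integer value that `getchar()` returns for each character, and the value it returns when the loop finally ends:

#include <stdio.h>

int main(void)
{
    int c;

    /* echo each character together with the int value getchar() returned */
    while ((c = getchar()) != EOF) {
        printf("read '%c' with value %d\n", c, c);
    }
    /* at this point c holds whatever value made the loop stop */
    printf("the loop ended because getchar() returned %d\n", c);
    return 0;
}

On my machine, typing abc and hitting Enter prints the values 97, 98, and 99, followed by 10 for the newline.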
Is any of this remotely correct?