Update:
OS X El Capitan, Xcode 7.3
Input:
0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890001
Here is my code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define BUFFERSIZE 2048

int main(void)
{
    char buffer[BUFFERSIZE];

    printf("Enter a message: \n");

    /* Read one line at a time from stdin and echo it back. */
    while (fgets(buffer, BUFFERSIZE, stdin) != NULL)
    {
        printf("%s\n", buffer);
    }
    return 0;
}
I compile the program and run it in a terminal:

./test
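For completeness, the full build-and-run sequence looks roughly like this (assuming the source file is saved as test.c; the exact compiler invocation is not important):

cc test.c -o test
./test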
Then I type a single line of 1024 characters (the input shown above). It doesn't work: the program never prints the buffer. If I enter 1023 characters instead, it prints them back as expected. When fgets reads from a local file that I open, instead of from the terminal, it has no trouble reading and printing lines longer than 1024 characters.
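To be concrete about the file case, this is roughly the variant I use (input.txt is just an example name for a file containing the same 1024-character line shown above):

#include <stdio.h>

#define BUFFERSIZE 2048

int main(void)
{
    char buffer[BUFFERSIZE];

    /* input.txt holds the same 1024-character line used above. */
    FILE *fp = fopen("input.txt", "r");
    if (fp == NULL)
        return 1;

    while (fgets(buffer, BUFFERSIZE, fp) != NULL)
    {
        printf("%s\n", buffer);
    }

    fclose(fp);
    return 0;
}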
So with standard input coming from a terminal, the maximum line length seems to be 1024, even though <sys/syslimits.h> defines ARG_MAX as (256 * 1024).
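For reference, this is a minimal sketch of how I check that value; on my machine ARG_MAX comes from <sys/syslimits.h>, which is pulled in via <limits.h>:

#include <stdio.h>
#include <limits.h>   /* on OS X this brings in <sys/syslimits.h>, which defines ARG_MAX */

int main(void)
{
    /* Prints 262144, i.e. 256 * 1024, on OS X El Capitan. */
    printf("ARG_MAX = %ld\n", (long)ARG_MAX);
    return 0;
}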
What is wrong with my code?