Reading a two-column text file and storing its long int values into a dynamically reallocated array fails once the array has grown through roughly 200 thousand reallocations.
    long int load_list_unklen(long int **ptr, FILE *infname)
    {
        long int row = 0;
        char buf[64];

        while (fgets(buf, sizeof(buf), infname)) {
            // M is defined as 2 (two columns per row)
            *ptr = (long int *)realloc(*ptr, M * row+1 * sizeof(ptr));
            if (!ptr) {
                perror("Out of memory");
                free(ptr);
                exit(1);
            }
            sscanf(buf, "%ld\t%ld", &(*ptr)[row * 2], &(*ptr)[row * 2 + 1]);
            row += 1;
        }
        if (ferror(infname)) {
            fprintf(stderr, "Oops, error reading stdin\n");
            abort();
        }
        return row;
    }
Note that buf holds one line containing two numbers separated by a tab. The code fails on a file with over 2 million lines: row stops incrementing around 221181, so I wonder whether there is a limit where realloc chokes. Should I be calling realloc differently?
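For reference, my understanding of the usual realloc growth idiom is the sketch below (load_list_sketch is just a name I made up for this post; M is the same column count as above). It parenthesizes (row + 1), takes sizeof on the element type rather than on a pointer, and assigns through a temporary so the old block is not leaked if realloc fails. I can't tell whether the differences from my function above are what matters:

    #include <stdio.h>
    #include <stdlib.h>

    #define M 2   /* two columns per row, as in my code */

    /* Sketch of the realloc pattern as I understand it, not my actual code. */
    long int load_list_sketch(long int **ptr, FILE *fp)
    {
        long int row = 0;
        char buf[64];

        while (fgets(buf, sizeof(buf), fp)) {
            /* grow to hold (row + 1) rows of M elements each */
            long int *tmp = realloc(*ptr, M * (row + 1) * sizeof(**ptr));
            if (!tmp) {                 /* check realloc's result, not the out-parameter */
                perror("Out of memory");
                free(*ptr);             /* the old block is still valid on failure */
                exit(1);
            }
            *ptr = tmp;
            if (sscanf(buf, "%ld\t%ld", &(*ptr)[row * 2], &(*ptr)[row * 2 + 1]) != 2)
                break;                  /* stop on a malformed line */
            row += 1;
        }
        return row;
    }

Growing by one row per line also means one realloc per line; I've seen advice to double the capacity instead, but I'd expect that to affect speed, not correctness.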
Here is how I call the function:
    long int *act_nei = (long int *)malloc(M * sizeof(act_nei));
    const long int sz = load_list_unklen(&act_nei, fp);
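In case it matters, the surrounding driver looks roughly like this (pairs.txt is a placeholder for my actual input file):

    #include <stdio.h>
    #include <stdlib.h>

    #define M 2

    long int load_list_unklen(long int **ptr, FILE *infname);

    int main(void)
    {
        FILE *fp = fopen("pairs.txt", "r");   /* placeholder filename */
        if (!fp) {
            perror("fopen");
            return 1;
        }

        /* initial block sized for one row of M elements */
        long int *act_nei = (long int *)malloc(M * sizeof(*act_nei));
        const long int sz = load_list_unklen(&act_nei, fp);

        printf("loaded %ld rows\n", sz);
        free(act_nei);
        fclose(fp);
        return 0;
    }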
The realloc pattern is adapted from another SO post; the difference in my case is the much larger input file.