I am writing a program that reads a dictionary file (a text file, one word per line) and inserts each word into a structure consisting of a "linked list" of arrays, essentially a trie: the word is recursively forwarded to the [first letter - 'a'] entry of an array (each entry is itself another such array, which processes the next letter). When the whole word has been "consumed", the word is inserted (unchanged) into a regular linked list of words. The program successfully processes the first 15 words, but on the 16th it throws a segmentation fault.
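For reference, the two structures look roughly like this (reconstructed from how the code below uses them; the exact definitions are in the linked full code):

struct LinkedList {
    char * word;                  // the stored word
    struct LinkedList * next;     // next node in the list
};

struct HashTree {
    struct LinkedList * words;    // words that end at this node
    struct HashTree * next;       // array of 27 children, indexed by letter - 'a'
};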
It looks like the segmentation fault occurs in the add() function, in the following snippet:
struct LinkedList * new = (struct LinkedList *) calloc(1,
                                       sizeof(struct LinkedList));
if (!new) {
    perror("Not enough memory!"); // Error here
    exit(2);
}
(Hopefully) relevant code:
void addList (struct LinkedList * list, char * word) {
    if (!list->next)                                 // Reached the end of the list
    {
        struct LinkedList * new = malloc(sizeof(struct LinkedList));
        if (!new) {
            perror("Not enough memory!");
            exit(2);
        }
        char * new_word = malloc(strlen(word) * sizeof(char));
        if (!new_word) {
            fprintf(stderr, "Not enough memory!");
            exit(2);
        }
        new->next = 0;
        strcpy(new_word, word);
        new->word = new_word;
        list->next = new;
    }
    else
    {
        addList(list->next, word);                   // Keep walking toward the end
    }
}
void add(struct HashTree * root, char * word, char * word_sorted, int length) {
    if (length == 0)                                     // Found the correct place
    {
        if (!root->words)                                // If words are not allocated
        {
            // Create list node
            struct LinkedList * new = calloc(1, sizeof(struct LinkedList));
            if (!new) {
                perror("Not enough memory!");
                exit(2);
            }
            char * new_word = malloc(strlen(word) * sizeof(char));
            if (!new_word) {
                fprintf(stderr, "Not enough memory!");
                exit(2);
            }
            new->next = 0;
            strcpy(new_word, word);
            new->word = new_word;
            root->words = new;
        }   
        else                                            // Add to the Linked List
        {
            addList(root->words, word);
        }
    }
    else 
    {
        // printf("Length_add = %d\n", length);
        if (!root->next)                                 // If the array was not allocated yet
        {
            struct HashTree * new = malloc(27 * sizeof(struct HashTree *));
            if (!new) {
                perror("Not enough memory!");
                exit(2);
            }
            root->next = new;
        }
        add(&(root->next[ word_sorted[0] - 'a' ]), 
            word, (char *) (word_sorted +1), (length-1));  // Use the next letter.
    }
}
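For context, add() is called once per dictionary word, roughly like this. This is a simplified sketch of the loop in main(), not the exact code (which is in the linked file); sort() stands in for the sorting function I deleted from this post (see EDIT below):

// Simplified sketch of the calling loop; assumes the usual headers
// (stdio.h, stdlib.h, string.h) and the structures shown above.
struct HashTree * root = calloc(1, sizeof(struct HashTree));
FILE * dictionary = fopen("dictionary.txt", "r");   // the real filename differs
if (!dictionary) {
    perror("Dictionary file not found!");           // the perror at line 125
    exit(1);
}
char buffer[128];
while (fgets(buffer, sizeof buffer, dictionary)) {
    buffer[strcspn(buffer, "\n")] = '\0';           // strip the trailing newline
    char sorted[128];
    strcpy(sorted, buffer);
    sort(sorted);                                   // sort the letters of the word
    add(root, buffer, sorted, (int) strlen(buffer));
}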
To save space, here is the link to the full code.
Here is the output of gdb on the core dump, with a backtrace:

    Program terminated with signal SIGSEGV, Segmentation fault.
    100         perror("Not enough memory!");
I implemented a similar algorithm in Java before, and the algorithm itself seems to be correct. I am pretty new to C and don't understand what might be wrong. I would really appreciate any help!
EDIT: Deleted the sort, clean, and cleanWords functions (they do not greatly affect the adding of a word to the structure). The segmentation fault now occurs while processing the second word, at line 125:

    perror("Dictionary file not found!");