I have a big CSV file, about 11 MB, and I need to load its contents into a Postgres database.
I use a PHP script to do this, but it always stops at some point.
I increased PHP's memory limit and tweaked other settings, which let me load more data, but still not all of it.
How can I solve this? Is there some cache I need to clear? Any secret to handling big files in PHP?
Thanks in advance.
UPDATE: Added some code
$handler = fopen($fileName, "r");
$dbHandler = pg_connect($databaseConfig);

// fgetcsv() is a function that takes the file handle, not a method on it
while (($line = fgetcsv($handler, 0, ";")) !== false) {
    // Algorithms to transform data
    // Adding SQL statements to a variable
    // I am using a "batch" idea: execute all the SQL built up after 5000 read lines
    // When I reach 5000 read lines, execute my SQL
    $results = pg_query($dbHandler, $sql);
}
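
For reference, here is a minimal sketch of the batching idea described in the comments, under some assumptions: a hypothetical target table my_table(col_a, col_b) and a CSV with two fields per line; the real transform and column names depend on your data. The point is to build one multi-row INSERT per batch and reset the buffer after each pg_query, so neither the SQL string nor the row array grows without bound.

$handler = fopen($fileName, "r");
$dbHandler = pg_connect($databaseConfig);

$batchSize = 5000;
$values = array();

while (($line = fgetcsv($handler, 0, ";")) !== false) {
    // Transform the raw CSV fields as needed, then escape them for SQL
    $colA = pg_escape_literal($dbHandler, $line[0]);
    $colB = pg_escape_literal($dbHandler, $line[1]);
    $values[] = "($colA, $colB)";

    // Flush the batch every 5000 lines so memory stays roughly constant
    if (count($values) >= $batchSize) {
        pg_query($dbHandler, "INSERT INTO my_table (col_a, col_b) VALUES " . implode(",", $values));
        $values = array(); // release the accumulated rows
    }
}

// Insert whatever is left after the last full batch
if (count($values) > 0) {
    pg_query($dbHandler, "INSERT INTO my_table (col_a, col_b) VALUES " . implode(",", $values));
}

fclose($handler);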