This is the main code of a method I use to import an Excel file (using Maatwebsite Laravel-Excel 2) into my database:
$data = Excel::selectSheetsByIndex(0)->load($file, function($reader) {})->get()->toArray();
DB::beginTransaction();
try {
    foreach ($data as $key => $value) {
    $med = trim($value["med"]);
        $serial = trim($value["nro.seriemedidor"]);
        DB::table('medidores')->insert([
            "med" => $med,
            "serial_number" => $serial
        ]);
    }
    DB::commit();
} catch (\Exception $e) {
    DB::rollback();
    return redirect()->route('myroute')->withErrors("Some error message");
}
This works fine when I have "little" data (say, fewer than 5,000 rows in the Excel file). But I need to work with a large Excel file that has around 1.4 million rows, spread across more than one sheet. How could I make my method faster? Any tips?
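I also looked at reading the file in chunks so the whole sheet never sits in memory at once. This is only a rough sketch, assuming the chunk importer (`Excel::filter('chunk')`) documented for Laravel-Excel 2.1 is available in my version; I haven't fully tested it:

$file = $request->file('excel'); // however you already obtain the uploaded file

// Sketch only: read 250 rows at a time instead of loading the whole sheet.
// Assumes Laravel-Excel 2.1's chunk filter; the row keys are the slugged
// headings, the same ones ->get()->toArray() produces above.
Excel::filter('chunk')->load($file)->chunk(250, function ($results) {
    $bulk_data = [];
    foreach ($results->toArray() as $value) {
        $bulk_data[] = [
            "med"           => trim($value["med"]),
            "serial_number" => trim($value["nro.seriemedidor"]),
        ];
    }
    // One multi-row insert per chunk instead of one insert per row.
    DB::table('medidores')->insert($bulk_data);
});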
EDIT: Here is the code from the link in one of the comments on the answer:
$data = Excel::selectSheetsByIndex(0)->load($file, function($reader) {})->get()->toArray();
DB::beginTransaction();
try {
  $bulk_data = [];
  foreach ($data as $key => $value) {
    $med = trim($value["med"]);
    $serial = trim($value["nro.seriemedidor"]);
    $bulk_data[] = ["med" => $med, "serial_number" => $serial];
  }
  $collection = collect($bulk_data); // turn the data into a collection
  $chunks = $collection->chunk(100); // split it into chunks of 100 rows
  // loop through the chunks and insert 100 rows per query:
  foreach ($chunks as $chunk) {
    DB::table('medidores')->insert($chunk->toArray());
  }
  DB::commit();
} catch (\Exception $e) {
  DB::rollback();
  return redirect()->route('myroute')->withErrors("Some error message");
}
Chunking the inserts is what worked for me.
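Since my file is split across more than one sheet, I ended up looping over the sheet indexes and reusing the same chunked insert for each one. Rough sketch, assuming the number of sheets is known up front (3 in my case; adjust to your file):

$totalSheets = 3; // assumed known here

DB::beginTransaction();
try {
    for ($i = 0; $i < $totalSheets; $i++) {
        // Same load call as above, just pointed at sheet $i.
        $data = Excel::selectSheetsByIndex($i)->load($file, function ($reader) {})->get()->toArray();

        $bulk_data = [];
        foreach ($data as $value) {
            $bulk_data[] = [
                "med"           => trim($value["med"]),
                "serial_number" => trim($value["nro.seriemedidor"]),
            ];
        }

        // Insert 100 rows per query instead of one query per row.
        foreach (array_chunk($bulk_data, 100) as $chunk) {
            DB::table('medidores')->insert($chunk);
        }
    }
    DB::commit();
} catch (\Exception $e) {
    DB::rollback();
    return redirect()->route('myroute')->withErrors("Some error message");
}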