I'm writing code in Laravel 5 to periodically back up a MySQL database. My code thus far looks like this:
    $filename = 'database_backup_' . date('G_a_m_d_y') . '.sql';
    $destination = storage_path() . '/backups/';
    $database = \Config::get('database.connections.mysql.database');
    $username = \Config::get('database.connections.mysql.username');
    $password = \Config::get('database.connections.mysql.password');
    // Escape the values in case they contain shell metacharacters
    $command = sprintf(
        'mysqldump %s --user=%s --password=%s --single-transaction > %s',
        escapeshellarg($database),
        escapeshellarg($username),
        escapeshellarg($password),
        escapeshellarg($destination . $filename)
    );
    exec($command, $output, $returnVar); // TODO: fail if $returnVar !== 0
    // Copy the database dump to S3
    $disk = \Storage::disk('s3');
    // ???? What goes here? ????
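For the TODO above, the exit-code check I have in mind looks roughly like this (a standalone sketch; here `true` stands in for the real mysqldump command line):

```php
<?php
// Sketch: fail loudly when the backup command exits non-zero.
// `true` is a placeholder for the actual mysqldump command.
$command = 'true';
exec($command, $output, $returnVar);
if ($returnVar !== 0) {
    throw new \RuntimeException("Backup failed with exit code {$returnVar}");
}
echo "backup ok\n";
```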
I've seen solutions online that suggest doing something like:
    $disk->put('my/bucket/' . $filename, file_get_contents($destination . $filename));
However, doesn't file_get_contents() read the entire dump into memory? For large files that seems wasteful. Is there a way to stream the file to S3 instead?
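To show what I mean by streaming: locally I can copy the dump chunk by chunk in constant memory, and I'd hope to do the equivalent against the S3 disk. The `putStream()` call mentioned in the comments below is Flysystem's API, which Laravel's Storage wraps; I haven't verified it on the Laravel 5 wrapper, so treat that part as an assumption.

```php
<?php
// Sketch: stream a file chunk-by-chunk instead of file_get_contents().
// Against S3 the destination would be the disk's underlying stream API,
// something like (unverified on Laravel 5):
//   $handle = fopen($destination . $filename, 'r');
//   $disk->getDriver()->putStream('my/bucket/' . $filename, $handle);

$source = tempnam(sys_get_temp_dir(), 'dump');
file_put_contents($source, str_repeat("INSERT INTO t VALUES (1);\n", 1000));

$in  = fopen($source, 'rb');
$out = fopen('php://temp', 'w+b'); // stands in for the remote stream
$bytes = stream_copy_to_stream($in, $out); // chunked copy, constant memory
fclose($in);
fclose($out);
unlink($source);

echo $bytes, "\n";
```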