I have to make a function that lists all subfolders in a folder. I already filter out files, but the function uses scandir() for the listing, which makes the application very slow. Is there an alternative to scandir(), even a non-native PHP function? Thanks in advance!
- Just how many files/dirs are in the directory you're scanning? It shouldn't be THAT slow, unless you're doing a `stat()` on each dir as it comes up, or there's thousands of files in there. – Marc B Mar 02 '11 at 20:09
- http://www.php.net/manual/en/function.scandir.php#96326 – Detect Mar 02 '11 at 20:13
- http://www.php.net/manual/en/function.scandir.php#73062 – Detect Mar 02 '11 at 20:14
- Have a look at the PHP manual under "Example #2 PHP 4 alternatives to scandir()" – Jeff Busby Mar 02 '11 at 20:29
2 Answers
You can use readdir, which may be faster. Something like this:
function readDirectory($Directory, $Recursive = true)
{
    if(is_dir($Directory) === false)
    {
        return false;
    }
    // opendir() does not throw exceptions, so check its return value
    // instead of wrapping it in try/catch.
    $Resource = opendir($Directory);
    if($Resource === false)
    {
        return false;
    }
    // Make sure the path ends with a separator so child paths are valid.
    $Directory = rtrim($Directory, '/') . '/';
    $Found = array();
    while(false !== ($Item = readdir($Resource)))
    {
        // Skip the "." and ".." entries.
        if($Item == "." || $Item == "..")
        {
            continue;
        }
        // Build the full path; readdir() only returns the entry name.
        $Path = $Directory . $Item;
        if($Recursive === true && is_dir($Path))
        {
            // Recurse and flatten the subdirectory's contents into the list.
            $Sub = readDirectory($Path, $Recursive);
            if($Sub !== false)
            {
                $Found = array_merge($Found, $Sub);
            }
        }
        else
        {
            $Found[] = $Path;
        }
    }
    closedir($Resource);
    return $Found;
}
This may require some tweaking, but it is essentially what scandir() does, and it should be faster. If it is not, please write an update, as I would like to see whether I can come up with a faster solution.
Another issue: if you're reading a very large directory, you're filling up an array within internal memory, and that may be where your memory is going.
You could try to create a function that reads in offsets, so that you can return 50 files at a time.
Reading chunks of files at a time would be just as simple to use; it would look like this:
$offset = 0;
while(false !== ($Batch = ReadFilesByOffset("/tmp", $offset)))
{
    // Use $Batch here, which contains 50 or fewer files.
    // Then increment the offset:
    $offset += 50;
}
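ReadFilesByOffset() isn't defined in the answer; a minimal sketch of one possible implementation (it reopens the directory on each call and skips past the entries already returned, which assumes the listing order is stable between calls) could look like this:
function ReadFilesByOffset($Directory, $offset, $limit = 50)
{
    $Resource = opendir($Directory);
    if($Resource === false)
    {
        return false;
    }
    $Found = array();
    $Position = 0;
    while(false !== ($Item = readdir($Resource)))
    {
        if($Item == "." || $Item == "..")
        {
            continue;
        }
        // Skip entries already returned by earlier calls.
        if($Position++ < $offset)
        {
            continue;
        }
        $Found[] = rtrim($Directory, '/') . '/' . $Item;
        if(count($Found) >= $limit)
        {
            break;
        }
    }
    closedir($Resource);
    // An empty batch returns false, which ends the caller's while loop.
    return count($Found) > 0 ? $Found : false;
}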
- Everything seems perfect, but how do I exclude the . and .. folders? I also have an empty index that I don't want to list. I guess it's in the catch(Exception), but how do I format it? – Emil Avramov Mar 02 '11 at 21:36
- I've updated it so that it does not contain `.` or `..`, but I'm not sure what you mean in regards to the empty index. – RobertPitt Mar 02 '11 at 21:45
Don't write your own. PHP has a RecursiveDirectoryIterator built specifically for this:
http://php.net/manual/en/class.recursivedirectoryiterator.php
As a rule of thumb (i.e., not 100% of the time), since it's implemented in straight C, anything you build in PHP is going to be slower.
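For example, a minimal sketch (the path is a placeholder) that recursively collects only the subfolders, matching the question's no-file filter:
// SKIP_DOTS drops the "." and ".." entries; SELF_FIRST makes the
// iterator yield each directory itself, not just its contents.
$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('/path/to/dir', FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::SELF_FIRST
);

$folders = array();
foreach($iterator as $item)
{
    // Keep only directories, skipping regular files.
    if($item->isDir())
    {
        $folders[] = $item->getPathname();
    }
}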
- Unless you are writing real-time software, the "slowness" of PHP doesn't matter at all. – dynamic Mar 02 '11 at 21:47
- Wrap the directory iterator in a `ParentIterator` (or a bespoke iterator filter) and bam, no files (see the sketch below). – salathe Mar 05 '11 at 22:01
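A minimal sketch of salathe's `ParentIterator` suggestion (again, the path is a placeholder): `ParentIterator` only accepts entries that have children, i.e. directories, so files are filtered out without an explicit isDir() check.
$dirsOnly = new RecursiveIteratorIterator(
    new ParentIterator(
        new RecursiveDirectoryIterator('/path/to/dir', FilesystemIterator::SKIP_DOTS)
    ),
    RecursiveIteratorIterator::SELF_FIRST
);

foreach($dirsOnly as $dir)
{
    // Each $dir is an SplFileInfo for a subfolder.
    echo $dir->getPathname(), PHP_EOL;
}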