I've already built a recursive function to get the directory size of a folder path. It works; however, with the growing number of directories I have to search through (and the number of files in each respective folder), it is a very slow, inefficient method.
static string GetDirectorySize(string parentDir)
{
    long totalFileSize = 0;
    string[] dirFiles = Directory.GetFiles(parentDir, "*.*", 
                            System.IO.SearchOption.AllDirectories);
    foreach (string fileName in dirFiles)
    {
        // Use FileInfo to get length of each file.
        FileInfo info = new FileInfo(fileName);
        totalFileSize += info.Length;
    }
    return String.Format(new FileSizeFormatProvider(), "{0:fs}", totalFileSize);
}
This searches all subdirectories of the argument path, so the dirFiles array gets quite large. Is there a better method to accomplish this? I've searched around but haven't found anything yet.
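One small change I'm aware of (assuming .NET 4 or later, where Directory.EnumerateFiles is available) is to enumerate lazily instead of building the whole array up front. The GetDirectorySizeInBytes name below is just mine, and as far as I can tell this only helps with the memory of the big array, not with the cost of touching every file:

static long GetDirectorySizeInBytes(string parentDir)
{
    long totalFileSize = 0;

    // EnumerateFiles yields paths one at a time, so the full list of files
    // is never held in memory; each FileInfo is created and discarded as we go.
    foreach (string fileName in Directory.EnumerateFiles(
                 parentDir, "*.*", SearchOption.AllDirectories))
    {
        totalFileSize += new FileInfo(fileName).Length;
    }

    return totalFileSize;
}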
Another idea that crossed my mind was putting the results in a cache and, when the function is called again, trying to find the differences and only re-searching folders that have changed. I'm not sure whether that's a good approach either; a rough sketch of what I have in mind is below.
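To make that concrete, this is roughly what I was imagining (the DirectorySizeCache class, the localSizes dictionary and the LastWriteTimeUtc check are just assumptions on my part, not tested code):

using System;
using System.Collections.Generic;
using System.IO;

static class DirectorySizeCache
{
    // Hypothetical cache: directory path -> (timestamp last seen, combined size
    // of the files sitting directly in that directory, excluding subdirectories).
    static readonly Dictionary<string, Tuple<DateTime, long>> localSizes =
        new Dictionary<string, Tuple<DateTime, long>>();

    public static long GetDirectorySize(string dir)
    {
        var info = new DirectoryInfo(dir);
        DateTime stamp = info.LastWriteTimeUtc;

        Tuple<DateTime, long> cached;
        long localSize;
        if (localSizes.TryGetValue(dir, out cached) && cached.Item1 == stamp)
        {
            // This directory's own contents haven't changed since last time,
            // so its immediate files don't need to be re-measured.
            localSize = cached.Item2;
        }
        else
        {
            localSize = 0;
            foreach (FileInfo file in info.EnumerateFiles())
                localSize += file.Length;
            localSizes[dir] = Tuple.Create(stamp, localSize);
        }

        // A directory's LastWriteTime does not change when files deep inside a
        // subdirectory change, so subdirectories still have to be walked; each
        // one consults its own cache entry the same way.
        long total = localSize;
        foreach (DirectoryInfo sub in info.EnumerateDirectories())
            total += GetDirectorySize(sub.FullName);

        return total;
    }
}

The caching is per directory rather than one grand total because a parent folder's timestamp doesn't change when something deep inside it does, so a single cached total couldn't be invalidated reliably.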