
I have a client who knows nothing about computers and did a format and reinstall from the recovery partition of his old AMD Athlon II X4 desktop running Windows 7. This was his primary business machine.

I used Recuva and DiskDigger from a Live environment to copy the files to my 3TB external HDD.

Little did I know, DiskDigger has a limit on the destination name, so I was forced to use the root directory of my already quite full external drive. It took almost 3 days to run both programs, and now neither Windows Explorer nor a minimal file manager in Puppy Linux can load the directory fully.

I tried using Folder Axe (a folder splitter) to break the root directory down into more manageable pieces, but it hangs when I choose that folder to work with.

I've tried optimizing the folder for Documents (I read that reduces load time in Explorer) and disabling 8.3 filenames in Windows.
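(For reference, the 8.3 change amounts to something like the following, run from an elevated prompt; E: is just a placeholder for the external drive. Note that disabling 8.3 creation only affects new files, so stripping the short names that existing files already have is a separate step, which is what the second command is for.)

fsutil behavior set disable8dot3 1
fsutil 8dot3name strip /s E:\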

Yes, I know there was another question on SU about 'windows folder with millions of files not responding', but none of the answers were really that great, nor did they really pertain to my particular issue.

Any ideas, workarounds, guesses?

An Dorfer

2 Answers


I would try using xcopy (run help xcopy in cmd.exe for the full list of switches), but make sure to give it only the folder's name; i.e. don't use folder\*. For example:

xcopy /Q /Y C:\clientPath\bigDirectory Z:\externalPath\bigDirectory

That will copy the directory quietly (/Q), overwriting (/Y) any existing files without prompting.
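One caveat, assuming the destination folder doesn't exist yet: xcopy will pause to ask whether Z:\externalPath\bigDirectory is a file or a directory. Adding /I answers that it's a directory, and /C keeps the copy going past unreadable files, which recovered data tends to include:

xcopy /Q /Y /I /C C:\clientPath\bigDirectory Z:\externalPath\bigDirectory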

HopelessN00b
AuoroP

If I understand this correctly, you have the following problem:

  • You ran a file recovery tool, which can't recover folder structure
  • There's now approximately one gazillion files all in a single directory
  • Anything that attempts to enumerate these files crashes
  • You've tried to split this into subfolders, but even that tool crashes

Lucky for you, I found another 1,562,922 files left over from this debacle... I'm pretty sure we can split this into subfolders using PowerShell.

First, let's launch PowerShell ISE, enumerate all the files, and store them in a variable.

$files = ls C:\recovered\

Run that command and go get some lunch. I promise it will finish eventually...

Now we're going to loop through all of these files and move them into manageable subfolders. First, let's see how many we're dealing with:

$files.count

I'll try to explain it in chunks so you can change it easily for your needs... Forgive me if this is too verbose for you. We'll start by setting up our loop:

#This is to keep track of files
$i = 0

#This is to keep track of subfolders
$x = 0

do
{

}
while ($i -lt $files.count)

I chose to divide mine every 1,000 files:

$i = 0
$x = 0

do
{
    #If $i is divisible by 1000 with no remainder...
    #Create a new folder
    if ($i % 1000 -eq 0){

            #Increment x
            $x++

            #Log progress to the console
            Write-Host "Creating folder $x..."

            #Create the new folder
            #Start with a DIFFERENT root folder
            New-Item -Path C:\recovered_subfolders -Name $x -ItemType directory
        }

    #Moving the files (and incrementing $i) comes in the complete script below
}
while ($i -lt $files.count)

Start this process in a folder other than the one you recovered all the files into; that way you can start looking at the files immediately.
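One assumption worth making explicit: the New-Item call above uses C:\recovered_subfolders as the parent folder, so it's safest to create that folder first if it doesn't already exist:

New-Item -Path C:\recovered_subfolders -ItemType directory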

Now we actually move the files. This is what my whole script looked like:

$i = 0
$x = 0

do
{
    if ($i % 1000 -eq 0){
            $x++
            Write-Host "Creating folder $x..."
            New-Item -Path C:\recovered_subfolders -Name $x -ItemType directory
        }
    #Use the full path with -LiteralPath so the move doesn't depend on the current
    #directory and odd recovered file names aren't treated as wildcards
    Move-Item -LiteralPath $files[$i].FullName -Destination C:\recovered_subfolders\$x\
    $i++
}
while ($i -lt $files.count)

It's been running for about 20 minutes now and I've only gotten through 22,000 files. Looks like you've got another weekend project ahead of you. Good luck.
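If collecting every file into $files up front turns out to be too slow or memory-hungry, the same idea also works as a pipeline that moves files as they are enumerated. This is only a sketch under the same assumptions as above (the hypothetical C:\recovered and C:\recovered_subfolders paths), and it needs PowerShell 3.0 or later for the -File switch:

$i = 0
$x = 0
Get-ChildItem C:\recovered -File | ForEach-Object {
    #Start a new subfolder every 1,000 files
    if ($i % 1000 -eq 0) {
        $x++
        Write-Host "Creating folder $x..."
        New-Item -Path C:\recovered_subfolders -Name $x -ItemType directory | Out-Null
    }
    #-LiteralPath so strange recovered names aren't treated as wildcards
    Move-Item -LiteralPath $_.FullName -Destination C:\recovered_subfolders\$x\
    $i++
}

The splitting logic is identical; the only difference is that nothing has to be held in memory before the first file starts moving.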

rtf