
I had the following situation:

I copied 5.5 TB of data (over 100,000 directories and 10 million files in total) from an old disk to a newly purchased external disk.

I used Ultracopier on Windows 10 for the entire copy, with the Verify checksums option enabled so that the checksum of every file was verified after the copy.
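
As I understand it, Verify checksums conceptually does something like the following Python sketch: hash the source, copy, re-read and hash the destination, and compare. This is only my mental model, not Ultracopier's actual code, and the choice of hash algorithm is an assumption on my part.

```python
import hashlib
import shutil

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file in 1 MiB chunks and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def copy_and_verify(src, dst):
    """Copy src to dst, then re-read both files and compare digests."""
    shutil.copyfile(src, dst)
    if sha256_of(src) != sha256_of(dst):
        raise IOError(f"checksum mismatch after copying {src} -> {dst}")
```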

Next, I used TreeSize and SpaceSniffer to verify that the total size of the backed-up directories and the number of files and directories were exactly the same on the old and new drives.
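
That check was roughly equivalent to the following sketch (the drive paths are placeholders, not my real paths):

```python
import os

def tree_stats(root):
    """Walk a tree and return (directory count, file count, total bytes)."""
    dirs = files = total_bytes = 0
    for dirpath, dirnames, filenames in os.walk(root):
        dirs += len(dirnames)
        files += len(filenames)
        for name in filenames:
            total_bytes += os.path.getsize(os.path.join(dirpath, name))
    return dirs, files, total_bytes

print(tree_stats("D:/backup"))  # old drive
print(tree_stats("E:/backup"))  # new drive
```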

Finally, I used Beyond Compare 4, which supposedly compares all attributes of files and folders (size, dates, etc.) and is also capable of binary comparison. It reported that the data on the old and new drives was exactly identical, even after a binary comparison.

My question is threefold:

  1. Was the above approach reliable and comprehensive enough to conclude that the data is indeed identical on both drives and that nothing was corrupted or lost during the copy?
  2. What is the most reliable file attribute and/or approach for comparing two files? Is a checksum really enough, or is a byte-by-byte (binary) comparison needed? (See the sketch after this list.)
  3. In such a huge transfer, how likely is it to end up with a few corrupted or partially copied files, especially with Verify checksums enabled? Is such a scenario likely at all?
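
To make question 2 concrete, here is a minimal Python sketch of the two approaches I have in mind; the choice of SHA-256 is just for illustration:

```python
import filecmp
import hashlib

def same_checksum(a, b):
    """Approach 1: compare checksums. A mismatch proves the files differ;
    a match proves equality up to the (astronomically unlikely) chance
    of a hash collision."""
    def digest(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.digest()
    return digest(a) == digest(b)

def same_bytes(a, b):
    """Approach 2: byte-for-byte comparison. shallow=False forces a full
    content read instead of trusting os.stat() metadata."""
    return filecmp.cmp(a, b, shallow=False)
```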

RELATED: Is verifying a MD5 sum after copying 100GB of data safe?
