I read many posts talking about the importance of having multiple copies, but the problem is: even if you have multiple copies, how do you make sure that EVERY FILE in each copy is good? For instance, imagine you want to view a photo taken a few years ago. When you check out copy 1 of your backup, you find it's already corrupted. You turn to copies 2 and 3 and find the photo is good there, so you happily discard copy 1 and keep 2 and 3. The next day you want to view another photo, and find that photo 2 is dead in backup copy 2 but good in copy 3, so you keep copy 3 and discard copy 2. Then some day you find something is wrong in copy 3, and you no longer have any copy with everything intact.

Someone may say: when we find that some files in copy 1 are dead, we make a new copy 4 from copy 2 (or 3). But the problem is, there may already be dead files in copy 2, so the new copy would not solve the issue above.

Just wondering how you guys deal with this issue? Any ideas would be appreciated.

  • @Far_Marsupial6303B
    17 months ago

    Ideally you would have generated and saved a hash before you copied your files, as a control. Otherwise it's just a probability game: if the hashes on copies 1 and 2 match but copy 3 doesn't, the probability is that 1 and 2 are correct. If all three differ, you toss a coin.
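    The "probability game" above can be sketched in Python: hash the same file in each backup copy and keep whichever copies agree with the majority. This is a minimal illustration, not a full backup tool; the function names are my own.

    ```python
    import hashlib
    from collections import Counter

    def hash_file(path, chunk_size=1 << 20):
        # Hash in chunks so multi-GB photos/videos don't need to fit in memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    def majority_copies(paths):
        """Given paths to the same file in each backup copy, return the
        paths whose hash matches the majority, or [] if no hash wins
        (the coin-toss case where all copies disagree)."""
        digests = {p: hash_file(p) for p in paths}
        (winner, count), = Counter(digests.values()).most_common(1)
        if count <= len(paths) // 2:
            return []  # no majority: every copy differs
        return [p for p, d in digests.items() if d == winner]
    ```

    With three copies where two agree, the two agreeing paths come back; if all three differ, you get an empty list and are back to the coin toss.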

    If you're on Windows, I recommend using TeraCopy for all your file copying (always copy, never move!) with verify turned on, which will perform a CRC check and generate a hash you can then save. You can also use it to test your files after the fact and generate a hash.
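    The "generate and save a hash before you copy" step can also be done without TeraCopy. Here is a rough Python sketch that walks a backup, writes a JSON manifest of per-file SHA-256 hashes, and later reports anything missing or changed (the manifest format and function names are my own, not TeraCopy's):

    ```python
    import hashlib
    import json
    import os

    def hash_file(path, chunk_size=1 << 20):
        # Chunked read: works for files far larger than available RAM.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    def save_manifest(root, manifest_path):
        """Record every file's hash, keyed by path relative to the backup root."""
        manifest = {}
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                full = os.path.join(dirpath, name)
                manifest[os.path.relpath(full, root)] = hash_file(full)
        with open(manifest_path, "w") as f:
            json.dump(manifest, f, indent=2)

    def verify(root, manifest_path):
        """Return relative paths that are missing or whose hash changed."""
        with open(manifest_path) as f:
            manifest = json.load(f)
        bad = []
        for rel, digest in manifest.items():
            full = os.path.join(root, rel)
            if not os.path.exists(full) or hash_file(full) != digest:
                bad.append(rel)
        return bad
    ```

    Keep the manifest with (or separate from) each copy; then "is this copy still good?" becomes a mechanical check instead of opening photos one by one.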