A little more about meta-backup.

    Recently an interesting topic (habrahabr.ru/blogs/i_am_clever/53170) appeared on Habrahabr: instead of backing up the data itself, it proposed backing up only the files' locations plus their identifiers, so the files could later be found again on peer-to-peer networks.
    The author wrote a script to collect this information and, when the backup was needed, downloaded everything back to his hard drive via DC++.
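    The collection step can be sketched in a few lines. The original script is not shown in the post, so this is only an illustration; since Python's hashlib has no Tiger/TTH implementation, SHA-1 stands in for the file identifier here.

```python
import hashlib
import json
import os
import tempfile

def snapshot(root):
    """Record each file's relative path, size, and a content hash.

    DC++ identifies files by TTH (Tiger Tree Hash); SHA-1 is used
    below purely as a stand-in identifier for illustration.
    """
    entries = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            h = hashlib.sha1()
            with open(full, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            entries.append({
                "path": os.path.relpath(full, root),
                "size": os.path.getsize(full),
                "sha1": h.hexdigest(),
            })
    return entries

# Demo on a throwaway directory with one file in it:
root = tempfile.mkdtemp()
with open(os.path.join(root, "a.txt"), "wb") as f:
    f.write(b"hello world")
index = snapshot(root)
print(json.dumps(index, indent=2))
```

    The resulting JSON index is tiny compared to the data it describes, which is the whole point of a meta-backup.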

    It occurred to me that for a backup restored through DC++ no script is needed at all, because DC++ already stores its "own filelist" with a TTH identifier for every file. It is enough to back up just that filelist, then open it in DC++ via "open own filelist" and queue the files for download. The filelist itself is stored in a convenient XML format, TTH hashes included.
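    Since the filelist is plain XML, extracting the path/TTH pairs back out of a saved backup takes only the standard library. The sample below imitates the shape of a DC++ files.xml (Directory and File elements with Name, Size, and TTH attributes); the TTH values are made up for illustration.

```python
import xml.etree.ElementTree as ET

# A tiny sample in the shape of a DC++ "own filelist" (files.xml);
# the TTH values here are invented for illustration.
SAMPLE = """<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<FileListing Version="1" Base="/" Generator="DC++">
  <Directory Name="music">
    <File Name="track01.flac" Size="31457280" TTH="AAAA1111BBBB2222CCCC3333DDDD4444EEEE555"/>
  </Directory>
  <File Name="notes.txt" Size="2048" TTH="9999AAAA8888BBBB7777CCCC6666DDDD5555EEE"/>
</FileListing>"""

def list_files(xml_text):
    """Return (path, size, tth) triples from a filelist document."""
    root = ET.fromstring(xml_text)
    out = []

    def walk(node, prefix):
        for child in node:
            if child.tag == "Directory":
                walk(child, prefix + child.get("Name") + "/")
            elif child.tag == "File":
                out.append((prefix + child.get("Name"),
                            int(child.get("Size")),
                            child.get("TTH")))

    walk(root, "")
    return out

files = list_files(SAMPLE)
for path, size, tth in files:
    print(path, size, tth)
```

    With the TTH in hand, each file can be searched for and queued in any DC++ client, which is exactly what "open own filelist" does for you.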

    Another problem raised there is that not all files can be found again later. It would be interesting to hear your opinion on applying Reed-Solomon recovery codes: after scanning all the files, generate a file with redundant information, say 10% of the original size, from which the files that were not found could be reconstructed, provided their number does not exceed 10% of the total.
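    A full Reed-Solomon erasure code over GF(256) (as used by tools like par2) can recover any k missing blocks from k parity blocks. The degenerate one-parity case is just XOR, which is enough to show the idea in a few lines; this is a simplified stand-in, not Reed-Solomon proper.

```python
from functools import reduce

def parity(blocks):
    """XOR equal-length blocks together into a single parity block."""
    return bytes(reduce(lambda a, b: [x ^ y for x, y in zip(a, b)],
                        blocks, [0] * len(blocks[0])))

def recover(surviving, parity_block):
    """Rebuild the one missing block: XOR the survivors with the parity."""
    return parity(surviving + [parity_block])

blocks = [b"FILE", b"DATA", b"HERE"]
p = parity(blocks)

# Suppose blocks[1] could not be found on the network; rebuild it:
restored = recover([blocks[0], blocks[2]], p)
print(restored)  # b'DATA'
```

    Reed-Solomon generalizes this to multiple parity blocks, so a 10% parity file would cover the loss of any 10% of the blocks, not just one.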
