How I organized, synchronized, and backed up files on my home computer



    I first ran into the problem of a cluttered hard drive back in high school. At the time I solved it by structuring all my documents in Evernote: I tagged everything and moved all my documents into that wonderful piece of software. All my other files, however, stayed in the same disorganized heap as before.

    At university there were even more files, the number of computers at home grew to three, and I started making backups. At first I simply copied the current folders from my computers, without worrying much about their structure. Toward the end of my studies I wondered whether I should store my huge pile of documents on external servers. After some thought I decided against it, and resolved to get off the Evernote needle. First I thought through a file-storage structure that would let me find the necessary information easily even without search. For each topic that interested me I created a folder in /home/username; these I called categories. Inside each category there were project subfolders, and almost every folder contained a mandatory misc subfolder, so that the file manager would not show heaps of randomly dumped, unstructured files. For instance, I had folders like Bioinformatics/Aligner and Development/Projects/GameOfLife. There were clear naming rules for files and folders (no underscores, camelCase, folders starting with a capital letter, files with a lowercase one). Everything seemed fine, but I was lazy and did not always file things neatly into the right folders, which eventually cluttered up my structure again. So I decided to try something else...
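
    For illustration, using the examples just mentioned, the layout looked roughly like this:

        /home/username/
        ├── Bioinformatics/
        │   ├── Aligner/
        │   └── misc/
        ├── Development/
        │   ├── Projects/
        │   │   ├── GameOfLife/
        │   │   └── misc/
        │   └── misc/
        └── misc/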

    I decided to run MediaWiki on localhost and synchronize it between my two main computers (a desktop at home, and a laptop that is always with me). Synchronization of all the folders holding the necessary files (backups included) was done with rsync. When I came home, I ran the iCameHome.sh script, which pushed all changes to the home computer, which doubled as the backup server (and still does). When I left for work, I ran iWentOut.sh, which pushed the changes in the opposite direction. Everything seemed fine: the home wiki synchronized easily and naturally, like the other folders I had included in the rsync script. But I began to notice that as time went on I filled in my personal wiki less and less, since finding the right article took time despite the category tags, and I felt less and less like doing it. I also did not like MediaWiki's search; maybe I just never read up on it or configured it properly. But that was the gist of it.
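
    A minimal sketch of what iCameHome.sh might have looked like, run from the laptop; the hostname desktop and the folder list are assumptions for illustration, and iWentOut.sh would be the same with source and destination swapped:

        #!/bin/sh
        # iCameHome.sh -- push the laptop's changes to the home desktop,
        # which doubles as the backup server.
        # "desktop" and the folder names below are placeholders.
        for dir in Bioinformatics Development Wiki; do
            rsync -avz "$HOME/$dir/" "desktop:$dir/"
        done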

    So what did I arrive at? My plan for moving to a new file-structuring system consisted of two parts:

    • Setting up a git server on the home desktop computer
    • Virtualization


    The first point is very simple. Install git-core, add the laptop's ssh key to the list of authorized keys, disable ssh password authentication, and open the desktop's ssh port to the outside. The especially paranoid are advised to configure port knocking as well. After that, for each folder that needs to be synchronized, create a git repository and commit to it from both machines, the desktop and the laptop, depending on when and where the changes were made.
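
    A rough sketch of that setup, assuming a Debian-like system; one convenient layout (my assumption, not the only possible one) is a bare repository on the desktop with working clones on both machines:

        # On the desktop: install git and authorize the laptop's key.
        sudo apt-get install git-core
        cat laptop_key.pub >> ~/.ssh/authorized_keys
        # In /etc/ssh/sshd_config set "PasswordAuthentication no",
        # then restart sshd to disable password logins.

        # One bare repository per synchronized folder
        # ("Development" is an example name).
        git init --bare ~/repos/Development.git

        # On the laptop: clone it and push changes as they are made.
        git clone desktop:repos/Development.git
        cd Development
        git add -A && git commit -m "sync" && git push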

    The second point is virtualizing the workspace. First I make typical virtual machines: one with MySQL running and an IDE configured to my taste, one for safe web surfing, and one per specific project. Once a virtual machine has been brought into shape, I take a full backup of it. After that, only the project folders are synchronized via git in the way described above. All work happens inside the virtual machines; on the host I keep only Skype, Minecraft, a torrent client, and the VLC player.
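
    The full backup itself can be done in different ways; since the host already has VirtualBox, one option is simply exporting the machine to an appliance file (the VM name "devbox" and the paths here are placeholders, not my actual setup):

        # Export the configured VM to a single portable .ova file.
        VBoxManage export devbox -o /backups/devbox-base.ova

        # Restore it later, on this or another host, from that file.
        VBoxManage import /backups/devbox-base.ova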

    What are the advantages of this solution?

    • Minimal host requirements (in terms of setup): only VirtualBox and git are needed. It therefore does not matter which system the host runs, and you can experiment freely.
    • The git server comes up with a single command, and the rest of the configuration, as described above, is also very simple.
    • You can synchronize either everything at once (with a script that runs git pull for each repository; see the sketch after this list) or an individual project folder by pulling from inside it. With rsync this is harder: you have to dig the command out of the general script and copy it, because it is long and you do not always remember all the file paths.
    • If the repository cannot be reached right now, you can still safely change files, add and commit locally, and merge later.
    • There is a history: any file deletion or lost change is recorded, and you can track how files evolve.
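
    The pull-everything script mentioned in the list above could be as simple as this (the folder names are again placeholders):

        #!/bin/sh
        # syncAll.sh -- pull the latest commits for every synchronized
        # folder; the folder list is illustrative.
        for dir in Bioinformatics Development Wiki; do
            (cd "$HOME/$dir" && git pull)
        done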


    The downsides?

    • The repository histories may swell considerably, but I cannot say anything definite yet; I switched to this system only recently.
    • There is no way to synchronize files with phones and tablets, but personally I do not need it. I have settled on keeping only files on a phone or tablet that already exist on the desktop computer, which protects me from losing files on mobile devices.


    I would be glad to hear in the comments how you solve the problem of backing up and synchronizing data between your devices.

