{"id":2573,"date":"2015-09-07T22:04:25","date_gmt":"2015-09-07T19:04:25","guid":{"rendered":"http:\/\/9v.lt\/blog\/?p=2573"},"modified":"2022-01-19T08:34:37","modified_gmt":"2022-01-19T06:34:37","slug":"my-backup-procedure","status":"publish","type":"post","link":"http:\/\/9v.lt\/blog\/my-backup-procedure\/","title":{"rendered":"My backup procedure"},"content":{"rendered":"<p>Oh hello again&#8230; man, lately I have no time to keep my blog updated, but I&#8217;m glad I can write a post occasionally at least.<br \/>\nThis time I&#8217;d like to write about how I do backups and how I automated half of the process. It might not be the most efficient way or &#8220;the right way&#8221;, but as with anything &#8211; it works for me, so shut up! :P (j\/k, you can bitch in the comments).<br \/>\nAnyway, I have several places to grab backups from and one place to store them &#8211; I don&#8217;t use mainstream cloud services (maybe I should&#8230;?), instead I store them on my external drive. Google Drive gives you 15 GB of free storage, so in the future I might use that instead of my external drive, but the place doesn&#8217;t matter, it&#8217;s HOW I store those backups that matters.<br \/>\n<!--more--><br \/>\nLike I said before, I have several places to back up, which include a couple of websites with insufficient storage space to keep backups on the server (you should keep them off site anyway eh? :)) and some important computer files and their system images.<\/p>\n<p>With computers it&#8217;s basic &#8211; clean, defrag, make a system clone, transfer it to an external HDD.<br \/>\nI also have a Raspberry Pi home media server (wrote about it <a href=\"http:\/\/9v.lt\/blog\/what-i-use-my-raspberry-for\/\" target=\"_blank\" rel=\"noopener\">here<\/a>), and I make an SD card clone of that too. 
Some say I should use rsync instead, so I don&#8217;t have to power it off, but tbh it seems like less hassle to simply make a clone of the SD card and store it.<br \/>\nAfter making a clone, I use software like FBackup to sync my important <del datetime=\"2015-09-07T18:14:26+00:00\">porn<\/del> files.<\/p>\n<p>After that I would go ahead and order website backups through DirectAdmin, download those and then transfer them, but this part was so tedious that I would often forget about it. So I thought of automating it by writing a simple Python backup script.<br \/>\nThe automated process is made up of 2 parts &#8211; PHP scripts on the servers that send requests to DirectAdmin run via cronjob at midnight, and the backups are made within a few minutes. Then the Python script running at 1AM on my RPI grabs these backup archives via FTP from those websites and stores them on the NAS.<br \/>\nSimple as that.<\/p>\n<p>The PHP files are available publicly from DirectAdmin <a href=\"http:\/\/files.directadmin.com\/services\/all\/backup.php.txt\" target=\"_blank\" rel=\"noopener\">here<\/a> and my Python backup script is available <a href=\"http:\/\/9v.lt\/projects\/python\/autoBackup.py\" target=\"_blank\" rel=\"noopener\">here<\/a>.<\/p>\n<p>Hopefully it helps someone save time as it does for me :)<\/p>\n<p>Also for a quick overview:<\/p>\n<pre lang=\"python\" line=\"1\">\r\n#!\/usr\/local\/bin\/python\r\n'''\r\n    Date: 2015.08.02\r\n    Author: Kulverstukas\r\n    Website: 9v.lt; evilzone.org\r\n    Description:\r\n        Downloads backup files from \/backups on the configured hosts\r\n        and downloads the whole website content from defined websites.\r\n'''\r\n\r\nimport os\r\nimport sys\r\nimport time\r\nimport shutil\r\nimport zipfile\r\nimport ftputil\r\nfrom ftplib import FTP\r\n\r\n#-------------------------------------------------------\r\n# where to put everything\r\nrootFolder = 'webhost_backups'\r\n\r\n# configuration of as many webhosts as you have\r\nwebhosts = [\r\n    {'folder' : 
'somesite.lt', # where to store your files\r\n     'address' : 'somesite.lt', # URL of your website\r\n     'username' : 'lorem', # FTP username\r\n     'password' : 'ipsum', # FTP password\r\n     'backup_path' : '\/backups', # usually this is where backups are stored on the server\r\n     'mirror' : True}, # should mirror the whole website starting from root\r\n     \r\n     {'folder' : 'subdomain.domain.com',\r\n     'address' : 'subdomain.domain.com',\r\n     'username' : 'lorem',\r\n     'password' : 'ipsum',\r\n     'backup_path' : '\/backups',\r\n     'mirror' : False}\r\n]\r\n#-------------------------------------------------------\r\n# create folders if they don't exist already\r\nif not os.path.exists(rootFolder) or not os.path.isdir(rootFolder):\r\n    os.mkdir(rootFolder)\r\n#-------------------------------------------------------\r\n# replace symbols with encoded equivalents\r\ndef makeFriendlyFilename(input):\r\n    badSymbols = {'<' : '%3C',\r\n                  '>' : '%3E',\r\n                  ':' : '%3A',\r\n                  '\"' : '%22',\r\n                  '|' : '%7C',\r\n                  '?' 
: '%3F',\r\n                  '*' : '%2A'}\r\n    i = ''\r\n    for i in badSymbols.keys():\r\n        input = input.replace(i, badSymbols[i]);\r\n    return input\r\n#-------------------------------------------------------\r\ndef zipdir(path, zipname):\r\n    zipf = zipfile.ZipFile(zipname, 'w')\r\n    for root, dirs, files in os.walk(path):\r\n        for dir in dirs:\r\n            zipf.write(os.path.join(root, dir))\r\n        for file in files:\r\n            zipf.write(os.path.join(root, file))\r\n    zipf.close()\r\n#-------------------------------------------------------\r\n# download all of the files from FTP\r\ndef mirrorFtp():\r\n    for config in webhosts:\r\n        if config['mirror']:\r\n            today = time.strftime(\"%Y-%m-%d\")\r\n            mirrorName = config['folder']+'_mirror_'+today\r\n            localMirrorRoot = os.path.join(rootFolder, config['folder'], mirrorName)\r\n            if not os.path.exists(localMirrorRoot) or not os.path.isdir(localMirrorRoot):\r\n                os.makedirs(localMirrorRoot)\r\n                \r\n            ftp = ftputil.FTPHost(config['address'], config['username'], config['password'])\r\n            for root, dirs, files in ftp.walk('\/'):\r\n                for dir in dirs:\r\n                    try:\r\n                        os.mkdir(os.path.join(localMirrorRoot, root[1:], dir))\r\n                    except:\r\n                        pass\r\n                for file in files:\r\n                    try:\r\n                        ftp.download(root+'\/'+file, os.path.join(localMirrorRoot, root[1:], makeFriendlyFilename(file)))\r\n                    except:\r\n                        print root+'\/'+file\r\n            zipdir(localMirrorRoot, localMirrorRoot+'.zip')\r\n            shutil.rmtree(localMirrorRoot)\r\n#-------------------------------------------------------\r\n# download backup files made by DirectAdmin\r\ndef getBackupFiles():\r\n    ftp = FTP()\r\n    for config in webhosts:\r\n    
    if not os.path.exists(rootFolder+'\/'+config['folder']) or not os.path.isdir(rootFolder+'\/'+config['folder']):\r\n            os.mkdir(rootFolder+'\/'+config['folder'])\r\n        try:\r\n            ftp.connect(config['address'])\r\n            ftp.login(config['username'], config['password'])\r\n            ftp.cwd(config['backup_path'])\r\n            filesToDl = ftp.nlst();\r\n            for file in filesToDl:\r\n                if (file not in ['.', '..']):\r\n                    ftp.retrbinary('RETR '+file, open(rootFolder+'\/'+config['folder']+'\/'+file, 'wb').write)\r\n                    try:\r\n                        # comment this out if you want to keep backup archives on the server\r\n                        ftp.delete(file)\r\n                    except:\r\n                        pass\r\n            ftp.quit()\r\n        except:\r\n            pass\r\n#-------------------------------------------------------\r\n\r\ngetBackupFiles()\r\nmirrorFtp()\r\n\r\n#-------------------------------------------------------\r\n<\/pre>\n","protected":false},"excerpt":{"rendered":"<p>Oh hello again&#8230; man, lately I have no time to keep my blog updated, 
but<\/p>\n","protected":false},"author":2,"featured_media":2574,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6,5],"tags":[943,108,781],"class_list":["post-2573","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-eng","category-my-tutos","tag-backup","tag-python","tag-server"],"_links":{"self":[{"href":"http:\/\/9v.lt\/blog\/wp-json\/wp\/v2\/posts\/2573","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/9v.lt\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/9v.lt\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/9v.lt\/blog\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"http:\/\/9v.lt\/blog\/wp-json\/wp\/v2\/comments?post=2573"}],"version-history":[{"count":0,"href":"http:\/\/9v.lt\/blog\/wp-json\/wp\/v2\/posts\/2573\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"http:\/\/9v.lt\/blog\/wp-json\/wp\/v2\/media\/2574"}],"wp:attachment":[{"href":"http:\/\/9v.lt\/blog\/wp-json\/wp\/v2\/media?parent=2573"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/9v.lt\/blog\/wp-json\/wp\/v2\/categories?post=2573"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/9v.lt\/blog\/wp-json\/wp\/v2\/tags?post=2573"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}