Computers need backups. Hardware will fail. Are you prepared for a disaster? Will you know how to recover your important data?
I like backing up my data to Google Drive. I bought a Chromebook years ago, and I have been a Google user since the Gmail invite days, so I have around 120GB of free cloud-based storage at my disposal. If I ever fill it up, their rates to expand are pretty low. So, given this, I would sleep better if the database for Typica, running on a six-year-old Mac Mini, were exported every few days to my Google Drive for safekeeping.
Here’s a funny thing. Google is a firm believer in Linux. They use Linux on their hardware, Googlers’ desktops reportedly mostly run Linux, Chromebooks run Linux, and Android is essentially Linux. And yet there is no official Google Drive client from Google for Linux. Seriously. This is so odd to me.
However, a Googler has contributed to an open-source project that provides a useful way to upload database exports to Google Drive.
Exporting the Database
In the previous blog post on setting up the database, I had you configure your user account with special privileges for backups. That was so you could run the pg_dump command like so:
pg_dump typica > typica.export.$(/bin/date +\%Y\%m\%d\%H\%M)
You should try this now and make sure it works without access errors to the database. This is the basic command for exporting the database, and it needs to run cleanly before we can proceed. Feel free to open the export file with more or less to examine the SQL commands that will rebuild your database after a disaster.
In the example I created the destination filename with a date string. The date string is generated automatically at run time because the shell executes Yet Another Command to capture the current date in a convenient sortable format: YYYYmmddHHMM. This is specified by wrapping the date command in the symbols “$(” and a closing parenthesis.
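You can try the command substitution on its own at a terminal to see what filename it produces. A minimal sketch (note that the backslashes before the “%” signs in the pg_dump example above are not needed at an interactive shell; they matter later when the command runs under cron):

```shell
# The $( ... ) construct runs the inner command and substitutes its output.
# This captures a sortable YYYYmmddHHMM timestamp for the export filename.
ts=$(date +%Y%m%d%H%M)
echo "typica.export.$ts"   # e.g. typica.export.201603230645
```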
Google Drive client
The software package I am using to send my backups to the cloud is called gdrive. It is written in Go, Google’s programming language. You can obtain the source and some binaries here. I punted on building it from source for the moment; I downloaded a binary and put it in /usr/local/bin.
Once you have the executable in your path, begin by setting up the authorization between your local computer account using gdrive and your Google account.
You can execute “gdrive help” to see a lengthy list of operations you can perform with the client. It is very full featured, just perhaps a little unwieldy for complex operations. Fortunately for what we want, it will be pretty easy to keep it simple.
To get the authorization set up between Google and your computer, execute the gdrive command with any operation besides “help”. I recommend starting with “list”.
user@local~ $ ./gdrive list
Authentication needed
Go to the following url in your browser:
https://accounts.google.com/o/oauth2/auth?access_type=offline&client_id=36713-farfarfarfapipick.apps.googleusercontent.com&redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&response_type=code&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdrive&state=state
Enter verification code:
The first time you run this it will prompt you with a URL and wait for you to return an authorization code before it proceeds. Follow the instructions on screen.
Set Up a Google Drive Subfolder
A “gdrive list” should show a confusing array of folders and documents in your Google Drive. It will look something like this:
0B4UCLDdeCDdisUdN33dtHeRealC0dE   typica.export.201603230645   bin   64.4 KB   2016-03-23 06:45:02
You cannot operate on files and folders with the logical, pretty names you see in the browser or other clients. You have to use the cryptic code that precedes the logical information. By default it will list only 30 items, and they seem to be those most recently modified.
Go to Google Drive in your browser and create a folder to contain your exports, something descriptive like “Typica Database Exports”. Go into that folder and observe the address URL. The cryptic code following “folders” in the URL is the folder ID you need to specify as the destination for the files you wish to upload.
You can copy it from the address line, or find it with “gdrive list”.
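If you prefer to stay in the terminal, the folder ID can be pulled out of the listing output. A minimal sketch, assuming the column layout shown in the listing above; the ID and filename here are the made-up placeholder values from that listing, not real ones:

```shell
# A sample line in the format "gdrive list" prints (ID, name, type, size, date).
line='0B4UCLDdeCDdisUdN33dtHeRealC0dE   typica.export.201603230645   bin   64.4 KB   2016-03-23 06:45:02'

# The ID is the first whitespace-separated column.
id=$(printf '%s\n' "$line" | awk '{print $1}')
echo "$id"   # 0B4UCLDdeCDdisUdN33dtHeRealC0dE
```

In practice you would pipe the real “gdrive list” output through a grep for your folder name before the awk step.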
Now we will attempt to upload the file you created in your pg_dump demo to your specially prepared folder. We specify the folder with the command line option “-p” followed by the code. The filename to upload comes at the end of the command line.
gdrive upload -p 3BuSyB33sW3nt2SaMEfL0w3r1Day typica.export.201603172133
The file should show up in the folder in your browser in short order.
Export to Google Drive Through a Pipeline
I think the ideal way to export the database to the cloud is to not even write it to disk before uploading it; if you write it to disk, you will just have to delete it afterward. The export through a pipeline works for now. I don’t know how large the database will grow, or whether this pipelining method will run into memory problems in the future, but that is something you will need to consider and I may have to revisit.
In the first export I had you perform, I had you redirect the output to a file with the greater-than symbol (“>”). Without that redirection, the output would have printed to the screen.
We can use another type of redirection called a pipeline, written with the special character “|”. This tells the Linux system to send the output of one command to another executable as input. The gdrive upload operation has an option to read its input from STDIN instead of a file; this is the dash (“-”) in my example below. We still use the “-p CODE” option to specify the destination folder, and we still follow it with the name you want the file to be called when it is created in the cloud.
One last item before you schedule this in the cron scheduling facility: we will want to redirect both STDOUT (“>”) and STDERR (“2>&1”) to a file. If we do not, the local mail facility on the Linux system will end up with many extraneous messages from each execution of the command.
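Here is a small, self-contained illustration of that combined redirection; the messages and the /tmp path are just placeholders:

```shell
# Write one line to STDOUT and one to STDERR, then send both to the same log.
# The order matters: "> file" points STDOUT at the file first,
# then "2>&1" points STDERR at wherever STDOUT is currently going.
{ echo "normal output"; echo "an error message" >&2; } > /tmp/redir_demo.log 2>&1

cat /tmp/redir_demo.log   # both lines appear in the log
```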
Test this final command from your terminal. When it executes cleanly, with nothing written to the display, schedule it in cron:
/usr/bin/pg_dump -d typica |/usr/local/bin/gdrive upload - -p 3BuSyB33sW3nt2SaMEfL0w3r1Day typica.export.$(/bin/date +\%Y\%m\%d\%H\%M) >/home/kevin/db_backup.log 2>&1
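A crontab entry for this might look like the sketch below; the schedule, the folder ID, and the log path are assumptions you should adjust for your own system. Note that cron treats an unescaped “%” in the command field as a newline, which is why the date format keeps its backslashes here:

```shell
# min hour dom mon dow  command -- example: run at 02:15 every third day
15 2 */3 * * /usr/bin/pg_dump -d typica | /usr/local/bin/gdrive upload - -p 3BuSyB33sW3nt2SaMEfL0w3r1Day typica.export.$(/bin/date +\%Y\%m\%d\%H\%M) >/home/kevin/db_backup.log 2>&1
```

Add it with “crontab -e” under the account you configured for passwordless database access.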