AWS, Drupal and Caching: pt.4 Backups and more
I have grouped the recovery plan, cron, slowlogs, and Mandrill together into one post because they tie together nicely, each relying on the others in some way. The main task at hand is to have cron run a script every day which grabs the current MySQL slowlogs and then emails them to us via Mandrill. Then we will also set up the other crons to run our backup script (which uses the AWS CLI) and the usual Drupal cron.
I would suggest doing all the crontab work as root or as a sudo user, as otherwise you may get confused: each user has their own crontab file, so it's best to stick to the single root one, which has all the permissions.
First step is to take a look in your crontab file; do this with the below command (the -e option stands for edit).
crontab -e
Drupal/Application standard cron
There won't be much there other than some notes on how to use crontab, and a blank canvas for all your exciting scripts. Let's start by quickly adding our Drupal cron. First log in to your site and get your cron URL from /admin/reports/status, then add the below to crontab and modify it. This will run the Drupal cron.php file every 30 minutes (the first field in a crontab line being minutes). Be aware though that this will likely not work until the EIP/DNS is set up and propagated, as mentioned at the end of pt3 in this series, because the URL is referenced directly here. One way around that is to alter your server's /etc/hosts file to point outgoing requests for your new domain back at itself (127.0.0.1 your.domain.com).
# Drupal crons
*/30 * * * * wget -O - -q -t 1 http://your.domain.com/cron.php?cron_key=sddfgiosidjd930u9uuk
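For reference, the five scheduling fields in a crontab line read, left to right, like this (a quick sketch, with `<command>` standing in for whatever you want to run):

```
# ┌──────── minute (0-59); */30 = every 30 minutes
# │ ┌────── hour (0-23)
# │ │ ┌──── day of month (1-31)
# │ │ │ ┌── month (1-12)
# │ │ │ │ ┌ day of week (0-7; 0 or 7 = Sunday)
# │ │ │ │ │
*/30 * * * * <command>
```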
Now let's set up the EC2 backups. This sounds complex, and by all rights it should be, but I'm afraid it isn't (sorry about that). First, exit crontab and install the AWS CLI using the instructions here (http://docs.aws.amazon.com/cli/latest/userguide/installing.html). The first step is to install pip (the Python package installer), then verify it is installed by checking that its help documentation is returned:
wget https://bootstrap.pypa.io/get-pip.py
sudo python get-pip.py
pip --help
Next we install the AWS CLI using PIP
sudo pip install awscli
Once you have the CLI installed, it is time to configure it (http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started...) with your AWS keys (create new access keys, and record them for future use, by going to https://console.aws.amazon.com/iam/home?#security_credential). My region was 'eu-west-1', and I just left the output format empty.
Now we should be ready to test it out. Go into your EC2 admin page, find the volume ID of the item you would like to back up, and note it down. Then open up a terminal and paste a command like the one below into a new *.sh file (remember to set the executable bit), replacing the volume ID with your own:
#!/bin/bash
/usr/local/bin/aws ec2 create-snapshot --volume-id vol-xxxxxxx --description "$(date +\%Y-\%m-\%d) [Backup of testsite]"
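If you want the cron run to leave a trail, the script can be fleshed out with some basic success/failure logging. This is a sketch only: vol-xxxxxxx is still a placeholder volume ID and the log path is my own assumption, so adjust both. (Outside of crontab the % signs in the date format don't need escaping, which is why the backslashes are dropped here.)

```shell
#!/bin/bash
# Sketch: same snapshot call as above, with basic success/failure logging.
# vol-xxxxxxx is a placeholder volume ID; the log path is an assumption.
AWS=/usr/local/bin/aws
DESC="$(date +%Y-%m-%d) [Backup of testsite]"
LOG=/tmp/ec2-backup.log

if [ -x "$AWS" ]; then
  if "$AWS" ec2 create-snapshot --volume-id vol-xxxxxxx --description "$DESC"; then
    echo "$(date): snapshot requested: $DESC" >> "$LOG"
  else
    echo "$(date): snapshot FAILED: $DESC" >> "$LOG"
  fi
else
  echo "$(date): aws CLI not found at $AWS; wanted: $DESC" >> "$LOG"
fi
```

Tailing the log file after the 2am run then tells you at a glance whether the snapshot request went out.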
Once you run that script, you should now have the site backing up in AWS under snapshots (https://eu-west-1.console.aws.amazon.com/ec2/v2/home?region=eu-west-1#Sn...).
Note: you may hit credentials issues; I had to manually edit the credentials file at one point rather than using the nice 'aws configure' functionality described in their documentation.
If things are working though, then let's just tell our crontab to run this once a day at about 2am (low traffic time).
# EC2 snapshots
0 2 * * * /var/www/scripts/backup/ec2.sh
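Once this has had a chance to fire, it is worth confirming cron actually ran it. On Debian/Ubuntu, cron activity lands in syslog; this is a hedged check, as the log location varies by distro:

```shell
# Look for recent cron activity (Debian/Ubuntu log it to /var/log/syslog).
if [ -f /var/log/syslog ]; then
  grep CRON /var/log/syslog | tail -n 5
else
  echo "No /var/log/syslog here; check your distro's cron log location"
fi
```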
Slow Query logs
The last of the 3 cron commands is another simple one: email us the MySQL slow query logs once a day. Paste this into the crontab file:
# Logging
0 1 * * * mysqldumpslow /var/log/mysql-slow.log | mail -s "slow query log" email@example.com
Now, this may or may not work for you. Most likely, if you run that command (without the cron timing fields at the beginning) you will get an error stating the mail command is not found, which means mail is not yet installed on the server. Even if it were, the message would likely have ended up in a spam inbox (AWS has a history, in the olden days, of its IPs being used to send spam, so I wouldn't rely on one to send email without a lot of TLC). So what to do? Well, Mandrill is part of MailChimp, and is also completely free to use. What does it do? We can tell Postfix (the application on your server which sends out emails) to relay our mail through Mandrill, which will send it for us and also help us track things with some cool graphs, and who doesn't like a cool graph. So let's get started by first installing mail:
sudo apt-get install mailutils
You will still get an error though, as we don't actually have any slow log data yet. Add the below lines into /etc/mysql/my.cnf within the [mysqld] section, then restart MySQL.
slow_query_log = 1
slow_query_log_file = /var/log/mysql-slow.log
long_query_time = 30
If you still get an error along the lines of the file not existing, reduce long_query_time to 1 temporarily, then flush the Drupal cache. (Remember to set it back to 30, or however many seconds you think a slow query is in this app's case.) You may also need to create the blank log file first; if you do, be sure to set the file permissions to match the other MySQL log files so that mysql is the owner (chown mysql:root mysql-slow.log).
Let's start by first ensuring everything is installed and configured as we want it to be:
sudo apt-get install postfix mailutils libsasl2-2 ca-certificates libsasl2-modules
sudo apt-get install --reinstall postfix
Now we should just have to follow these instructions (http://opensourcehacker.com/2013/03/26/using-postfix-and-free-mandrill-e...). Start by editing main.cf (vim /etc/postfix/main.cf) and adding these lines to the bottom:
smtp_generic_maps = hash:/etc/postfix/generic
inet_protocols = ipv4
smtp_sasl_auth_enable = yes
smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
smtp_sasl_security_options = noanonymous
smtp_use_tls = yes
Then edit relayhost to look like below:
relayhost = [smtp.mandrillapp.com]
Exit this file, and now we will edit sasl_passwd (vim /etc/postfix/sasl_passwd). Get your API credentials for Mandrill from https://mandrillapp.com/settings and modify the below
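The entry should follow the standard Postfix sasl_passwd format of host, then username:password; MANDRILL_USERNAME and MANDRILL_API_KEY below are placeholders for your own credentials:

```
[smtp.mandrillapp.com] MANDRILL_USERNAME:MANDRILL_API_KEY
```

After saving, run `sudo postmap /etc/postfix/sasl_passwd` so the hash: map referenced in main.cf gets built, and consider `chmod 600` on both files since they hold credentials.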
Next we will install the SSL certs (if your main.cf does not already reference it, also add smtp_tls_CAfile = /etc/postfix/cacert.pem so Postfix actually uses the bundle):
cat /etc/ssl/certs/Thawte_Premium_Server_CA.pem | sudo tee -a /etc/postfix/cacert.pem
And finally restart the whole shebang
service postfix reload && service postfix restart
Give it a test; you should see your emails appear in the outgoing section of Mandrill, and also arriving in your inbox. If not, take a look in the various mail logs as well as 'mailq'. I also tweaked /etc/postfix/generic to map the email address it wanted to send from (firstname.lastname@example.org) to my actual email (email@example.com). If you do this, remember to run postmap after your changes (postmap /etc/postfix/generic; service postfix restart), as the db file may not be created otherwise.
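For the test itself, something like the below works; it's a sketch, with email@example.com as a placeholder recipient:

```shell
# Send a one-line test message through the new relay.
# email@example.com is a placeholder recipient address.
BODY="Mandrill relay test $(date +%Y-%m-%d)"
if command -v mail >/dev/null 2>&1; then
  echo "$BODY" | mail -s "Mandrill relay test" email@example.com \
    || echo "send failed; check mailq and /var/log/mail.log"
else
  echo "mail not installed; sudo apt-get install mailutils"
fi
```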
Is this it? Weeeell, I had a few slight issues. One was that after the first email went out I realised the clock was an hour out, as the server was on GMT rather than BST, so I needed a quick fix for that: <pre>ln -sf /usr/share/zoneinfo/Europe/London /etc/localtime</pre>.
Cultivating a better future, with the seeds of better websites
For several years I evolved through one of the largest media charities in the UK, ending up in the role of Senior Developer. I loved it there, but have now stepped out on my own as a freelance developer.
My main skills are with the Drupal CMS: developing code for it, ensuring standards through projects (whether that's design, development, or planning), and leading teams to build applications they can be proud of.
Digital Consultancy, Web Development and Project Architecture are where I try to focus my skills, supporting my passion and desire to create stunning websites on time, in budget, and meeting your objectives.
To build a powerful web presence, I combine my creative, technical, and managerial experience (alongside a good splash of passion) which I have cultivated over 10 years in the industry, creating websites people want to shout about.