How To Automate Backups For Your Online Store And Why I’m An Idiot


You are probably wondering why I randomly chose to write about how to automate backups for your online store or business. After 2 years and 3 months of running like a champ, our online store database crashed a couple of weeks ago.

It wasn’t one of those mini crashes where you can simply fix up a broken table or two. Instead, it was a complete crash in which the entire database got hosed.

How does one hose an entire database? Well, that day the server our store was hosted on was on the fritz and kept going down. The final straw came when a few of our customers were in the midst of checking out; the server went out for an extended period and corrupted our database.


Photo By Library Mistress

I’m An Idiot

This all happened while I was at my day job, so I began freaking out. The worst part was that I hadn’t backed up the database for nearly 2 months! Two months’ worth of customer data, orders and new product updates was lost, and I had no recent backup!

And that’s not even considering the revenue we were losing, or how unprofessional it looked to display a buggy website. All told, our website was down for more than an entire day.

In a frantic effort to get things running again, I had to manually restore an old database and enter the orders from the past few months by hand, which thankfully we had a record of via email. Meanwhile, our phone was ringing off the hook because the customers who had been checking out were wondering what happened.

What also sucked was that the old backup was missing a ton of new products that we added during those two months. It was a complete disaster!

Make Backups Automated

Don’t be dumb like me. Make sure your backups are automated so you don’t forget, because eventually you will. Platforms like WordPress have nice plugins that handle this, but not all web software does.

Our shopping cart software did not offer this feature so I wrote a little script to handle automated backups. Below is a tutorial of what I did. It is not a difficult thing to do and only took me a half hour to implement. If you are not comfortable with basic scripting, you should learn or find someone to help you.

Perl Script To Perform The Backup

It all starts with a script. You need to write a mini program that handles the actual commands to back up your MySQL database. The following is the script that I whipped up to handle the backup. If you feel inclined to look through the code, comments are preceded by the # sign. It isn’t the most elegant code that I’ve ever written, but it gets the job done.

use strict;
use File::Basename;
use File::Copy;
use File::Path;
use Cwd;

#Specify backup directory and max backups
my $maxBackups = 60;
my ($sec,$min,$hour,$mday,$mon,$year,$wday,$yday,$isdst) = localtime time;
my $backupdir = "[backup directory]";

$mon++;
$year+=1900;

#Format the month, day, min, sec to 2 digit decimal for sorting purposes
$mon = sprintf("%02d",$mon);
$mday = sprintf("%02d",$mday);
$hour = sprintf("%02d",$hour);
$min = sprintf("%02d",$min);
$sec = sprintf("%02d",$sec);

#Open the backup directory and delete the oldest archive if the number of backups exceeds $maxBackups
chdir $backupdir or die "can't change dir to $backupdir: $!";
opendir DIR, "." or die "can't open $backupdir: $!";
#Only consider our own backup archives, not other files that happen to be in the directory
my @files = grep { -f $_ && /^sqlBackup_/ } readdir(DIR);
closedir DIR;

if (scalar(@files) >= $maxBackups){
    print "Number of backups ".scalar(@files)." exceeds $maxBackups\n";
    #Zero-padded timestamps sort lexically in date order, so the first entry is the oldest
    my @sorted = sort @files;
    unlink shift(@sorted) or warn "could not delete oldest backup: $!";
}

#Run the backup command
my $time = "${year}_${mon}_${mday}_${hour}_${min}_${sec}";
print $time;
my $outname = "sqlBackup_${time}.sql";
my $outzip = "sqlBackup_${time}.zip";

my $sql = "mysqldump --add-drop-table --user [database_user] --password=[password] [Database_name] > $outname";
`$sql`;
die "mysqldump failed: $?" if $?;

#Zip up the results
my $zip = "zip $outzip $outname";
`$zip`;

unlink $outname;

The script above provides the following functionality:

  • It runs the mysqldump command to back up the database into a text file
  • It writes the backup into a directory of my choice
  • It keeps a record of the last 60 backups. When there are 60 backups, it automatically deletes the oldest one
  • To save space, it zips up the backup into an archive.

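A quick way to see why the deletion logic is safe: because every field in the file name is zero-padded, a plain lexical sort orders the backups chronologically, so the first name in sorted order is always the oldest archive. A small shell demonstration (the file names here are made up):

```shell
# Lexical sort of zero-padded timestamps equals chronological sort:
# the first entry is the oldest backup, i.e. the one to delete first.
printf '%s\n' \
  sqlBackup_2009_10_05_00_00_00.zip \
  sqlBackup_2009_02_14_00_00_00.zip \
  sqlBackup_2009_07_01_00_00_00.zip | sort | head -n 1
```

This is the reason the script formats the month, day, hour, minute and second to two digits before building the file name.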
Even if the entire script went way over your head, you should at least become acquainted with the mysqldump command, which takes a snapshot of your entire database and saves it to a file. The syntax is

mysqldump --add-drop-table --user [database_user] --password=[password] [Database_name] > backup.sql

Fill in [database_user] with your username, [password] with your password and [Database_name] with your database name. Run the command in any Unix shell and you’ll have a backup.
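A backup is only useful if you can restore it, so it pays to practice the reverse operation before disaster strikes. This is a sketch using the same placeholders as above and a made-up archive name; note that because the dump was created with --add-drop-table, restoring will drop and recreate the existing tables:

```shell
# Unzip the archive, then replay the SQL dump into the database.
# [database_user], [password] and [Database_name] are placeholders.
unzip sqlBackup_2009_07_01_00_00_00.zip
mysql --user [database_user] --password=[password] [Database_name] < sqlBackup_2009_07_01_00_00_00.sql
```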

Running A Cron Job

Cron is a tool that allows you to schedule recurring tasks automatically. In my case, I wanted to schedule a cron job to fire once a day and run the Perl script that I wrote above.

There are two ways to do this. If you have SSH access to your web host, you can use the crontab command. Some hosts also offer a cron tool as part of their web-based control panel. Use whichever interface you are comfortable with.

I’m more comfortable with Unix, so I set up my cron job via the command line. This is what I did.

Type ‘crontab -e’. This will bring up an editor in which you can set up the schedule for your cron job. It obeys the following syntax.

* * * * * command to be executed
- - - - -
| | | | |
| | | | +----- day of week (0 - 6) (Sunday=0)
| | | +------- month (1 - 12)
| | +--------- day of month (1 - 31)
| +----------- hour (0 - 23)
+------------- minute (0 - 59)

I want to back up my database once a day at 12:00am, so I set up my cron job to look like this

0 0 * * * mybackupscript.pl

Once you are done, cron will run mybackupscript.pl once a day at midnight. If your host has mail configured, cron will also email you the script’s output, which doubles as confirmation that the backup ran.
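For anyone adapting this, a slightly more robust crontab entry uses an explicit interpreter and absolute paths, since cron runs with a minimal environment and may not find your script or perl otherwise. This is a sketch; the interpreter path, script path and MAILTO address are placeholders to adjust for your own host:

```shell
# Send the job's output (including any error messages) to this address
MAILTO=you@example.com
# Run the backup script daily at midnight with an explicit interpreter
# and an absolute path, so cron's bare environment can find everything
0 0 * * * /usr/bin/perl /home/youruser/scripts/mybackupscript.pl
```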

Don’t Get Cocky

For those first 2 years and 3 months, I felt like my website was invincible. But everything goes down eventually. Don’t trust yourself to make regular backups. Leave that to a computer!

Even if everything written in this article went over your head, just remember to ask your webmaster or web designer about automating backups using mysqldump and cron.

Ready To Get Serious About Starting An Online Business?


If you are really considering starting your own online business, then you have to check out my free mini course on How To Create A Niche Online Store In 5 Easy Steps.

In this 6 day mini course, I reveal the steps that my wife and I took to earn 100 thousand dollars in the span of just a year. Best of all, it's absolutely free!



19 thoughts on “How To Automate Backups For Your Online Store And Why I’m An Idiot”

  1. You’re not an idiot. This kind of stuff happens to the best, which you are.
    This is really good advice. I always love how you write from experience.
….and yeah, lol, the programming kinda went over my head.

    1. Hey Shea,

      Well let’s see. I had the knowledge to backup my store automatically. I intended to do so at some point. I saw the need but still didn’t do it. That’s pretty dumb in my book. Oh well. Never again. Thank god everything has recovered.

  2. Wow Steve, sorry to hear about this. If it makes you feel any better, my laptop fried its motherboard today. It’s under warranty, fortunately.

    1. Thanks Jon, but your misfortune is not something that would make me feel better:) Hope all of your data was intact!

  3. B7 says:

Thanks for sharing, Steve! I guess it’s part of human nature to not worry about it until we get screwed.

    The same thing happened to me with my home/work computer. I lost all my documents, music, pictures and all the data for my business, and all my client data too!

    My solution is automated online backup (mozy.com works pretty well). Yes, it costs $5 per month. However, it will automatically back up everything nightly, and there is not much of a space limitation. I guess computers have become so important in our lives that we need certain data backed up constantly.

  4. Chris says:

    Actually yes, you are an idiot. So are others who don’t have a backup for a crucial part of their business. But you’re even more of an idiot for posting about it online.

    1. Hey Chris,

      Can’t argue with you there. But the point of the article is to teach others how to automate their backups so it doesn’t happen to anyone else. I was performing backups manually before. Now, I will never lose data again. Live and learn

      1. B says:

        Steve, it’s amazing that you actually had to explain the obvious, regarding Chris’s not-so-kind remark. Thanks for the help.

  5. What doesn’t kill you will only make you stronger. Thank you for the article.

    I am not as tech savvy as you, so even though I have backup of my blog (which I’m thinking of changing the frequency to daily now that I read this), I’m not so sure if I can restore it once the time comes.

    I guess I have to start practicing.

  6. You’re not an idiot. It’s hard to manage a business AND work full-time. I am pretty much doing everything too and because I am not tech savvy, I have to learn this one step at a time. Though I have NO idea what you just wrote (codes, etc), I will keep this post in mind for future reference.

  7. I learned to do this with my personal computer as well. I have 1400 GB of hard drive space on my home system. I don’t want anything on it lost; as that equals hours of time recovering media or projects that simply will not be replaced.

    I set my computer to automatically backup, update, virus scan, spyware scan, ect. Thank goodness for automation! : )

  8. And for those of us on WordPress, there’s the handy-dandy WP-DB-Backup plugin. It’s so easy, there’s really no excuse to risk your site. I have it set to backup on a weekly basis.

    ari

  9. Hey Steve, glad you finally got the backup solution up and running. Are you storing your backups off your web server though? It’s great to have everything being dumped to a file but if your hard drive crashes or there’s a hardware issue, you’ll be in the same spot you were before. Add a few lines to your perl script to FTP those backup files to your home computer or to another server. I’m sure you’ve probably already setup something like this, but for anyone else who hasn’t thought to do this, it is NOT your web host’s responsibility to backup your hard drive. Even if some hosts offer it as a feature, rely on them as much as you’d rely on a nanny to teach your children about life, sure they can do it but they might not have your best interests in mind…

    1. Hey Jace,
      Thanks for the tip. I actually added a few extra lines to the script to automatically email me the backups to several email accounts. I also have a NAS at home running RAID 5 where I store everything of value. Definitely not going to make the same mistake twice.

  10. I.Adam says:

    hmm, interesting but a bit technical stuff. I am afraid it is not for non-tech guys. Lazy admins should use some kind of cloud backup service. Of course a good one but I like free. I use EverLive.net to take website backups to the cloud. Their autobackup service is great.

Anyway, whatever method you use, make sure it is quick & easy to restore.
