A while back I was in a panic. When I went to my site to write an article, I was met with a blank screen. And of course that’s never a good thing.

It was at that moment that I happened to turn around and see the announcement on CNN that wildfires had engulfed part of California. That’s when it dawned on me that my web host is located in California.

I don’t know that I can explain the sinking feeling I got at that moment. On one hand, if that was indeed the cause of the outage, it would give me an excuse to start my site over from scratch. But let’s be honest, that was a rationalization; I would have been frustrated, to say the least.

Thankfully for me it was a migration process gone awry that had caused the downtime, not wildfires. (I came to find out that the fires were further south than my hosting provider.) Nevertheless, it caused me to think about backing up my data.

I was consistent in my backups for a while, but then I somehow forgot that routine backups are important. Recently I revisited the idea, knowing full well that my own inconsistency was the weak point in any regular maintenance schedule. I wanted to change that.

The Automatic Backup Script

I could set a reminder in my calendar to visit my database management application on a regular basis, but why bother? I wrote a small script that will connect to my database and back up the entire thing to a file. The initial script looks something like this:
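Here is a minimal sketch of such a script. All of the credentials and names (`db_user`, `db_name`, and so on) are placeholders, and it assumes `mysqldump` and `gzip` are available on the server:

```php
<?php
// Hypothetical credentials -- replace these with your own.
$host = 'localhost';
$user = 'db_user';
$pass = 'db_password';
$name = 'db_name';

// Directory for the backup files (with a trailing slash).
// Leave empty to use the script's own directory; ideally
// point it somewhere that is not reachable from the web.
$dir = '';

// Verify the credentials before dumping. The terse errors
// ("no 1" for the connection, "no 2" for selecting the
// database) intentionally give nothing useful away.
$link = mysqli_connect($host, $user, $pass) or die('no 1');
mysqli_select_db($link, $name) or die('no 2');
mysqli_close($link);

// Timestamped filename, e.g. db_name-2024-01-31.sql.gz
$file = $dir . $name . '-' . date('Y-m-d') . '.sql.gz';

// Dump the whole database and compress it. Drop "| gzip"
// (and the .gz extension) if you prefer a plain .sql file.
system(sprintf(
    'mysqldump --host=%s --user=%s --password=%s %s | gzip > %s',
    escapeshellarg($host),
    escapeshellarg($user),
    escapeshellarg($pass),
    escapeshellarg($name),
    escapeshellarg($file)
));
?>
```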

Apart from the comments within the code, I’ll mention that if you would prefer not to have the file in gzip format you can leave off “| gzip”. You should also note that the backups will land in the same folder as your script unless you set a directory, as mentioned in the comments. It is a good idea to put the files in a directory that isn’t reachable from the web, so you can be sure that no one can grab your backups and read any sensitive information they contain.
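If you have no choice but to keep the backups inside the web root, a fallback is to deny web access to that directory. A sketch for Apache (assuming `.htaccess` overrides are permitted on your host):

```apache
# .htaccess in the backup directory: block all web requests
# to the dump files. (Apache 2.4 syntax.)
Require all denied
```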


If you are like me, however, you probably have multiple databases lying around, ones that you may forget about in a regular backup routine. And, if you are like me, they are probably on multiple hosts. So, I supercharged the script to connect to a particular host and pull the listed databases into individual backup files.
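A sketch of that multi-host version follows. Again, every host, user, and database name here is a placeholder; the loop simply repeats the single-database dump once per listed database on each host:

```php
<?php
// Hosts and the databases to back up on each -- all
// hypothetical names, to be replaced with your own.
$servers = array(
    'localhost' => array(
        'user'      => 'db_user',
        'pass'      => 'db_password',
        'databases' => array('blog', 'forum'),
    ),
    'db.example.com' => array(
        'user'      => 'other_user',
        'pass'      => 'other_password',
        'databases' => array('shop'),
    ),
);

// Backup directory (trailing slash), ideally outside the web root.
$dir = '';

foreach ($servers as $host => $server) {
    // Same terse error scheme as the single-database script.
    $link = mysqli_connect($host, $server['user'], $server['pass'])
        or die('no 1');

    foreach ($server['databases'] as $name) {
        mysqli_select_db($link, $name) or die('no 2');

        // One gzipped file per host/database pair.
        $file = $dir . $host . '-' . $name . '-' . date('Y-m-d') . '.sql.gz';
        system(sprintf(
            'mysqldump --host=%s --user=%s --password=%s %s | gzip > %s',
            escapeshellarg($host),
            escapeshellarg($server['user']),
            escapeshellarg($server['pass']),
            escapeshellarg($name),
            escapeshellarg($file)
        ));
    }

    mysqli_close($link);
}
?>
```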

Running The Scripts

Being PHP files, you just need to put them somewhere on your website and visit them with a web browser. You won’t see any response in your browser window, however, unless something goes wrong. If something does go wrong, it will likely only tell you “no 1” or “no 2”. These errors tell you which connection step failed. I wrote them like this so that if some unscrupulous person attempts to use the script to gain access to a database, the errors do not give away useful information.

You will probably want to verify that the script actually did something, so just use your FTP program and visit the directory you specified in the script. (If you specified one, that is; otherwise the files are placed in the same directory as the script itself.) You should see your gzipped files staring back at you if everything ran smoothly.

To make your life even easier, you could set up a cron job to automatically make a backup every evening at midnight, for example. The choice is yours, but you may want to verify that the script is in fact running at the regular interval. I have had many cron jobs fail without notification.
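A crontab entry for that midnight run might look like the line below. The paths to the PHP binary and the script are assumptions; adjust them to match your server:

```shell
# Run the backup script every night at midnight.
0 0 * * * /usr/bin/php /home/user/backups/backup.php
```

Since the script only produces output on failure, you can also add `MAILTO=you@example.com` at the top of the crontab so cron emails you when one of those “no 1”/“no 2” errors occurs.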

Good luck and happy “backuping.”

Note: Make sure to back up your database before you run this script for the first time. I don’t see why it would do anything to your database, but that doesn’t mean your web server agrees with me. Just a warning. I also don’t claim that this script will work for you or that it won’t cause any damage, so use it at your own risk.