I recently realized that my host supports scheduling scripts via cron. While looking for more information on cron on geeklog.net, I came across a post [*1] by Tony from a few years back, along with a script he wrote to create backups of the GL database.
#!/usr/bin/php -q
<?php
/**
* Date: 2/7/2002
*
* This script will backup a database on a daily,
* weekly and monthly basis. This script will keep only
* 7 daily backups, 4 weekly backups and 12 monthly backups
*
*/
// This directory you specify below expects the following subdirectories: daily/, weekly/, monthly/
$backup_dir = '/home/tony/backups/';
// MySQL account info. This file should not be world-readable unless you don't mind
// your MySQL credentials being viewed.
$mysqlhost = 'localhost';
$mysqluser = 'geeklog';
$mysqlpass = '';
// The database you want to backup
$database = 'geeklog';
// path to mysqldump
$mysqldump = '/usr/bin/mysqldump';
// Change this to point to the location of your tar
$tar_cmd = '/bin/tar';
/** You should not have to touch anything below here **/
/***********************
* DAILY BACKUP
***********************/
$backup_file = date('Ymd') . '_daily_backup.sql'; // Ymd so the filenames sort chronologically
$old_backups = getOldBackups($backup_dir . 'daily/');
// Sort results and remove oldest backup
if (sizeof($old_backups) > 6) {
    rsort($old_backups);
    $oldest_backup = array_pop($old_backups);
    unlink($backup_dir . "daily/$oldest_backup");
}
// Dump the database
exec("$mysqldump -h$mysqlhost -u$mysqluser -p$mysqlpass $database > " . $backup_dir . 'daily/' . $backup_file);
exec("$tar_cmd --directory $backup_dir" . "daily/ -czf $backup_dir" . "daily/$backup_file" . '.tar.gz ' . $backup_file);
unlink($backup_dir . 'daily/' . $backup_file);
/***********************
* WEEKLY BACKUP
***********************/
// If at the end of the week (Sunday), do the weekly backup
if (date('w') == '0') {
    if ($old_backups = getOldBackups($backup_dir . 'weekly/')) {
        if (sizeof($old_backups) > 3) {
            rsort($old_backups);
            $oldest_backup = array_pop($old_backups);
            unlink($backup_dir . "weekly/$oldest_backup");
        }
    }
    copy($backup_dir . 'daily/' . $backup_file . '.tar.gz', $backup_dir . 'weekly/' . str_replace('daily', 'weekly', $backup_file) . '.tar.gz');
}
/***********************
* MONTHLY BACKUP
***********************/
// On the first of the month, do the monthly backup
if (date('d') == '01') {
    if ($old_backups = getOldBackups($backup_dir . 'monthly/')) {
        if (sizeof($old_backups) > 11) {
            rsort($old_backups);
            $oldest_backup = array_pop($old_backups);
            unlink($backup_dir . "monthly/$oldest_backup");
        }
    }
    copy($backup_dir . 'daily/' . $backup_file . '.tar.gz', $backup_dir . 'monthly/' . str_replace('daily', 'monthly', $backup_file) . '.tar.gz');
}
/**
* This simply gets all the files in a directory and loads the
* file names into an array
*
*/
function getOldBackups($dir_path)
{
    if (!is_dir($dir_path)) return false;
    $fd = opendir($dir_path);
    $old_backups = array();
    // Compare against false explicitly: a file named "0" would otherwise end the loop
    while (($f = readdir($fd)) !== false) {
        // readdir() returns bare file names, so prepend the path before testing
        if (!is_dir($dir_path . $f)) {
            $old_backups[] = $f;
        }
    }
    closedir($fd);
    return $old_backups;
}
?>
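For reference, the cron side of this is just a single crontab line; something like the following (the script path is just where I'd put it) would run it nightly at 2:30 AM:

```
30 2 * * * /usr/bin/php -q /home/tony/backups/backup.php
```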
First question: is this script (written back in Feb '02) still viable for backing up GL v1.4.1 sites?
One thing I notice is that the variables for the database are duplicated in the script. Would it be possible to just include the Geeklog config file, so that no username/password/dbname info was stored in the actual script? I know that would make the script geeklog specific, but hey, that's the goal of posting here, right?
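Something like this is what I have in mind. The config path and the $_DB_* variable names are guesses from my own 1.4.x config.php, so verify them against yours:

```php
#!/usr/bin/php -q
<?php
// Reuse Geeklog's own settings instead of hard-coding them in the backup script.
// The path below is specific to my install -- point it at your path_to_geeklog/.
require_once '/home/tony/geeklog/config.php';

$mysqlhost = $_DB_host;
$mysqluser = $_DB_user;
$mysqlpass = $_DB_pass;
$database  = $_DB_name;
// ...rest of the backup script unchanged...
?>
```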
Ultimately, I am looking for a script solution that runs via cron, that will backup my database to an .sql file, and then archive my entire Geeklog install (both public_html/ and path_to_geeklog/ ) to a tar.gz file and then push it to a remote FTP site of my choosing and email me notifications. Icing on the cake would be weekly full backups, and daily incremental backups. Another goal would be to then take the tar.gz backup, and in a local environment (probably Windows based in my case) with a XAMPP install, extract the tar.gz archive, import the .sql database, and then run a grep command (I have a grep tool installed in XP) to change the site name to my local install. The goal is to sync my live site with my local site for development purposes... and backup purposes. :wink:
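To be concrete about what I'm after, here's a rough sketch of the cron job as shell functions. Every path, host, address, and password below is a placeholder, not a working setup; the last line stays commented out until the values are real:

```shell
#!/bin/sh
# Sketch of a full-site backup cron job. All values below are placeholders.
DB=geeklog
DBUSER=geeklog
DBPASS=secret
PUBLIC=/home/user/public_html
PRIVATE=/home/user/geeklog
STAMP=$(date +%Y%m%d)
OUT=/tmp/site_backup_$STAMP.tar.gz

# Dump the database to an .sql file
dump_db()      { mysqldump -h localhost -u "$DBUSER" -p"$DBPASS" "$DB" > "/tmp/$DB.sql"; }
# Archive the dump plus both Geeklog trees
archive_site() { tar -czf "$OUT" "/tmp/$DB.sql" "$PUBLIC" "$PRIVATE"; }
# Push the archive to a remote FTP site (curl can upload over ftp://)
push_ftp()     { curl -T "$OUT" "ftp://user:password@backup.example.com/backups/"; }
# Mail a notification (assumes a working local mail command)
notify()       { echo "uploaded $OUT" | mail -s "site backup $STAMP" you@example.com; }

# Uncomment once the placeholders above are filled in:
# dump_db && archive_site && push_ftp && notify
```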
Anyone have a good solution they can share? I know some things depend upon hosting environment, but I am shooting for the lowest common denominator, shared hosting in a Linux environment. And one thing I've noticed among my different GL sites on different databases is that mysqldump is not always in the same place from server to server. Where are some other logical places to look, in addition to the default /usr/bin/ location?
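For hunting down mysqldump on an unfamiliar shared host, a little helper like this (the function name and the list of extra directories are my own guesses at common layouts) checks $PATH plus the usual out-of-the-way spots such as /usr/local/bin, /usr/local/mysql/bin, and XAMPP/LAMPP's /opt/lampp/bin:

```shell
#!/bin/sh
# Hypothetical helper: search $PATH plus the extra directories where
# shared hosts often put mysqldump. Prints the first match found.
find_bin() {
    name=$1
    for d in $(echo "$PATH" | tr ':' ' ') /usr/local/bin /usr/local/mysql/bin /opt/lampp/bin; do
        if [ -x "$d/$name" ]; then
            echo "$d/$name"
            return 0
        fi
    done
    return 1
}

find_bin mysqldump || echo "mysqldump not found in the usual places"
```

`which mysqldump` or `command -v mysqldump` from an SSH session does the same job when the binary is already on your $PATH.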