Welcome to Geeklog Sunday, January 17 2021 @ 02:10 am EST

Geeklog Forums

Backup database via cron

Status: offline


Forum User
Full Member
Registered: 10/02/04
Posts: 176
Location: Boise, Idaho
I recently realized that my host supports scheduling scripts via cron. While looking for more info on cron on geeklog.net, I came across this story Tony posted a few years back, along with a script he wrote to create backups of the GL database.

PHP Formatted Code
#!/usr/bin/php -q
<?php

/**
* Date: 2/7/2002
* This script will backup a database on a daily,
* weekly and monthly basis.  This script will keep only
* 7 daily backups, 4 weekly backups and 12 monthly backups
*/

// The directory you specify below expects the following subdirectories: daily/, weekly/, monthly/
$backup_dir = '/home/tony/backups/';

// MySQL account info.  This file should not be world readable unless you don't
// mind your MySQL credentials being viewed.
$mysqlhost = 'localhost';
$mysqluser = 'geeklog';
$mysqlpass = '';

// The database you want to backup
$database = 'geeklog';

// Path to mysqldump
$mysqldump = '/usr/bin/mysqldump';

// Change this to point to the location of your tar
$tar_cmd = '/bin/tar';

/** You should not have to touch anything below here **/

$backup_file = date('mdY') . '_daily_backup.sql';

$old_backups = getOldBackups($backup_dir . 'daily/');

// Sort results and remove oldest backup
// (getOldBackups() returns false if the directory is missing)
if (is_array($old_backups) && sizeof($old_backups) > 6) {
    $oldest_backup = array_pop($old_backups);
    unlink($backup_dir . "daily/$oldest_backup");
}

// Dump the database
exec("$mysqldump -h$mysqlhost -u$mysqluser -p$mysqlpass $database > " . $backup_dir . 'daily/' . $backup_file);
exec("$tar_cmd --directory $backup_dir" . "daily/ -czf $backup_dir" . "daily/$backup_file" . '.tar.gz ' . $backup_file);
unlink($backup_dir . 'daily/' . $backup_file);

// If at end of week do weekly backup
if (date('w') == '0') {
    if ($old_backups = getOldBackups($backup_dir . 'weekly/')) {
        if (sizeof($old_backups) > 3) {
            $oldest_backup = array_pop($old_backups);
            unlink($backup_dir . "weekly/$oldest_backup");
        }
    }
    copy($backup_dir . 'daily/' . $backup_file . '.tar.gz', $backup_dir . 'weekly/' . str_replace('daily', 'weekly', $backup_file) . '.tar.gz');
}

// If at end of month do monthly backup
if (date('d') == '01') {
    if ($old_backups = getOldBackups($backup_dir . 'monthly/')) {
        if (sizeof($old_backups) == 12) {
            $oldest_backup = array_pop($old_backups);
            unlink($backup_dir . "monthly/$oldest_backup");
        }
    }
    copy($backup_dir . 'daily/' . $backup_file . '.tar.gz', $backup_dir . 'monthly/' . str_replace('daily', 'monthly', $backup_file) . '.tar.gz');
}

/**
* This simply gets all the files in a directory and loads the
* file names into an array
*/
function getOldBackups($dir_path)
{
    if (!is_dir($dir_path)) return false;

    $fd = opendir($dir_path);
    $old_backups = array();
    $index = 1;
    while (($f = @readdir($fd)) !== false) {
        // Skip '.', '..' and any subdirectories; check against the full path,
        // since the cwd under cron is unpredictable
        if (!is_dir($dir_path . $f)) {
            $old_backups[$index] = $f;
            $index++;
        }
    }
    closedir($fd);

    return $old_backups;
}

?>

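To run the script above unattended, point cron at it. Assuming it is saved as /home/tony/backups/dbbackup.php (a hypothetical path) and marked executable, a crontab entry like the following would run it every night at 3:15 am; the #! line at the top of the script lets cron execute it directly:

```shell
# m h dom mon dow  command
15 3 * * * /home/tony/backups/dbbackup.php >/dev/null 2>&1
```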


Here are a couple questions:

1- Is this script still viable for backing up gl v1.4.1 sites? (written back in Feb '02)

2-Can anyone share any improvements to it?

One thing I notice is that the variables for the database are duplicated in the script. Would it be possible to just include the Geeklog config file, so that no username/password/dbname info was stored in the actual script? I know that would make the script geeklog specific, but hey, that's the goal of posting here, right? Big Grin
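Including Geeklog's config file is one way to avoid the duplication; another, independent of Geeklog, is MySQL's option-file mechanism: mysqldump can read host, user, and password from a file given with --defaults-extra-file, so no credentials appear in the script or on the command line. A sketch with hypothetical paths and values:

```shell
# Hypothetical credentials file -- keep it outside the web root, mode 600:
cat > /tmp/gl_backup.cnf <<'EOF'
[client]
host     = localhost
user     = geeklog
password = secret
EOF
chmod 600 /tmp/gl_backup.cnf

# The dump then carries no password on its command line
# (commented out here -- it needs a live MySQL server):
# mysqldump --defaults-extra-file=/tmp/gl_backup.cnf geeklog > geeklog.sql
```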

Ultimately, I am looking for a script solution that runs via cron, backs up my database to an .sql file, archives my entire Geeklog install (both public_html/ and path_to_geeklog/) to a tar.gz file, pushes that to a remote FTP site of my choosing, and emails me notifications. Icing on the cake would be weekly full backups and daily incremental backups.

Another goal would be to take the tar.gz backup and, in a local environment (probably Windows-based in my case) with a XAMPP install, extract the archive, import the .sql database, and then run a grep command (I have a grep tool installed in XP) to change the site name to my local install. The goal is to sync my live site with my local site for development purposes... and backup purposes. :wink:
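Most of that wish list can be sketched in a few lines of shell, leaving the host-specific parts (FTP client, mail command) as placeholders. Everything below uses made-up /tmp paths so it is safe to try; the mysqldump, curl, and mail lines are illustrative and commented out:

```shell
#!/bin/sh
# Nightly full-backup sketch: dump DB, tar the site, push off-site, notify.
SITE=/tmp/gl_demo_site           # stand-in for public_html / path_to_geeklog
BACKUPS=/tmp/gl_demo_backups
STAMP=$(date +%m%d%Y)            # same mdY convention as the PHP script above

mkdir -p "$SITE" "$BACKUPS"
echo "placeholder" > "$SITE/index.html"

# 1) Dump the database (needs a live MySQL server, so commented out here):
# mysqldump -hlocalhost -ugeeklog -pPASS geeklog > "$BACKUPS/${STAMP}_geeklog.sql"

# 2) Archive the site files into one tarball:
tar -czf "$BACKUPS/${STAMP}_full_backup.tar.gz" \
    -C "$(dirname "$SITE")" "$(basename "$SITE")"

# 3) Push it off-site and notify -- both depend on what the host provides;
#    curl's FTP upload and a mail binary are common but not guaranteed:
# curl -T "$BACKUPS/${STAMP}_full_backup.tar.gz" ftp://user:pass@backup.example.com/
# echo "Backup ${STAMP} done" | mail -s "Site backup" you@example.com
```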

Anyone have a good solution they can share? I know some things depend upon hosting environment, but I am shooting for the lowest common denominator: shared hosting in a Linux environment. And one thing I've noticed among my different GL sites on different servers is that mysqldump is not always in the same place. Where are some other logical places to look, in addition to the default /usr/bin/ location?
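On the "where is mysqldump" question, the shell can answer it directly; command -v is POSIX and should work on any shared Linux host. The alternative directories listed are common vendor install locations, not guaranteed ones:

```shell
# 'command -v' prints the full path of a program if the shell can find it:
command -v tar

# mysqldump may not be on cron's minimal PATH; beyond /usr/bin, common spots are:
ls /usr/local/bin/mysqldump /usr/local/mysql/bin/mysqldump \
   /opt/mysql/bin/mysqldump 2>/dev/null || true
```

Note that cron jobs often run with a stripped-down PATH, which is one reason the script above hard-codes the full path to mysqldump.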


Synergy - Stability - Style --- Visit us at glfusion.org

Status: offline


Site Admin
Registered: 12/01/02
Posts: 13073
Location: Stuttgart, Germany
Quote by: geiss

1- Is this script still viable for backing up gl v1.4.1 sites? (written back in Feb '02)

Yep. It has been in constant use ever since, right here on geeklog.net.

And why not? It just creates a backup of the entire database. You can even use it for other databases, not related to Geeklog.

I guess it should be possible to include config.php to avoid duplicating the DB settings. I wouldn't include lib-common.php, since that would create a session, etc. Too much overhead, IMO.

bye, Dirk

Status: offline


Forum User
Full Member
Registered: 29/08/05
Posts: 985
I have a lib-minigl.php that loads as little of lib-common as possible while still supporting sessions. It was designed for creating pages that serve up non-HTML content (images, CSS, JS), where you don't need all the HTML machinery; you just need to know that the person is who they claim to be. It is in this forum post.

If you didn't want sessions, this becomes much smaller.

Status: offline


Forum User
Full Member
Registered: 04/08/03
Posts: 1298
It is good to talk about backing up. The built-in backup of GL has never worked on my webspaces, even though I asked support for help. It seems it can't work under Confixx; there's no problem with cPanel.

At present I'm testing http://www.phpmybackuppro.net/index.php. It can dump and gzip the MySQL database manually, via pseudo-cron, or via real cron, and it can store the result, send it by email, or upload it by FTP. That's good when you have many sites and want to back up all the MySQL databases to one big mail account somewhere. I plan to install this with every new GL installation.

Unfortunately it cannot yet archive a public_html directory, but the only things that change there are the article images.

Have another look at http://www.mysqldumper.de/en/. It claims to be able to handle large MySQL files reliably. It works with email and FTP, but only backs up MySQL databases.

As for me, I'd like to back up the MySQL databases and parts of the other files, either fully or incrementally.
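For the "weekly full, daily incremental" idea, GNU tar can do incrementals on its own through a snapshot file (--listed-incremental). This is a tar feature, not anything Geeklog-specific, and it needs GNU tar, which typical Linux shared hosts have; all paths below are made up for the sketch:

```shell
#!/bin/sh
# Weekly-full / daily-incremental sketch with GNU tar's snapshot mechanism.
SRC=/tmp/gl_demo_html
DST=/tmp/gl_demo_incr
SNAP="$DST/site.snar"            # snapshot file: tar records file state here

mkdir -p "$SRC" "$DST"
echo one > "$SRC/a.txt"

# Sunday: full backup -- delete the snapshot so everything is included
rm -f "$SNAP"
tar --listed-incremental="$SNAP" -czf "$DST/full.tar.gz" \
    -C "$(dirname "$SRC")" "$(basename "$SRC")"

# Monday: only files added or changed since the full run are archived
echo two > "$SRC/b.txt"
tar --listed-incremental="$SNAP" -czf "$DST/incr1.tar.gz" \
    -C "$(dirname "$SRC")" "$(basename "$SRC")"

# To restore: extract full.tar.gz first, then each incremental in order.
tar -tzf "$DST/incr1.tar.gz"
```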

