Common DFOS tools:

dfos = Data Flow Operations System, the common tool set for DFO

Note: The tool is now distributed through the utilities package (utilPack).



This infrastructure tool supports regular backups of operational software and data. It is intended to be called once a week as a cronjob. It copies a defined set of data into two places: one on the raid5 data disk ($BACKUP_DIR1), the other under the home directory ($BACKUP_DIR2). The goal of the backups is to have copies of the operationally critical software available in case of data loss or unintentional data deletion.

The strategy for backups is driven by the following:

Hence a weekly customized backup using dfosBackup, kept in two different places in the system, combines the security of the raid5 system with the available tape backups, and extends the backup depth to about 10 weeks (configurable).

Because of the backup size, the data to be backed up need to be selected carefully. The backup contains:

The user can add files or directories to the configured set. A current backup on the GIRAFFE system with the standard configuration takes 90 MB.

There is an option to call a plugin script at the end. That script could, for example, transfer the backup to another machine, to a CD writer, etc.
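As an illustration, such a plugin could pick the newest date-named backup directory and transfer it elsewhere. The sketch below is hypothetical (the real plugin interface is not specified here); it runs on throw-away demo data, and the commented scp target is a placeholder:

```shell
#!/bin/sh
# Hypothetical sketch of a PLUGIN_FINAL script (pgi_dfosBackup).
# All paths below are demo placeholders, not the real backup locations.
BACKUP_DIR1=$(mktemp -d)
mkdir -p "$BACKUP_DIR1/2024-01-01" "$BACKUP_DIR1/2024-01-08"

# pick the newest date-named backup directory (dates sort lexically)
LATEST=$(ls -1 "$BACKUP_DIR1" | sort | tail -n 1)
echo "newest backup: $LATEST"

# a real plugin might now do something like:
#   scp -r "$BACKUP_DIR1/$LATEST" backup_host:/remote/backup/   # placeholder host
```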

The total number of backups kept can be configured. If that number is set to e.g. 10 and the tool is called weekly, backup no. 11 will overwrite backup no. 1, etc. Hence a backup depth of 10 weeks is offered. This is a reasonable trade-off between security and backup data volume.
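The rotation described above can be sketched in shell: keep the $KEEP_NLATEST newest date-named backup directories and remove the rest. This is a minimal illustration on demo data, not the tool's actual implementation:

```shell
#!/bin/sh
# Sketch of the KEEP_NLATEST rotation on demo data (directory names are illustrative).
BACKUP_DIR=$(mktemp -d)
KEEP_NLATEST=3

# simulate five weekly backup directories, named by date
for d in 2024-01-01 2024-01-08 2024-01-15 2024-01-22 2024-01-29; do
    mkdir -p "$BACKUP_DIR/$d"
done

# sort newest first, skip the KEEP_NLATEST newest, remove everything older
ls -1 "$BACKUP_DIR" | sort -r | tail -n +$((KEEP_NLATEST + 1)) | while read -r old; do
    rm -rf "$BACKUP_DIR/$old"
done

ls -1 "$BACKUP_DIR"    # only the three newest directories remain
```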

See the cronjob page for how to set up the tool as a cronjob.

How to use

dfosBackup -v|--version     gives the version number
dfosBackup -h|--help        gives a short help text

dfosBackup        calls the backup from the command line

Recommended call:
00 08 * * 1 dfosCron -t dfosBackup
If this line is defined using crontab -e, the backup will be refreshed every Monday at 08:00.


The tool comes as part of the utilPack delivery.

Configuration file

Section 1: General parameters
BACKUP_DIR1   /data<nn>/data/backup    1st directory for backups (on raid5)
BACKUP_DIR2   /home/<account>/backup   2nd directory for backups (in home account)
KEEP_NLATEST  10                       keep the N latest backups
PLUGIN_FINAL  pgi_dfosBackup           name of optional plugin file (expected under $DFO_BIN_DIR), called at the end
Section 2: Definition of backup items
This list should not be modified without good reason; if you extend it, make sure not to include overly large items.
DIRECTORY    $DFO_LST_DIR                  path to the files or directories to be backed up
FILE_DIR     e.g. NR_* or monitor          string identifying file(s) or directory names under DIRECTORY; unix wildcards can be used (e.g. list*.txt)
BACKUP_TAR   e.g. LISTINGS, CONFIG etc.    root name of the tar file; use only the reserved names LISTINGS, CONFIG, LOGS, STATISTICS, TOOLS, SPECIAL; others won't get backed up. SPECIAL is reserved for any user-specific data.
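Put together, a configuration might look like the fragment below. All values are illustrative (the placeholders <nn> and <account> stand for your own disk number and account), and the exact layout of the shipped configuration file may differ:

```
# Section 1: General parameters
BACKUP_DIR1    /data<nn>/data/backup
BACKUP_DIR2    /home/<account>/backup
KEEP_NLATEST   10
PLUGIN_FINAL   pgi_dfosBackup

# Section 2: Definition of backup items
DIRECTORY      $DFO_LST_DIR
FILE_DIR       list*.txt
BACKUP_TAR     LISTINGS
```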

Workflow description

1. find and create the required directories; the backups go, compressed, to $BACKUP_DIR<n>/$DATE/LISTINGS_<date>.tar.Z etc.

2. loop over configured items

2.1 per item, call 'tar -rf ...'
2.2 compress the tar file
2.3 give some statistics

3. Remove outdated backups (in excess of $KEEP_NLATEST)

4. Call configured plugin
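The steps above can be sketched in miniature on throw-away data. The directory and tar-file naming follows the conventions on this page; the demo item, its files, and the use of gzip (as a widely available stand-in for the 'compress' that produces the .tar.Z shown above) are assumptions:

```shell
#!/bin/sh
# Miniature sketch of the dfosBackup workflow on demo data; illustrative only.
WORK=$(mktemp -d)
DATE=$(date +%Y-%m-%d)
BACKUP_DIR1="$WORK/backup"

# step 1: create the dated backup directory
mkdir -p "$BACKUP_DIR1/$DATE"

# demo data standing in for one configured item (DIRECTORY + FILE_DIR)
mkdir -p "$WORK/lst"
echo demo > "$WORK/lst/list_demo.txt"

# step 2.1: per item, append the matching files to the item's tar file
TAR="$BACKUP_DIR1/$DATE/LISTINGS_$DATE.tar"
( cd "$WORK/lst" && tar -rf "$TAR" list*.txt )

# step 2.2: compress the tar file (gzip here; the tool's .tar.Z implies 'compress')
gzip "$TAR"

# step 2.3: give some statistics
ls -l "$BACKUP_DIR1/$DATE"
```

Steps 3 and 4 (rotation and the plugin call) would follow after the loop over all configured items.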