

Job {
  Type = Admin
  Run Script {
    Runs When = Before
    Runs on Client = No
    Command = "/opt/bacula/scripts/my-script.pl"
  }
  Schedule = AdminSchedule
  ...
}
When backing up data at the incremental or differential level, Bacula does not, by default, do anything about removed or moved files or directories. This means that the result of a restore could differ from the latest state of the machine. For that reason, we advise using “Accurate” mode, which is enabled by the directive of the same name. When it is set to “yes” in a Job, Bacula will record deleted files and directories, and, depending on additional configuration, will also consider more criteria than just timestamps to determine whether a file needs to be backed up. In this case, Bacula will restore the machine to the exact state (from a backup content point of view) it was in at the time of the backup.
Administrators should be aware that “Accurate” mode takes additional resources and time when running backups.
When using Virtual Full backups, you can limit this performance impact by enabling the “Accurate” directive only for the last incremental before the Virtual Full itself. Activating it for every incremental backup would give even more faithful results, but would increase the backup time.
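For reference, enabling Accurate mode is a single directive in the Job resource; a minimal sketch (the job name here is hypothetical):
Job {
  Name = "j-accurate"    # hypothetical name, for illustration only
  JobDefs = "DefaultJob"
  Accurate = yes         # record deleted files, compare more than timestamps
}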
#
# A usual job definition
Job {
  Name = "j-bacula"
  JobDefs = "DefaultJob"
  FileSet = "BaculaFileSet"
  Client = client-fd
  Schedule = s-Data2Disk
  Max Full Interval = 19 days
  Run Script {
    Runs When = after
    Runs on Client = no
    Runs on Failure = yes
    Command = "/opt/bacula/scripts/run_copyjob.pl %l %i %b t-rdx j-copy-full"
    # Launching the copy job as soon as the backup is done
    # %l is the job level
    # %i is the job id
    # %b is the job bytes
    # j-copy-full is the name of the copy job (see below)
    # The script "run_copyjob.pl" issues a shell command like
    #   bconsole -c bconsole.conf << END_OF_DATA
    #   run jobid=%i job=j-copy-full storage=t-rdx yes
    #   quit
    #   END_OF_DATA
  }
}
And here is the “Copy” job used to copy the backup jobs from disk to RDX:
Job {
  Name = "j-copy-full"
  Type = Copy
  Level = Full
  Client = client-fd
  FileSet = "Empty Set"
  Messages = Standard
  Pool = PoolVFull
  Maximum Concurrent Jobs = 1
}
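Note that a Copy job reads from its source pool and writes to the pool designated by that pool's “Next Pool” directive. A minimal sketch of how PoolVFull could designate an RDX pool (the “f-disk” and “p-rdx” names are assumptions for illustration):
Pool {
  Name = PoolVFull
  Pool Type = Backup
  Storage = f-disk      # hypothetical disk storage resource
  Next Pool = p-rdx     # hypothetical pool on the t-rdx storage,
                        # used as the destination of the Copy job
}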
And the Schedule used to run the different levels. Note that, per the advice above, only the last incremental before the Virtual Full runs with accurate=yes:
Schedule {
  Name = s-Data2Disk
  Run = Level=incremental monday-thursday,saturday at 21:00
  Run = Level=incremental accurate=yes friday at 12:30
  Run = Level=VirtualFull priority=15 friday at 12:35
}
In such a situation, you will want to configure your clients so that they do not trigger the pruning algorithm automatically, using the “AutoPrune” directive as follows:
Client {
  Name = client-fd
  Address = bacula.example.com
  FDPort = 9102
  Catalog = Catalog
  Password = "do-you-want-a-very-strong-password?"
  File Retention = 15 days
  Job Retention = 50 days
  AutoPrune = no
  # No automatic pruning at the end of a job for this client
}
If you do nothing more, your Catalog will grow indefinitely. To keep it at a manageable size, you should define an “Admin” job like the following:
Job {
  Name = "admin-manual-pruning"
  Type = Admin
  JobDefs = "DefaultJob"
  RunScript {
    Runs When = Before
    # the command below relies on bconsole being in the PATH!
    Command = "/bin/sh -c \"echo prune expired volume yes | bconsole\""
    Runs On Client = no
  }
  Schedule = s-Prune
}
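The “s-Prune” Schedule referenced above is not shown; a minimal sketch, assuming a weekly run outside the backup window:
Schedule {
  Name = s-Prune
  Run = sunday at 08:00   # assumed timing: weekly, outside the backup window
}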
Because a Bacula volume can contain one or more jobs (or parts of jobs), and a job contains one or more files, the pruning process has side effects: