Backing up data is critical, yet often neglected. Let's learn some effective backup strategies.
Strategi Backup
3-2-1 Rule
3 - Keep 3 copies of your data
2 - On 2 different types of media
1 - Keep 1 copy offsite/in the cloud

Example:
- Original on your laptop
- External hard drive
- Cloud storage
Jenis Backup
Full Backup:
- Backs up all data
- Slow, uses a lot of storage
- Fast, straightforward restore

Incremental Backup:
- Backs up changes since the last backup (of any type)
- Fast, uses little storage
- Restore requires the full backup plus every incremental

Differential Backup:
- Backs up changes since the last full backup
- Medium time, medium storage
- Restore requires the full backup plus the latest differential
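To make the full-vs-incremental difference concrete, here is a small demo using GNU tar's `--listed-incremental` snapshot file; it is a sketch that works in a throwaway temp directory, not a production script:

```shell
# Demo: full vs incremental backup with a GNU tar snapshot file.
work=$(mktemp -d)
mkdir "$work/data"
echo one > "$work/data/a.txt"

# Full backup: tar records the state of every file in state.snar.
tar -C "$work" --listed-incremental="$work/state.snar" \
    -czf "$work/full.tar.gz" data

# Add a file, then back up only what changed since the snapshot.
echo two > "$work/data/b.txt"
tar -C "$work" --listed-incremental="$work/state.snar" \
    -czf "$work/inc1.tar.gz" data

# The incremental archive contains b.txt but not the unchanged a.txt.
tar -tzf "$work/inc1.tar.gz"
```

A restore unpacks full.tar.gz first, then each incremental in order, which is exactly the "full + all incrementals" rule above.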
Backup on Linux
Rsync (Recommended)
# Basic rsync
rsync -av source/ destination/
rsync -av --progress source/ destination/
# Delete files that no longer exist in source
rsync -av --delete source/ destination/

# Exclude files/folders
rsync -av --exclude='*.tmp' --exclude='node_modules' source/ destination/

# To a remote server
rsync -avz -e ssh source/ user@server:/path/destination/

# Dry run (test without executing)
rsync -av --dry-run source/ destination/
Rsync Backup Script
#!/bin/bash
# backup.sh

# Variables
SOURCE="/home/user/Documents"
DEST="/mnt/backup/Documents"
DATE=$(date +%Y%m%d_%H%M%S)
LOG="/var/log/backup_$DATE.log"

# Create backup
echo "Starting backup at $(date)" | tee -a $LOG
rsync -av --delete --exclude='*.tmp' --exclude='.cache' \
  $SOURCE/ $DEST/ 2>&1 | tee -a $LOG

# Check rsync's exit status (not tee's, hence PIPESTATUS)
if [ ${PIPESTATUS[0]} -eq 0 ]; then
  echo "Backup completed successfully at $(date)" | tee -a $LOG
else
  echo "Backup failed at $(date)" | tee -a $LOG
  # Send notification (optional)
  # mail -s "Backup Failed" [email protected] < $LOG
fi
Tar Backup
# Create tar archive
tar -cvf backup.tar /path/to/folder

# Create compressed tar.gz
tar -czvf backup.tar.gz /path/to/folder

# Create with date
tar -czvf backup_$(date +%Y%m%d).tar.gz /home/user/Documents

# Extract
tar -xzvf backup.tar.gz

# Extract to a specific directory
tar -xzvf backup.tar.gz -C /path/to/destination

# List contents
tar -tzvf backup.tar.gz
Tar Backup Script
#!/bin/bash
# tar-backup.sh

SOURCE="/home/user/Documents"
DEST="/mnt/backup"
DATE=$(date +%Y%m%d)
FILENAME="backup_$DATE.tar.gz"

# Create backup
tar -czvf $DEST/$FILENAME $SOURCE

# Keep only the last 7 days
find $DEST -name "backup_*.tar.gz" -mtime +7 -delete

echo "Backup created: $FILENAME"
Database Backup
MySQL/MariaDB
# Single database
mysqldump -u root -p database_name > backup.sql

# All databases
mysqldump -u root -p --all-databases > all_backup.sql

# With compression
mysqldump -u root -p database_name | gzip > backup.sql.gz

# Restore
mysql -u root -p database_name < backup.sql

# Restore from gzip
gunzip < backup.sql.gz | mysql -u root -p database_name
MySQL Backup Script
#!/bin/bash
# mysql-backup.sh

DB_USER="backup_user"
DB_PASS="password"
BACKUP_DIR="/mnt/backup/mysql"
DATE=$(date +%Y%m%d_%H%M%S)

# Get all databases
DATABASES=$(mysql -u $DB_USER -p$DB_PASS -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema|performance_schema)")

# Backup each database
for DB in $DATABASES; do
  echo "Backing up $DB..."
  mysqldump -u $DB_USER -p$DB_PASS --single-transaction $DB | gzip > $BACKUP_DIR/${DB}_${DATE}.sql.gz
done

# Remove backups older than 7 days
find $BACKUP_DIR -name "*.sql.gz" -mtime +7 -delete

echo "Backup completed"
PostgreSQL
# Single database
pg_dump database_name > backup.sql
pg_dump -U username database_name > backup.sql

# All databases
pg_dumpall > all_backup.sql

# Custom format (compressed)
pg_dump -Fc database_name > backup.dump

# Restore SQL
psql -U username database_name < backup.sql

# Restore custom format
pg_restore -U username -d database_name backup.dump
MongoDB
# Backup all databases
mongodump --out /backup/mongodb/

# Backup a specific database
mongodump --db database_name --out /backup/mongodb/

# With authentication
mongodump --uri="mongodb://user:pass@localhost:27017/database" --out /backup/

# Restore
mongorestore /backup/mongodb/

# Restore a specific database
mongorestore --db database_name /backup/mongodb/database_name/
Cloud Backup
Rclone Setup
# Install rclone
curl https://rclone.org/install.sh | sudo bash

# Configure
rclone config

# Add a remote (Google Drive example)
n) New remote
name> gdrive
Storage> drive
# Follow the OAuth prompts

# List remotes
rclone listremotes
Rclone Commands
# List files
rclone ls gdrive:/backup

# Copy to cloud
rclone copy /local/path gdrive:/backup/path

# Sync (mirror)
rclone sync /local/path gdrive:/backup/path

# With progress
rclone copy /local/path gdrive:/backup/path --progress

# Backup script
#!/bin/bash
BACKUP_DIR="/home/user/Documents"
REMOTE="gdrive:/Backups/Documents"
rclone sync $BACKUP_DIR $REMOTE --progress --log-file=/var/log/rclone.log
Google Drive with grive2
# Install
sudo apt install grive2

# First-time setup (in the folder you want to sync)
cd ~/GoogleDrive
grive -a

# Sync
grive

# Selective sync
grive --dir Documents
Automated Backup with Cron
Setup Cron Jobs
# Edit crontab
crontab -e

# Daily backup at 2 AM
0 2 * * * /home/user/scripts/backup.sh

# Weekly full backup (Sunday 3 AM)
0 3 * * 0 /home/user/scripts/full-backup.sh

# Monthly backup (1st day, 4 AM)
0 4 1 * * /home/user/scripts/monthly-backup.sh

# Database backup every 6 hours
0 */6 * * * /home/user/scripts/db-backup.sh
Cron Format
* * * * * command
│ │ │ │ │
│ │ │ │ └── Day of week (0-7, Sunday = 0 or 7)
│ │ │ └──── Month (1-12)
│ │ └────── Day of month (1-31)
│ └──────── Hour (0-23)
└────────── Minute (0-59)

Backup Tools GUI
Timeshift (System Backup)
# Install
sudo apt install timeshift

# Create snapshot
sudo timeshift --create --comments "Before upgrade"

# List snapshots
sudo timeshift --list

# Restore
sudo timeshift --restore
Deja Dup (GNOME)
# Install
sudo apt install deja-dup

Features:
- Simple GUI
- Scheduled backups
- Encryption
- Cloud support
Restic (Modern Backup)
# Install
sudo apt install restic

# Initialize repository
restic init --repo /mnt/backup/restic-repo

# Create backup
restic -r /mnt/backup/restic-repo backup /home/user/Documents

# List snapshots
restic -r /mnt/backup/restic-repo snapshots

# Restore
restic -r /mnt/backup/restic-repo restore latest --target /restore/path

# Forget old snapshots
restic -r /mnt/backup/restic-repo forget --keep-daily 7 --keep-weekly 4 --keep-monthly 6
Verify Backup
Test Restore
# ALWAYS test your backups!

# Test tar archive
tar -tzf backup.tar.gz > /dev/null && echo "Archive OK" || echo "Archive corrupted"

# Test SQL backup
mysql -u root -p test_restore_db < backup.sql
# Verify the data, then drop the test database

# Test rsync backup
rsync -avnc source/ destination/  # -n = dry run, -c = checksum
Checksum Verification
# Create checksums
md5sum backup.tar.gz > backup.tar.gz.md5
sha256sum backup.tar.gz > backup.tar.gz.sha256

# Verify checksums
md5sum -c backup.tar.gz.md5
sha256sum -c backup.tar.gz.sha256
Restore Best Practices
Before Restore
1. Verify backup integrity
2. Identify what needs to be restored
3. Stop the affected services
4. Create a backup of the current state
5. Plan a rollback strategy

Restore Checklist
# 1. Stop services
sudo systemctl stop nginx
sudo systemctl stop mysql

# 2. Back up the current state
tar -czvf pre-restore-backup.tar.gz /path/to/current

# 3. Restore
tar -xzvf backup.tar.gz -C /path/to/destination

# 4. Fix permissions if needed
chown -R www-data:www-data /var/www/html

# 5. Start services
sudo systemctl start mysql
sudo systemctl start nginx

# 6. Verify
curl localhost        # Test website
mysql -e "SELECT 1"   # Test database
Backup Checklist
What to Backup
Critical:
☐ Documents and files
☐ Database data
☐ Configuration files
☐ SSH keys
☐ Certificates/SSL
☐ Application data
☐ Email data

System:
☐ /etc (configurations)
☐ /home (user data)
☐ /var/www (web files)
☐ /var/lib (app data)

Development:
☐ Source code (also use Git!)
☐ Environment files
☐ Database dumps
☐ Docker volumes
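A small script can automate checking that these backups actually exist and are fresh; `check_backup_fresh` below is a hypothetical sketch, not a standard tool:

```shell
# Hypothetical check: succeed only if a non-empty backup_*.tar.gz
# modified within the last 2 days exists in the given directory.
check_backup_fresh() {
    dir=$1
    latest=$(find "$dir" -name 'backup_*.tar.gz' -mtime -2 -size +0c | head -n 1)
    [ -n "$latest" ]
}
```

Usage: `check_backup_fresh /mnt/backup || echo "Backup is stale or missing!"`.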
Weekly Review
☐ Check backup logs
☐ Verify backup file sizes
☐ Test the restore procedure
☐ Review backup retention
☐ Check storage space
☐ Update backup scripts if needed

Conclusion
Backups are an investment of time that will save you when disaster strikes. Follow the 3-2-1 rule and ALWAYS test your restores regularly.
Written by
Hendra Wijaya