I've created a pair of fairly straightforward shell scripts to do backups via rsync. Every night the first rsyncs a copy of a website (approx. 60 GB) to the local machine. Once a week, after the rsync step, the second zips the local copy of the whole site into 2 GB chunks and places them in another folder.
When running 'unzip filename.zip' I get a long string of errors similar to this:
file #17934: bad zipfile offset (lseek): 1726201856
file #17935: bad zipfile offset (lseek): 1726775296
file #17936: bad zipfile offset (local header sig): 1702568762
file #17937: bad zipfile offset (EOF): 1545311941
Approximately 60% of an archived series of 2 GB chunks unzips without any problem, but the rest throw the errors above. Those errors seem to occur on files that were already .zip files on the webserver, and which were originally created with a mix of WinRAR on Windows and the Windows command-line 'zip'. (I don't see how the original method of creation would affect how they're zipped into a larger archive, but I'm mentioning it to be thorough.)
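For reference, I check each weekly set with something like the loop below (the folder path is illustrative; the real one differs):

# Illustrative only: run unzip's built-in integrity test over every final chunk
for f in /mnt/data/Backups/zips/website_*.zip; do
    unzip -t "$f" || echo "FAILED: $f"
done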
ALSO: If I FTP the entire Linux-created series of 2 GB chunks to a Windows machine and open them in WinRAR, they all extract without any problem.
Here's the crontab I'm using:
0 2 * * 2-7 /home/user/Scripts/backup.rsync.sh > /mnt/data/Backups/logs/$(date +\%Y\%m\%d_\%H\%M\%S)_website.rsync.cron.log 2>&1
0 2 * * 1 /home/user/Scripts/backup.full.sh > /mnt/data/Backups/logs/$(date +\%Y\%m\%d_\%H\%M\%S)_website.full.cron.log 2>&1
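(The \% escapes are needed because cron treats an unescaped % as a newline and passes everything after it to the command's stdin. Run by hand, the same redirect would be written without the backslashes:)

/home/user/Scripts/backup.full.sh > /mnt/data/Backups/logs/$(date +%Y%m%d_%H%M%S)_website.full.cron.log 2>&1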
And here are the two scripts referenced in that crontab, with a lot of additional commands removed for clarity:
backup.rsync.sh
# Rsync website.com to local machine
/usr/bin/rsync -Pavzhe "ssh -p 22" --log-file=$dirlog/$date\_website.rsync.log --bwlimit=2500 --skip-compress=jpg/zip --delete user@website.com:'/home/user/' "/mnt/data/Backups/website.com"
backup.full.sh
# Rsync website.com to local machine
/usr/bin/rsync -Pavzhe "ssh -p 22" --log-file=$dirlog/$date\_website.rsync.log --bwlimit=2500 --skip-compress=jpg/zip --delete user@website.com:'/home/user/' "/mnt/data/Backups/website.com"
# Zip site to 2G chunks
/usr/bin/zip -r -s 2g -y -2 $dirzip/website_$date.zip $dircom/
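For clarity, 'zip -s' names the split pieces .z01, .z02, and so on, with the final piece keeping the plain .zip extension, and it's that final .zip I point 'unzip' at (the date stamp here is just an example):

website_20140317.z01
website_20140317.z02
...
website_20140317.zip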
Something in that final zip command seems to be the cause, but I can't figure out what. The fact that I can take the resulting files to Windows and extract them with WinRAR is even more confusing, since Linux 'unzip' throws thousands of errors... Can anyone suggest what's wrong with my backup script?
PS: The 'zip' version is 3.0-6 and 'unzip' is 6.0-8+deb7u2. The local machine runs CrunchBang 11.