OK, here's the problem. I need to copy about 200 GB from an old FreeBSD (5.2.1) server to a new Debian server. The old server's fastest port is its Ethernet port. I set up an NFS server on the new machine so I can copy the files over using regular commands and scripts. Thing is, I probably shouldn't copy dozens of GB of files with a single 'cp' command; the last time I tried that it crashed the server.
My plan is to write a script that processes a list of files, copying each file from the list one at a time, while logging every successfully copied file to a separate list. That way, if anything goes wrong, I can diff the two lists and pick up where it left off. More importantly, I can copy everything ahead of time and then generate a smaller list of files modified since that first copy, so that on migration day I only have a minimal number of files to transfer, minimizing downtime.
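The copy-and-log loop could look something like the sketch below. The paths and file names here are made-up placeholders (in practice the destination would be the NFS mount), and it uses a small temp-dir demo so it runs anywhere:

```shell
#!/bin/sh
# Sketch of the per-file copy loop. SRC/DST are demo temp dirs here;
# on the real run SRC would be the local disk and DST the NFS mount.
SRC=$(mktemp -d)
DST=$(mktemp -d)
printf 'hello\n' > "$SRC/a.txt"
printf 'world\n' > "$SRC/b.txt"
printf 'a.txt\nb.txt\n' > filelist.txt   # one relative path per line
: > copied.log                           # list of files copied successfully

while IFS= read -r f; do
    # cp -p preserves mode, ownership, and timestamps (ownership needs root)
    if cp -p "$SRC/$f" "$DST/$f"; then
        printf '%s\n' "$f" >> copied.log     # record success
    else
        printf 'FAILED: %s\n' "$f" >&2       # leave for the retry pass
    fi
done < filelist.txt

# Files still to copy = full list minus the success log:
sort filelist.txt > want.sorted
sort copied.log   > have.sorted
comm -23 want.sorted have.sorted > remaining.txt
```

For the migration-day delta, one option is to `touch` a timestamp file when the first big copy starts, then later run something like `find /data -type f -newer /tmp/copy.stamp` to generate the list of files modified since.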
The main problem I can see with this is that any list I generate will contain both files and directories, but the directories don't need to be copied, only recreated with identical owners and permissions. Unless there's a flag for 'cp' that copies a directory without its contents (none that I know of), I need some other way to test each path to see what it is, and then, if it's a directory, find its permissions and owner and duplicate them on the new server.
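The test-and-recreate step might look like this. Note the hedges: `replicate` and the demo paths are made up, and the `stat -c` format strings are GNU coreutils syntax (Debian side); FreeBSD's stat(1) takes `-f` format strings instead, so the exact invocation depends on which end the script runs from. Processing the list in order (parents before children) lets each directory be created with `mkdir -p`:

```shell
#!/bin/sh
# Sketch: recreate a directory on the destination, without its contents,
# preserving mode and ownership. Demo temp dirs stand in for the real trees.
SRC=$(mktemp -d)
DST=$(mktemp -d)
mkdir -p "$SRC/etc/rc.d"
chmod 750 "$SRC/etc/rc.d"

replicate() {
    p=$1
    if [ -d "$SRC/$p" ]; then
        mode=$(stat -c '%a' "$SRC/$p")       # octal permission bits
        owner=$(stat -c '%u:%g' "$SRC/$p")   # numeric uid:gid
        mkdir -p "$DST/$p"
        chmod "$mode" "$DST/$p"
        chown "$owner" "$DST/$p"             # needs root to change ownership
    fi
}

replicate etc/rc.d
stat -c '%a' "$DST/etc/rc.d"    # prints 750
```

Using numeric uid:gid rather than names sidesteps the case where the two servers' passwd files don't agree on user names.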
If anyone has any advice, can see any other problems or knows of a better way I might do this, I'd appreciate the help. Thanks!