I do a ufsdump backup of a large filesystem (~8GB) onto local
disk before doing a tape backup of the data at a later time.
I've occasionally had problems restoring from the dump file,
especially after it has been compressed and then uncompressed. As
ufsdump is a rather archaic utility, I'm wondering whether it's a
good choice for such a large chunk of data. I believe it used to
have a 2GB limit.
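For reference, my workflow is roughly along these lines (the
filesystem and dump paths here are placeholders, not my actual
ones), including the integrity checks I run before trusting the
compressed file:

```shell
# Level-0 dump of the filesystem to stdout, compressed on the fly
# to a local file. /export/home and /var/dumps are placeholders.
ufsdump 0f - /export/home | gzip -c > /var/dumps/home.dump.gz

# Check the compressed file for corruption before relying on it.
gzip -t /var/dumps/home.dump.gz && echo "gzip integrity OK"

# Spot-check that ufsrestore can actually read the archive by
# listing its table of contents (discard the listing itself).
gzip -dc /var/dumps/home.dump.gz | ufsrestore tf - > /dev/null
```

It's after a cycle like this that the occasional restore failure
shows up.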
Any thoughts out there? Would tar, gnutar or cpio be more
robust? I'm running on Solaris 8.
Many thanks for any assistance.