SUMMARY: Backing up millions of files
sburch@derwent.co.uk
Thu, 25 Sep 1997 16:26:00 +0100
A bit overdue, this one... my original query:
I have a requirement to back up a large number of smallish files,
approximately 2 million averaging 5K in size (roughly 10GB worth).
I don't want to back up the whole partition using ufsdump, and tar takes
too long (about a day). What would be the quickest method of doing
this? I presume some sort of block dump as opposed to a file-oriented one.
A number of people suggested incremental backups and staging-area
scenarios; what I really wanted was a means of backing the whole lot up
to one tape for simple storage and retrieval. The answer was simple:
use ufsdump on the area in question, e.g.
ufsdump 0f /dev/rmt/0 /data/images
I should have tried this myself, but I had always been under the
impression that ufsdump worked on whole partitions rather than
individual directories.
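
For the retrieval side (none of the replies covered this, so treat the
exact invocations below as a sketch rather than tested commands), the
companion ufsrestore utility reads the dump back from tape:

# list the table of contents of the dump
ufsrestore tf /dev/rmt/0

# restore the entire dump into the current directory
cd /data/images && ufsrestore rf /dev/rmt/0

# or browse the dump and extract selected files interactively
ufsrestore if /dev/rmt/0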
Many Thanks to all those who replied
**************************************************************************
*                                                                        *
*  Stuart Burch                         Derwent Information Publishing   *
*  (Unix & Internet Support Analyst)    14 Great Queens Street           *
*                                       London                           *
*  Systems & Database Group, IT         WC2B 5DF.                        *
*                                                                        *
*  Email: sburch@derwent.co.uk          Tel: 0171-424 2149               *
*                                                                        *
**************************************************************************