A difficult filesystem

Andy Armstrong andy at hexten.net
Wed Jun 20 13:30:05 BST 2007

I have an rsync-based backup regime that results in a filesystem
containing lots of inodes, many of which have as many as fifty hard
links pointing to them. I'm not sure how many files there are in
total - the little script I have analysing it has been running all
night and is up to 26,000,000 plus. debugfs tells me there are a
shade over 6,000,000 inodes.

I want to migrate the whole (ext3) filesystem onto another, larger device.

Any of the normal hard-link-preserving copying methods runs out of
memory pretty early - for obvious reasons: to re-create the links they
have to keep a record of every inode they've already seen.
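
A rough sketch of that bookkeeping, just to illustrate the problem
(Perl, hypothetical source root, untested):

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

my $src = '/mnt/backup';    # hypothetical source root

# Map "dev:ino" -> first path seen. This is the structure that grows
# without bound and eats the memory on a tree this size.
my %seen;
my $extra_links = 0;

find(
    {
        no_chdir => 1,
        wanted   => sub {
            my @st = lstat $_ or return;
            return unless -f _;            # only regular files get hard-linked here
            my $key = "$st[0]:$st[1]";     # dev:ino
            if ( exists $seen{$key} ) {
                $extra_links++;            # would become a link, not a copy
            }
            else {
                $seen{$key} = $_;
            }
        },
    },
    $src
);

printf "%d distinct inodes, %d extra hard links\n",
    scalar keys %seen, $extra_links;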

So I copied the whole filesystem (dd if=x of=y) onto the new device  
with a view to growing it in place. Unfortunately parted doesn't like
it either - "You found a bug in GNU Parted".

While I work up a bug report for parted I've got a little Perl prog  
crawling the whole filesystem and building a SQLite DB with two  
tables - one for dir entries and one for inodes. When that finishes
(sometime next week at the current rate of progress) I'll have a 30G  
or so database with more than 100 million rows.
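
The crawler is roughly along these lines - a simplified, untested
sketch; the table and column names are only illustrative, and it
assumes DBI and DBD::SQLite:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use File::Find;

my $src = '/mnt/backup';    # hypothetical source root

my $dbh = DBI->connect( 'dbi:SQLite:dbname=fsmap.db', '', '',
    { RaiseError => 1, AutoCommit => 0 } );

$dbh->do( 'CREATE TABLE IF NOT EXISTS inode'
        . ' (ino INTEGER PRIMARY KEY, mode INTEGER, uid INTEGER,'
        . '  gid INTEGER, size INTEGER, mtime INTEGER, nlink INTEGER)' );
$dbh->do( 'CREATE TABLE IF NOT EXISTS dirent'
        . ' (path TEXT PRIMARY KEY, ino INTEGER)' );

my $ins_inode  = $dbh->prepare('INSERT OR IGNORE INTO inode VALUES (?,?,?,?,?,?,?)');
my $ins_dirent = $dbh->prepare('INSERT OR IGNORE INTO dirent VALUES (?,?)');

my $n = 0;
find(
    {
        no_chdir => 1,
        wanted   => sub {
            my @st = lstat $_ or return;
            # stat fields in table order: ino, mode, uid, gid, size, mtime, nlink
            $ins_inode->execute( @st[ 1, 2, 4, 5, 7, 9, 3 ] );
            $ins_dirent->execute( $_, $st[1] );
            $dbh->commit unless ++$n % 10_000;    # batch the commits
        },
    },
    $src
);

$dbh->commit;
$dbh->disconnect;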

Then I should be in a position to copy the FS preserving hard links.  
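The replay stage should then just be a walk of the dirent table
ordered by inode: copy the file the first time an inode turns up, and
hard link every later path for that inode to the first copy. Something
like the sketch below - same illustrative schema, hypothetical
destination root, and it ignores ownership, permissions, mtimes,
directories and symlinks, which would all need handling too:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use Fcntl ':mode';
use File::Basename qw(dirname);
use File::Copy qw(copy);
use File::Path qw(mkpath);

my $dst = '/mnt/new';    # hypothetical destination root

my $dbh = DBI->connect( 'dbi:SQLite:dbname=fsmap.db', '', '',
    { RaiseError => 1 } );

my $sth = $dbh->prepare(
    'SELECT d.ino, d.path, i.mode FROM dirent d'
        . ' JOIN inode i ON i.ino = d.ino ORDER BY d.ino' );
$sth->execute;

my ( $last_ino, $first_copy );
while ( my ( $ino, $path, $mode ) = $sth->fetchrow_array ) {
    next unless S_ISREG($mode);    # dirs, symlinks etc. handled separately
    ( my $rel = $path ) =~ s{^/mnt/backup}{};    # strip the source prefix stored by the crawler
    my $out = $dst . $rel;
    mkpath( dirname($out) );
    if ( defined $last_ino && $ino == $last_ino ) {
        # Same inode as the previous row: re-create the hard link.
        link $first_copy, $out or die "link $first_copy -> $out: $!";
    }
    else {
        copy( $path, $out ) or die "copy $path -> $out: $!";
        ( $last_ino, $first_copy ) = ( $ino, $out );
    }
}

$dbh->disconnect;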

While I'm waiting, does anyone have tips for other tools that might be worth a try?

Andy Armstrong, hexten.net
