Linux backup stuff

31 07 2011

Booted my PC up with a Linux CD yesterday afternoon to do a ‘dd’ image of the hard disk so that I have a checkpoint in time to image my machine back to.  Only thing is, it was still only halfway through making the 300GB image by 9am this morning after running all night.  😦

The command I issued was “dd if=/dev/sda of=/media/Iomega\ HDD/laptop-image-30th-july-2011.img” – which by default reads and writes just 512 bytes at a time.

Instead, I stuck the options “bs=100M conv=notrunc” on the end of it and the process has sped up massively.  The bs=100M part tells dd to copy 100MB blocks instead (conv=notrunc simply stops dd truncating an existing output file before it writes).  After about 20 minutes, I’m already 10% of the way through making the image.
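Here’s a minimal way to see the block-size effect for yourself – this sketch uses a small scratch file in /tmp rather than a real disk, so it’s safe to try (the file names are just made up for the demo):

```shell
# Create a 100MB scratch file to copy.
dd if=/dev/zero of=/tmp/bs-test.src bs=1M count=100 2>/dev/null

# Tiny blocks (dd's default of 512 bytes): one syscall pair per 512 bytes.
time dd if=/tmp/bs-test.src of=/tmp/bs-test.small bs=512 2>/dev/null

# Big blocks: far fewer read/write syscalls, usually much faster on real disks.
time dd if=/tmp/bs-test.src of=/tmp/bs-test.big bs=100M 2>/dev/null

# Clean up.
rm -f /tmp/bs-test.src /tmp/bs-test.small /tmp/bs-test.big
```

The copies are byte-for-byte identical either way; only the number of syscalls (and hence the speed) changes.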

Another helpful thing I discovered this morning is the -h switch on the ls command. -h makes the output ‘human-readable’ – i.e. it uses K, M and G to indicate kilobytes, megabytes and gigabytes. See below:

ubuntu@ubuntu:/media/Iomega HDD$ ls -lh
total 29G
drwx------ 1 ubuntu ubuntu    0 2011-03-26 21:03 Old drive images
drwx------ 1 ubuntu ubuntu 4.0K 2011-02-20 15:15 Study resources
-rw------- 1 ubuntu ubuntu  29G 2011-07-31 11:37 laptop-image-30th-jul-2011.img
drwx------ 1 ubuntu ubuntu    0 2008-06-11 14:19 System Volume Information
ubuntu@ubuntu:/media/Iomega HDD$ 
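The same -h flag turns up on df and du too, which makes them much easier to read at a glance:

```shell
# Filesystem usage for the root filesystem, in K/M/G units.
df -h /

# Total size of the home directory, summarised and human-readable.
du -sh ~
```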

All this is probably no news to some people, but I’ve been using Linux for about 10 years and didn’t know all this stuff for some reason.  You learn something new every day…



2 responses

15 10 2011
Matt Johnson

Hello from a fellow UK Networking / Linux guy (One that also doesn’t smell and has a respectable Christmas card list.. Promise!)

Just a quick thought on this, I tackled a data copy (2TB disk with only 1TB used) in the following way with dd:

– Fill all free space with a file made of zeros
– Delete the file
– Pipe dd if= etc through a compressor such as gzip, which will then squash the free space down to nothingness due to your disk’s recent encounter with a massive file full of zeros.

– Does mean if you ever need the image you have to decompress it again and it does use more CPU, but images are much smaller and you don’t back up free space block for block.
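The steps above might look something like this – a sketch only, with the mount point /mnt/target and the output file name invented for illustration (and note the dd lines target a raw disk, so don’t run them as-is):

```shell
# 1. Fill the free space on the mounted filesystem with zeros, then delete
#    the filler file.  dd exits with "No space left on device" - expected.
dd if=/dev/zero of=/mnt/target/zero.fill bs=1M
sync
rm /mnt/target/zero.fill

# 2. Image the raw device, compressing the stream as it goes.  The zeroed
#    free space compresses to almost nothing.
dd if=/dev/sda bs=100M | gzip > "/media/Iomega HDD/laptop-image.img.gz"

# 3. To restore, decompress back through dd.
gunzip -c "/media/Iomega HDD/laptop-image.img.gz" | dd of=/dev/sda bs=100M
```

The trick works because gzip (like any stream compressor) collapses long runs of identical bytes, so the zero-filled free space costs nearly nothing in the image.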

There are alternatives like rsync etc but sometimes just grabbing a DD image feels simplest!


10 11 2011

Nice – didn’t think about that. Will do that next time – thanks!
