I'm starting to use some small Linux apps at work, and instead of buying lots of licenses for 'Ghost' I figured: why not try 'dd'?

I tried to "copy" an 8.2G USB stick and that took 437 seconds.
Ghost 8 takes about 10-12 minutes for a 120G SATA HDD.
I've tried changing the values of bs/ibs/obs without any major changes.
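For reference, the kind of invocation I mean looks roughly like this — shown here against a scratch file in /tmp so it's safe to paste; the real input would be the stick's device node (e.g. /dev/sdb, which is a placeholder):

```shell
# Scratch file standing in for the USB stick (real run: if=/dev/sdb).
dd if=/dev/zero of=/tmp/fake-stick.img bs=1M count=32   # 32 MB stand-in

# A bigger block size (bs=4M) means far fewer read/write syscalls than
# the 512-byte default, which is usually where dd speed is won or lost.
dd if=/tmp/fake-stick.img of=/tmp/stick-copy.img bs=4M

cmp /tmp/fake-stick.img /tmp/stick-copy.img && echo "copies match"
rm -f /tmp/fake-stick.img /tmp/stick-copy.img
```

Past a few megabytes, raising bs further usually stops helping — the disk itself becomes the limit.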

What makes you think you are doing something wrong? That sounds about right. The two apps should be roughly the same speed copying data (assuming the tests are done on the same partition, from the same device, to the same device). The greatest speed differences would come from the differences in bus speed (USB vs SATA), connection, and the data transfer ratings of the drives themselves. When comparing speed, run both tests in both places and compare apples to apples. Something else that could affect the test is whether there is compression involved before the destination. Again, do apples-to-apples tests.

The output is a USB HDD and always the SAME disk.
I didn't gzip the source file from 'dd', but I always compress the data in Ghost. Do you think that would increase the speed sending the data to the USB HDD?
...although it'll take time to gzip it up on the source PC.

If the data is compressed before it is written to the destination, it obviously means less data will be written to the destination, which means the overall process "could" be faster using compression if you are writing to a slow drive. On the other hand, if you are using 100% of your CPU for the compression process, then using compression could actually make things slower. You can use utilities like top, iostat, vmstat, etc., to help you figure out where your bottleneck is (whether you are maxing out your CPU or maxing out your I/O on either your source or destination disk).
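A compress-on-the-fly pipeline would look something like this — all paths are placeholders, and a scratch file stands in for the real partition so it's safe to try:

```shell
# Scratch file standing in for the source partition (real run: if=/dev/sda1).
dd if=/dev/zero of=/tmp/src.img bs=1M count=16

# Compress in the pipe so fewer bytes cross the bus to the slow USB disk;
# gzip -1 is the fastest (lightest) compression level.
dd if=/tmp/src.img bs=4M | gzip -1 > /tmp/src.img.gz

# Restoring just reverses the pipe.
gunzip -c /tmp/src.img.gz | dd of=/tmp/restored.img bs=4M

cmp /tmp/src.img /tmp/restored.img && echo "round trip OK"
rm -f /tmp/src.img /tmp/src.img.gz /tmp/restored.img
```

While the real copy runs, watching `iostat -x 2` (or vmstat/top) in another terminal shows whether the disk or the gzip CPU usage is the bottleneck.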

I gotta try the compression thing too I guess, otherwise it might take longer than using Ghost, and I want them to use 'dd' since it is Linux.
And thanks for the answer; once again I didn't think this through, lol

driving?... and still happens to answer a post... lol
you shouldn't do that, you naughty boy

No need for a Ghost-like one, as long as it is for Linux and it can be driven by command-line arguments, since this is going to be used in a script: the machine auto-boots, and after that the OS's script will take over.

btw... the destination disk just got full and the image was over 30G, but the source disk only has 4.0G in use.
How come this happens? Does it back up the empty space too?
A "ghost" image of the same disk is about 2 to 2.5G.

Yes, dd does a sector-by-sector, byte-for-byte copy of the entire partition and has no knowledge of file systems. So basically the entire partition is data as far as dd is concerned. If you want a file-system-level backup you wouldn't want to use dd; use tar, cpio, etc., instead. Any of those can be piped to gzip for compression on the fly.
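For example — a scratch directory stands in for the real file system here, and all paths are placeholders:

```shell
# Scratch tree standing in for the real root file system.
mkdir -p /tmp/fakeroot/etc
echo "hello" > /tmp/fakeroot/etc/demo.conf

# tar piped through gzip: only actual files get stored, so a disk with
# 4G in use yields roughly a 4G (pre-compression) archive, not 120G.
tar -C /tmp/fakeroot -cf - . | gzip -1 > /tmp/backup.tar.gz

# The cpio equivalent of the same backup:
( cd /tmp/fakeroot && find . | cpio -o ) | gzip -1 > /tmp/backup.cpio.gz

# Restore the tar archive elsewhere and verify a file survived.
mkdir -p /tmp/restore
tar -C /tmp/restore -xzf /tmp/backup.tar.gz
cmp /tmp/fakeroot/etc/demo.conf /tmp/restore/etc/demo.conf && echo "restored OK"

rm -rf /tmp/fakeroot /tmp/restore /tmp/backup.tar.gz /tmp/backup.cpio.gz
```

Note neither tool touches the MBR or boot sector — that's file data only.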

I have no idea what you mean by a 'system level' backup.
I need something that can "ghost" the entire disk (or partition) and later restore it, and if it has a bootable partition, restore that boot thing too.
To me 'tar' is a FILE-copying thing, not a copier of everything like the boot record and such, but I have no clue really.
And the same with cpio, and that one is even more of a mystery as to what it really is.

I've done some fiddling with cpio and an 'initrd' at work; they say I need to use 'cpio', so I do, but I can't see the difference between it and other 'packing' programs.

I said "file system level", not "system level". I wouldn't use Ghost for anything myself, but if you want to ghost something then use Ghost. I mentioned before that there was a Linux utility that worked a little like Ghost, but you said you didn't want to use that. The utility whose name I couldn't remember was "partimage":

We use NetBackup at work to back up everything daily to large disk/tape arrays, but I assume you don't want to spend that sort of money. The advantage of using a regular backup utility is that you can do full backups once a week and incremental backups (only what's changed since the last backup) every day after that. A fairly robust open-source backup system along the lines of NetBackup that I used to use is called "Amanda".

Oh by the way, partimage "can" be scripted (batch mode) and it can back up the data only so I *think* that just may be the tool you are looking for. I would suggest reading through the entire partimage FAQ:
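From memory, the batch invocation looks something like this — flags and paths are from my recollection of the 0.6.x CLI, so double-check against `partimage --help`; the device and image paths are placeholders:

```shell
# -z1 = gzip compression, -b = batch (no interactive UI), -d = skip the
# image-description prompt; "save" writes the used blocks of /dev/sda1.
partimage -z1 -b -d save /dev/sda1 /mnt/usb/sda1.partimg.gz

# Restore later (partimage appends .000 to the first image volume):
partimage -b restore /dev/sda1 /mnt/usb/sda1.partimg.gz.000
```

Because it understands the file system, it skips free space — unlike dd.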

I took a look at it and added that package to the Slax boot USB.
It sure worked; too bad NTFS support is just experimental and might not work.
I also had to use 'dd' to pull out the MBR, since partimage, as the name says, is just for partitions.
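The MBR grab was just something along these lines — shown here against a scratch file instead of the real disk (the real input would be the whole-disk device, e.g. /dev/sda):

```shell
# Scratch file standing in for the whole disk (real run: if=/dev/sda).
dd if=/dev/zero of=/tmp/fake-disk.img bs=1M count=1

# The MBR is the first 512 bytes: 446 bytes of boot code, the 64-byte
# partition table, and a 2-byte boot signature.
dd if=/tmp/fake-disk.img of=/tmp/mbr.img bs=512 count=1

# Putting it back; conv=notrunc keeps dd from truncating the target.
# (Using bs=446 count=1 instead would restore only the boot code and
# leave the partition table alone.)
dd if=/tmp/mbr.img of=/tmp/fake-disk.img bs=512 count=1 conv=notrunc

rm -f /tmp/fake-disk.img /tmp/mbr.img
```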

I used -z1 (gzip), the fastest compression, but taking an image of that 120G disk still took about 45 minutes.

The reason we do not want to use 'Ghost' is its license.
One license covers backing up ONE computer, and where I work there could be hundreds — and they say there is no other licensing option either.