Create large files of random information

The plan was to fill the drive with dummy files; if there was an issue at the 40GB mark, I would know up front, which would have been better than finding out a week later.

To that end, the weak-sauce tip for the day is this: create a series of 2GB files made of random gibberish, just to take up space.

for i in {1..20} ; do time dd if=/dev/urandom of=test-${i}.file bs=268435456 count=8 ; done

The results, after a considerable amount of time (/dev/zero is faster), will be twenty files of 2GB each, filled with gunk. Good gunk, that is.
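If you want to see that speed gap for yourself, here's a small-scale sketch, shrunk to 64MB files so it finishes in seconds (the file names are just placeholders):

```shell
# /dev/zero just hands back zeroed pages, while /dev/urandom has to
# generate every byte, so the second dd takes noticeably longer.
time dd if=/dev/zero    of=zero.file bs=1048576 count=64 2>/dev/null
time dd if=/dev/urandom of=rand.file bs=1048576 count=64 2>/dev/null
# Both files come out the same size; only the contents (and the time) differ.
ls -l zero.file rand.file
```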

A little tip there: The block size multiplied by the count gives you the size of the file. So what?
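As a quick sketch of that arithmetic, here are three invocations that all land on the same 16MB file, since only the product of block size and count decides the final size (file names here are made up):

```shell
# bs * count gives the file size, so all three produce 16777216-byte files:
dd if=/dev/urandom of=a.file bs=16777216 count=1  2>/dev/null  # one 16MB block
dd if=/dev/urandom of=b.file bs=1048576 count=16  2>/dev/null  # sixteen 1MB blocks
dd if=/dev/urandom of=c.file bs=65536   count=256 2>/dev/null  # 256 64KB blocks
ls -l a.file b.file c.file
```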

So simply setting the block size to one gigabyte (or gibibyte, since I seem to be drawing flak on the issue these days🙂 ) might cause memory errors on a machine with 512MB of RAM or less, since dd buffers a full block in memory before writing it. It did for me.

The size of the file in my case wasn’t really important I guess, but I did get that quick primer for performing this stunt on low-memory machines. Reduce the block size, magnify the count, get the same results.
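Sketching that out against the loop above: the same total per file, but dd never holds more than a small block in memory at once. (Scaled down here to four 16MB files so it runs quickly; for the original 2GB files you would use bs=4194304 count=512 and {1..20} instead.)

```shell
# Low-memory variant: a small block size with a bigger count reaches the
# same total size without dd ever allocating a huge buffer.
for i in {1..4} ; do
    dd if=/dev/urandom of=lowmem-${i}.file bs=1048576 count=16 2>/dev/null
done
ls -l lowmem-*.file
```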

Oh, and the hard drive? It’s fine. I filled it all the way to the brim, and Arch didn’t complain. Good to know.