I use this method on other systems to prime my sites' cache files every night as part of a backup shell script. However, when I tried the same thing with my WordPress site running Quick Cache, the cache files did not get created.

I just tested this on my local site and it worked as expected. wget spidered the site, which resulted in Quick Cache generating cache files.
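For anyone else wanting to try this, a typical nightly cron entry for warming the cache with wget might look like the following. This is just a sketch; the URL is a placeholder for your own site, and you may want to tune the recursion depth:

```
# Run nightly at 2:00 AM -- spider the site so Quick Cache regenerates its cache files.
# http://example.com/ is a placeholder; replace it with your own site URL.
# --recursive        follow links within the site
# --level=2          limit recursion depth (adjust as needed)
# --delete-after     discard the downloaded files; we only want each page requested
# --no-directories   don't build a local directory tree
# --quiet            suppress output (suitable for cron)
0 2 * * * wget --recursive --level=2 --delete-after --no-directories --quiet http://example.com/
```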

Quick Cache does not exclude any User-Agents by default. (If you're using Quick Cache Pro, you have access to the User-Agent Exclusion Patterns feature, which includes w3c_validator as a default exclusion, but even that shouldn't affect a wget spider.)

If you have the "Discourage search engines from indexing this site" option enabled in WordPress (Settings -> Reading), then WordPress will automatically serve a virtual robots.txt file that disallows all User-Agents. (However, if you manually add a robots.txt file, that will override anything generated by WordPress.)
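For reference, the robots.txt that WordPress generates when that option is enabled looks roughly like this, which tells every well-behaved crawler (including wget's spider mode, if it honors robots.txt) to skip the entire site:

```
User-agent: *
Disallow: /
```

If you need your wget spider to run anyway while that option is on, wget can be told to ignore robots.txt with `-e robots=off`.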

Also, since it's related to your question, I should mention that the next version of Quick Cache Pro includes a new Auto-Cache Engine feature, which accomplishes essentially the same thing as warming the cache.

You can see a screenshot of the new Auto-Cache Engine options panel here.