From free directories such as DMOZ, paid directories such as Yahoo or BCentral, reciprocal links, partner sites, friends-and-family sites, etc. There's a lot written on this site about how to do that best.

Yep, if you have solid incoming links to the root. As long as there's an easy path from the root to your internal pages, Googlebot will find them. I've even had the opposite problem: Google indexing orphan pages I wanted to abandon. Even though I dropped all links to those pages on my site, Googlebot found them via a link from a minor blog. Googlebot is voracious.

I recall stumbling across the pages of a "competitor" a few years ago who seemed to have generated a separate page for each of his products, and he was well represented in the index with all of them.

For about two weeks.

Haven't seen him in the index since.

I also use script-generated pages for my product catalogue, but with a very inhomogeneous (i.e. "natural") hierarchical structure of three to four levels: groups of products, groups of groups of products, and so forth. I also put considerable effort into making these groups meaningful, and thus follow Google's first law: concentrate on the user.

If your site is new and you have nothing to lose, you may give it a try. If you just want traffic, rest assured the porn industry had this genius idea about ten years ago.

Let me estimate that in order to convince Google's algorithm it's worth indexing all these pages, you need four to twelve unique lexical entries (= words) on each of them. If you typed more than 40,000 words into your database by hand, I'd also bet Google will..
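To make that back-of-the-envelope arithmetic concrete: at roughly ten unique words per page, 40,000 hand-typed words only cover about 4,000 pages. Here's a minimal sketch of how you might check your own generated pages against that guessed 4-12 unique-word range (the sample page texts and the threshold are assumptions, not anything Google publishes):

```python
# Rough check: does each generated catalogue page carry enough unique
# lexical entries (distinct words) to plausibly be worth indexing?

def unique_words(text):
    """Count distinct tokens in a page's text, ignoring case and
    surrounding punctuation."""
    tokens = (w.strip(".,;:!?\"'()").lower() for w in text.split())
    return len({t for t in tokens if t})

# Hypothetical per-page text, e.g. product descriptions from a database.
pages = {
    "widget-a": "Blue widget, 3 mm steel, fits model A brackets only.",
    "widget-b": "Blue widget, 3 mm steel, fits model B brackets only.",
}

THRESHOLD = 4  # lower bound of the guessed 4-12 unique-word range

for name, text in pages.items():
    n = unique_words(text)
    print(name, n, "ok" if n >= THRESHOLD else "too thin")

# And the page budget itself: total hand-typed words / words per page.
print("pages covered:", 40_000 // 10)
```

Note how little separates the two sample pages: near-duplicate boilerplate with one changed token is exactly the pattern that gets script-generated catalogues dropped.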