Number of processes used to fetch program details. 8 is a good starting value; with plenty of CPU and bandwidth you can try more. More processes will reduce the time it takes to fetch your listings, but be warned: the benefit may be smaller than you expect, and the more processes you run, the more obvious it is that you are scraping, and the more likely you are to be banned by the source site. A 'fast' website scraper is an oxymoron!
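To illustrate the trade-off, here is a minimal Python sketch (not the grabber's actual code) of fetching with a fixed pool of workers; fetch() just sleeps to stand in for an HTTP request, and the URL names are made up:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    # Stand-in for an HTTP request to the listings site.
    time.sleep(0.05)
    return url

urls = [f"programme-{i}" for i in range(16)]

start = time.monotonic()
with ThreadPoolExecutor(max_workers=8) as pool:  # 8 concurrent fetchers
    results = list(pool.map(fetch, urls))
elapsed = time.monotonic() - start

# 16 tasks across 8 workers finish in roughly two batches (about 0.1 s)
# instead of about 0.8 s sequentially. Doubling the workers again can save
# at most one more batch, so the speed-up flattens quickly while the
# request rate seen by the server keeps climbing.
print(f"{len(results)} fetches in {elapsed:.2f}s")
```

This is why going beyond a modest worker count buys little: once the pool is wide enough, total time is dominated by the slowest batch, while every extra worker only raises the request rate the site observes.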

The environment variable HOME can be set to change where configuration files are stored. All configuration is stored in $HOME/.xmltv/. On Windows, it might be necessary to set HOME to a path without spaces.

TEMP or TMP, if present, will override the directory used for temporary files. The default is "/tmp", so under Windows one of these must be set.
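The lookups described above can be sketched in Python; note that checking TEMP before TMP is an assumption, since the text does not say which takes priority when both are set:

```python
import os

def config_dir(env=os.environ):
    # Configuration lives under $HOME/.xmltv/ (HOME may be set explicitly,
    # e.g. on Windows, to point at a path without spaces).
    return os.path.join(env["HOME"], ".xmltv")

def temp_dir(env=os.environ):
    # TEMP or TMP, if present, overrides the "/tmp" default; trying TEMP
    # first is an assumption, not documented behaviour.
    return env.get("TEMP") or env.get("TMP") or "/tmp"

print(config_dir({"HOME": "/home/alice"}))   # /home/alice/.xmltv
print(temp_dir({}))                          # /tmp
print(temp_dir({"TMP": "/var/scratch"}))     # /var/scratch
```

On Windows the fallback to "/tmp" is useless, which is why setting TEMP or TMP there is required rather than optional.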

Like any screen-scraping grabber, this one will break regularly as the web site changes, and when it does you should fetch a new version from the project's repository. At some point the breakage might not be fixable, or nobody may want to fix it. Sane people should use Schedules Direct instead.