I left it at two because I wanted the ability to download from two different TiVos and/or download and encode at the same time. If I change "active job limit" to 1, will it change that? Downloads require a trivial amount of system resources, so I don't think they should be thrown in with processor and I/O intensive tasks like encoding and streamfix.

That's why the setting is called "active job limit": it applies to CPU-intensive jobs only. Metadata & Downloads are not CPU-intensive, but the rest of the tasks are considered CPU-intensive.
For Metadata & Downloads the only restriction is 1 at a time per TiVo (so, for example, if you have 3 TiVos, all 3 downloads can happen at the same time), so the active job limit setting has no effect on those jobs.

That is where the program pulls the download from; it's the same place I've already tried four times. I downloaded it again this morning and it still won't open. I get the same error saying the zip file is corrupt.

I even tried two different unzip programs. As a test I downloaded one of the other zip files from the same site, and that one unzips fine.

Can anyone else download and unzip this file? If so then the problem is on my end, but I have never had a problem with zip files before.

You might try clearing your browser's cache. It sounds like maybe you've got a corrupted copy in your cache that the browser is grabbing each time.

I tried the file with two different web browsers after you made the post, and both unzipped fine.

Thanks for the clarification.

In that case, you might consider setting the default value for "active job limit" to 1, and then setting the default value for "encoding cpu cores" based on Runtime.getRuntime().availableProcessors().

Looking at the code, there actually is a bug in the released version right now. The check is:
IF active_jobs >= active_job_limit THEN no more active jobs
It should be > instead of >=.
(So with the released version you have to set active job limit to at least 2 to be able to do 2 things at a time.)
I've made the correction and implemented your defaults suggestion in the development version.
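The off-by-one can be sketched in a few lines. This is Python purely for illustration (kmttg itself is Java), and the function name plus the assumption that active_jobs already counts the candidate job are mine, not the actual source:

```python
def job_blocked(active_jobs, active_job_limit):
    """Return True when no further CPU-intensive job may start.

    Assumes active_jobs already includes the job being considered.
    """
    # Released check (buggy): active_jobs >= active_job_limit
    # blocked one job too early.
    # Corrected check: block only once the count exceeds the limit.
    return active_jobs > active_job_limit
```

With the corrected comparison, a limit of 2 really does allow two simultaneous CPU-intensive jobs.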

What happens when you put these in "Stored User Names and Passwords"?
What happens if the password for LocalService is the same as yours?
Can you run the service as your user account?

Hmm... that could work. Do you know if it's possible to log in as LocalService? If so, I should be able to set the password there. I don't see LocalService as a user in the "User Accounts" widget in the Control Panel, so I can't change its password to match mine either. Ideas?

I tried running kmttg as my account instead of LocalService but it wouldn't run at all.

I tried reverting the push.py script back to the originally suggested version by removing the urllib2.quote(...) call from the "file" variable, but that didn't solve the problem either. As someone else reported earlier in this thread, the manual push command from pyTivo uses "+" signs for spaces instead of the "%20" URL escape.

I would appreciate your help with solving this problem.

Thanks,

Cam

PS: I am using the March 2009 version of the wmcbrine branch of pyTivo. I tried to use the latest wmcbrine.git version referenced in the push.py wiki, but couldn't figure out how to install it over my existing version of pyTivo. I tried copying/replacing the files and folders in Windows, but had all kinds of issues, so I went back to the 3/2009 version. Thanks. Cam.

The problem is most likely the space in "Family Room". In the push.py script try setting the tivo name to "Family%20Room" instead to see if that works.
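The "+" vs "%20" difference comes from two different encoding conventions in the standard library. A quick illustration using Python 3's urllib.parse (the modern counterpart of the urllib2.quote call the script uses):

```python
from urllib.parse import quote, quote_plus

tivo_name = "Family Room"

# quote() percent-encodes spaces, the form a URL path expects
# (this is what urllib2.quote did in the Python 2 era of pyTivo):
assert quote(tivo_name) == "Family%20Room"

# quote_plus() follows HTML form-encoding rules and uses '+',
# which matches the '+' signs seen in the manual push command:
assert quote_plus(tivo_name) == "Family+Room"
```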

I was wondering if there is a way to pause/resume a download/decrypt/QSF/Ad Detect/Ad Cut/Encode/Custom (push.py) job between the Ad Detect and Ad Cut phases, to manually verify the commercial cuts in VRD and modify the .VPrj file. I've found that VRD and Comskip are pretty error-prone, and it would be great if I could verify the cut scenes before the ad-cut step executes.

I have tried to break up the steps into two separate jobs: one for download/decrypt/QSF/Ad Detect, then a new job through "FILES" for Ad Cut/Encode/Custom, but obviously that's not as slick as a pause/resume function.

Thanks for the great work, and sorry about getting too greedy...

Regards,

Cam

One possibility is to have an option where, following Ad Detect, kmttg opens VRD in GUI mode using the .VPrj file with the detected cuts. Then you can modify the cuts as needed, and as soon as you are done and close the GUI, kmttg will continue. I'm not 100% sure that can be implemented, but is that the kind of thing you are looking for?

I'm OK with the 2-step process; I like the batch-processing model. But maybe I did something wrong: when you run the custom script as part of the second half of the batch (Ad Cut, Encode, Custom), the metaFile argument is wrong.

It ends up substituted as *.m4v.txt in the encode directory instead of *.mpg.txt in the mpg directory, which is where the first half of the batch left it.

I'm hoping that using [mpegFile].txt will work around that.

-David

I assume you are breaking things up into 2 steps: in step 1 you have metadata turned on but not encode, so metaFile is set to the .mpg.txt file. In step 2 you then have encode and custom enabled, so by default metaFile is assumed to be (encoded file extension).txt; i.e., the assumptions are all built around running everything as one step.
In any case, yes, I think [mpegFile].txt should work for that scenario.
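The substitution problem can be pictured with a toy version of that keyword expansion. The function name and the defaulting rule here are my guess at the behavior described in this thread, not kmttg's actual code:

```python
def expand_custom_args(template, mpeg_file, encode_file):
    """Expand kmttg-style [keyword] placeholders in a custom command.

    When an encode step is in the job, [metaFile] defaults to the
    encoded file plus .txt, which is why forcing [mpegFile].txt in the
    template sidesteps the 2-step problem.
    """
    subs = {
        "[mpegFile]": mpeg_file,
        "[encodeFile]": encode_file,
        "[metaFile]": encode_file + ".txt",  # assumed default when encoding
    }
    for key, value in subs.items():
        template = template.replace(key, value)
    return template

# [metaFile] would give the wrong .m4v.txt path here, but [mpegFile].txt
# resolves to the metadata file the first half of the batch left behind:
cmd = expand_custom_args("push.py [mpegFile].txt", "/mpg/show.mpg", "/enc/show.m4v")
```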

Well, it looks like AudioNutz had the right setting all along with the -async 50 option for ffmpeg. I still don't have a Windows build that generates proper AAC audio with this option set to anything but 1. However, I tried encoding both my short and long trouble testcases on Linux today with a fairly new ffmpeg built from source, and both testcases ended up perfectly in sync when using -async 50, with no distortion in the sound.

So I think it's just a question of building a new Windows version from source, which I will attempt later today.
(NOTE: Neither the pyTivo nor the streambaby ffmpeg Windows binaries generate proper AAC audio either, last I tried, so it could well be some issue with Windows builds.)

Good news about Linux, at least.

I still can't believe there isn't more discussion about this problem with ffmpeg on Windows. I guess there just aren't many cross-platform programs with built-in encoding?

I wish I could take credit... Since I used VH before switching to kmttg, I simply continued to use the same arguments that their GUI used.

Out of curiosity, do you know what -async 50 means?

I think it just means ffmpeg is allowed to vary the audio sample rate by up to 50 Hz to stay in sync with the video. (Otherwise a fixed audio rate is used, in which case A/V sync can drift further and further apart during playback.) -async 1 is a special case that only fixes sync right at the start, so it works when there is a constant A/V shift all the way through, or for short clips, but not for longer clips where the drift eventually becomes noticeable.
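In practice the flag just gets added to the encode command line. A minimal sketch of building such a command in Python (the codecs and filenames are placeholders, not any actual kmttg profile):

```python
def ffmpeg_cmd(src, dst, async_rate=50):
    # -async lets ffmpeg stretch/squeeze audio by up to async_rate Hz
    # to track the video timestamps; -async 1 only corrects the
    # initial offset and never adjusts again.
    return [
        "ffmpeg", "-y",
        "-i", src,
        "-async", str(async_rate),
        "-acodec", "aac",
        "-vcodec", "libx264",
        dst,
    ]

cmd = ffmpeg_cmd("show.mpg", "show.m4v")
# e.g. subprocess.run(cmd) once ffmpeg is on the PATH
```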

Building the faac library from source on Windows is turning out to be problematic. I'm not too surprised that AAC encoding seems broken in the various ffmpeg Windows binaries I've tried lately; depending on how the source is "fixed" to compile on Windows, it may cause issues...

I don't suppose you boys would consider getting a Mac, and calling it done?

It turns out it was a pseudo false alarm. With that particular Linux version of ffmpeg, the ff_psp profile works great for both my testcases. However, I tried 3 other profiles (with AAC & AC3 audio) and those ended up out of sync even for the short 5-minute testcase.

So while the particular ffmpeg binary does make some difference, it doesn't look like -async 50 is a cure-all anyway... I'm pretty sure the issues would be there on the Mac too. The problem is the source mpeg2. You may be lucky that you're not getting bad sources to begin with, which is why you never had a problem, and in some cases -async 50 may be helping you as well. For me it looks like SD sources from FX are problematic, while HD sources from the big 4 networks are fine...

P.S. The ultimate fix for these problems is VideoRedo, which I do have, and which is only available for Windows. I'm looking for a solution for those who don't have it, but I think I've spent too much time on that already without getting anywhere...

"Rescue Me" on FX SD and "The O'Reilly Factor" on FNC SD for me, but it's likely very provider-specific. I have a short 5-minute, 116MB clip with A/V sync issues under certain ffmpeg encoding profiles that perhaps I can upload somewhere. (Google Code only allows uploads under 100MB, so I can't put it there.)

In all cases, running them through VRD QS Fix has fixed them and the resulting encodes were fine. Note that I've only recently started finding/generating problem testcases, since I always have VRD QS Fix enabled for my own purposes and so don't hit these issues.