Hello All: Back once again with some more questions, and thanks to everyone who gave me a better understanding of VBR and bitrate. I have installed Foobar and it's a great listening tool, though I still prefer Media Monkey as it's more user friendly, at least more friendly to me. I found it didn't have the latest version of LAME installed, since I was getting different bitrates in VBR mode when using Media Monkey vs. Foobar. I believe the latest version of LAME is 3.99.5. I installed this into Media Monkey and got some strange results when encoding MP3s in VBR 0 mode. I understand that bitrate isn't always a means of measuring quality, but I have seen some strange jumps in bitrate using the latest version. I have no idea what version was previously installed in Media Monkey, but I compared the new encodes to some songs that had the old encoder and found what I thought were some unusual results. Here are some of my bitrates, and I will explain after I list them...

1. In VBR 0 with the old version of LAME: Steely Dan's Black Cow = 230 kbps
2. Same setting with version 3.99.5 = 254 kbps

I understand that bitrate doesn't always mean what it should in VBR mode, but aren't these some strange jumps? Steely Dan gained 14 bits. Cattle Decapitation lost 14 bits. Doesn't that seem odd? Is my version of LAME good (I checked, it's not a beta), and if it isn't, what version do people recommend? I want the version that is going to be the most rock solid in VBR mode for me, since LAME is choosing the quality for me. And a bonus question, so to say: does anyone have any scientific proof that a higher or lower average bitrate by, let's say, 15-20 bits will make a difference? I understand bitrate isn't everything, but the average does mean something. I was cruising along encoding until I noticed this; I know I am confused about this behavior.

PS: I researched this version of LAME and couldn't find anything about its reliability or anyone who had any problems, so if this was posted already in a previous thread, I apologize; I couldn't find anything on that particular version.

QUOTE

I understand that bitrate doesn't always mean what it should in VBR mode, but aren't these some strange jumps? Steely Dan gained 14 bits. Cattle Decapitation lost 14 bits. Doesn't that seem odd?

Not in the slightest. Why should one expect that changes to the algorithm will result simply in a uniform gain or decrease in the mean bitrate for every possible input? Also, it’s kbps, kilobits per second.

QUOTE

Is my version of LAME good (I checked, it's not a beta), and if it isn't, what version do people recommend?

I mentioned in your other thread that default settings are usually default for a reason, and analogously, the latest version of any given program has usually been released for a reason, specifically the fact that the developers are confident that it represents an improvement over the last. Otherwise there’d be no need to ever update anything, and everyone would still be using certain (in)famous versions from years ago… something that some people don’t seem to want to let go of.

QUOTE

Does anyone have any scientific proof that a higher or lower average bitrate by, let's say, 15-20 bits will make a difference?

How can someone ‘prove’ this? An altered bitrate on which song(s)? To which listener(s)? How many of each do we need to sample before the summarised results will be representative of you or any other potential listener(s)? If previous references to proper methods of testing haven’t made it clear yet, the performance of a lossy codec ends up mattering only in terms of whether or not it encodes a given signal transparently to the specific person who is listening to it. We simply cannot answer questions this vague with any confidence. Unfortunately the world does not always work easily enough that people can provide you with one-shot answers for things. Again, it comes down either to performing your own tests and trusting their conclusions, or to trusting the conclusions of other people’s tests (N.B.: tests, not evidence-free subjective ramblings) and that they will be applicable to you.
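For context on what a proper test looks like: in an ABX trial, the null hypothesis is that the listener cannot tell the two encodes apart, so each trial is a coin flip. A minimal sketch of the significance calculation (the function name and the 12-of-16 example are just illustrative, not part of any standard tool):

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """Probability of getting at least `correct` answers right out of
    `trials` by pure guessing (null hypothesis: no audible difference)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# 12 correct out of 16 trials: p = 2517/65536 ≈ 0.038, low enough that
# most people would accept the listener really hears a difference.
print(abx_p_value(12, 16))
```

The point is that "proof" only exists per listener, per sample, at a chosen significance level; there is no universal answer to whether 15-20 kbps matters.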

I'm sorry not to have been more specific. I was looking to see whether, when the developers are programming LAME, they have target bitrates in mind when writing the code for VBR mode. My feeling from what I am reading from others is that they are trying to make LAME produce the best of both worlds: smaller file sizes without giving up transparency. I'm not sure if the following statement is true, but if the bitrate is larger, wouldn't the file size also be larger? I am not stating this as a fact or an opinion, but as a question. I can only think that with this newer version of LAME they increased the bitrate on songs that would previously use fewer bits, and decreased it on songs that used more bits, almost like a happy medium. Once again, I have no idea if this is true; I am only making an educated guess and have no way of backing it up, it just seems logical. I also see in Foobar that when I play my songs, the bitrate doesn't usually go over 300. Am I to assume that the LAME developers are finding that music hardly uses those bits in the high range? It's really cool stuff, and I love how the technology has grown. Sorry, it's a long statement, but I am hoping I am getting a better idea of what LAME is doing as it advances to a newer version.
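On the file-size question: yes, for the audio stream itself, size is essentially average bitrate times duration. A quick sketch of the arithmetic (the 5-minute duration is an illustrative assumption, and real files add a little overhead for ID3 tags and frame headers):

```python
def mp3_audio_bytes(avg_kbps: float, seconds: float) -> int:
    """Approximate audio-stream size in bytes:
    kilobits per second * seconds / 8 bits per byte.
    Ignores ID3 tags and frame/container overhead."""
    return int(avg_kbps * 1000 * seconds / 8)

# A hypothetical 5-minute (300 s) track at the two observed averages:
print(mp3_audio_bytes(230, 300))  # 8625000 bytes, about 8.6 MB
print(mp3_audio_bytes(254, 300))  # 9525000 bytes, about 9.5 MB
```

So a 24 kbps higher average on a 5-minute track costs roughly 0.9 MB; whether that buys audible quality is exactly what only a listening test can tell you.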