Jitter is audible, if bad enough, although it takes surprisingly large amounts of jitter to show up in an A/B test. (Looking for that article, but haven't found it. Read it just a couple of weeks ago, and should have bookmarked it.)

To be clear, if a player sent a signal over HDMI or any other connection, and the receiving end depended on the player for its clock, then yes, the receiver would be at the mercy of the mastering studio, player, and cable.

But that's not what receivers do, because it would be stupid.

The only thing the D/A converter needs to know is the value of the original samples. It knows that the samples were supposed to be taken by an infinitely accurate clock at the recording rate (for example, 44.1 kHz). So it takes those samples and converts them to analog at the recording rate.

It doesn't matter if the samples arrive from the player separated by wildly varying time gaps. Those gaps are ignored. Only the values of the samples matter. That, and the recording rate.
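A toy sketch of the buffer-and-reclock idea described above (hypothetical Python, not any actual receiver's firmware): the jittered arrival times are simply discarded, only the sample values survive, and the output instants come from the local fixed-rate clock.

```python
from collections import deque

FS = 44_100  # nominal sampling rate, Hz

def receive(buffer, arrivals):
    """Accept samples however they arrive; keep only their values."""
    for value, _arrival_time in arrivals:
        buffer.append(value)  # the jittered arrival time is discarded

def reclock(buffer, n):
    """Hand n samples to the DAC on a fixed local clock."""
    period = 1.0 / FS
    return [(i * period, buffer.popleft()) for i in range(n)]

fifo = deque()
# Wildly varying gaps between arrivals -- irrelevant to the output
receive(fifo, [(0.1, 0.0), (0.5, 40e-6), (0.9, 45e-6), (0.2, 130e-6)])
out = reclock(fifo, 4)
# Output instants land on the uniform local-clock grid, values unchanged
print(out)
```

The only thing a real design adds is a deep enough FIFO that the incoming stream, however unevenly it arrives, never underruns or overruns the local clock.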

If you think your way through this, you'll eventually conclude, correctly, that the only possible source of timing errors is when the recording is made, and when the samples are converted in the D/A converter at the end of the playback chain.

If a playback chain is susceptible to any other timing errors, its designer is an idiot.

It's not that simple. How you describe it is how I thought of it for a long time, but recently I've revisited jitter issues, simply because I became convinced I could hear it. Here's an article that explains how it's not enough just to capture the 'original' samples and re-clock.

It actually is that simple. The writeup in that link assumes the receiver does not buffer and reclock, which would be a stupid design.

Please understand, I'm not denying that cables can cause bits to arrive at the wrong time. That's kind of obvious. I'm just stating, correctly, that arrival times aren't used in the D/A process on the receiver, unless the receiver design is stupid. Presumably we all use non-stupidly designed receivers.

In general I agree with what you're saying; however, there is a limit to that claim. When timing smear across a cable (or any other distributed circuit element) approaches half of the symbol period, that is no longer the case. Viewing the received bit stream as an eye diagram, the eye closes because of inter-symbol interference. The receiver's ability to correctly estimate the value of each symbol at its sampling instant deteriorates, and when the error-correcting ability of any FEC is exhausted, the link collapses. How sudden that breakdown is depends on the coding gain: the less coding gain, the more gradual the degradation.

In the case of a 2 channel audio stream at a 192 ksps sampling rate and 24 bit resolution, jitter or smearing approaching 50 nsec starts to become an issue, if I'm doing my arithmetic correctly.
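That arithmetic is easy to check. Treating the stream as a plain serial payload (a simplification: real interfaces add framing and channel coding, which raises the symbol rate and shrinks the margin further):

```python
channels = 2
sample_rate = 192_000   # samples per second, per channel
bit_depth = 24          # bits per sample

bit_rate = channels * sample_rate * bit_depth   # payload bits per second
bit_period_ns = 1e9 / bit_rate                  # one bit period, in ns
half_period_ns = bit_period_ns / 2              # where the eye is half closed

print(f"{bit_rate} bit/s, bit period {bit_period_ns:.1f} ns, "
      f"half period {half_period_ns:.1f} ns")
```

That works out to about 9.2 Mbit/s, a bit period of roughly 108 ns, and a half period of about 54 ns, which matches the ~50 ns figure quoted above.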

My comments are with respect to a playback chain that is not actually losing bits.

Think of the chain as three steps: (1) the recording, where the analog signal is sampled; (2) the transport, where the bits travel through players, cables, and buffers; and (3) the D/A conversion at the end. If the bits arrive, eventually, even if they are spaced incorrectly, the original signal can be losslessly recovered, clocked, and converted.

In step #1, quantization errors are introduced in amplitude (bit depth) and time (clock precision). Nothing can make those errors go away.
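For a sense of scale on the amplitude part of those step #1 errors, the standard rule of thumb for an ideal uniform quantizer driven by a full-scale sine is SNR ≈ 6.02·n + 1.76 dB. These are textbook numbers, not from the posts above:

```python
def ideal_snr_db(bits):
    """Theoretical SNR of an ideal uniform quantizer for a
    full-scale sine wave: 6.02 * bits + 1.76 dB."""
    return 6.02 * bits + 1.76

print(ideal_snr_db(16))  # ~98 dB  (CD audio)
print(ideal_snr_db(24))  # ~146 dB (hi-res audio)
```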

In step #2, feel free to add other errors. They have no effect on step #3.

In step #3, read the samples, assume the samples were taken at the precise time specified by the sampling rate, and reconstruct the signal. Anything introduced in step #2 doesn't matter. Can't matter. Because the amplitudes aren't changed (else we would be getting bit errors, which I already said wasn't part of this), and step #3 is using the correct clock rate.

It's been claimed that buffering and reclocking can't work because then audio will get out of sync with video, but for that to be true they would have to have been out of sync during mastering. If they were in sync during step #1, then both the audio and video samples were taken at the correct rate, and using the correct rate in step #3 will again keep them in sync.

Basically, if you're relying on the clock rate derived from the wandering eye in the HDMI signal, you're doing something stupid.

1) Someone asks a question about the DACs, or the cables, or the configuration, of an Oppo in their system and/or vs. some other piece of equipment/configuration.

2) The answer, due to the nature of the question, requires delving into audio theory.

3) The discussion degenerates into another one of those I'm-convinced-I-can-hear-it/no-you-can't-you're-deluded discussions.

4) This goes on for a while.

5) Someone says knock it off.

6) Things simmer down for a while.

7) Cycle repeats for a new question at #1.

Same thing with amps and analog cables. If I were to say a pair of Parasound JC 1 amps sound much better than my adequately powered receiver, it must be the placebo effect, and a cycle of lecturing follows: DBT, level matching, etc., from the same people. As if they have an agenda against better designed, built, and sounding gear.

I say do what sounds best to you through your own experience. The "experts" will make the same arguments over and over, but in the end it is you who should decide the truth for yourself, through listening and experience with different gear in your own system. If you find satisfaction with a budget system, stop right there and be happy! This is a hobby to me, something to enjoy, not argue about!

Very true Todd, but I don't think it's fair to characterize the jitter conversation as equivalent to DBT cable stuff. No one is really 'arguing', I think we're all enjoying our Oppos and enjoy talking about them.

Please take the theory discussion over to the theory forum so people with BDP-105 questions don't get lost in the OT discussion.

Got nothing to do with text size. I didn't see your post until after I'd made mine.

I think the reason you see posts on audio theory in this thread is because some people who own or are considering buying a 105 care very much about audio quality....which inherently involves discussion of audio theory. After all, audio quality IS the defining feature of the 105. So perhaps a bit more tolerance would be helpful to us.

No, no one in that camp would even be here. The majority of BDP-105 buyers are obviously trying to wring premium sound out of the analog section of the Oppo and certainly must realize that it can be a differentiator in sound quality. Otherwise, a 103 for $500 would be a slam dunk - just let the pre/pro sort out the analog output.

If it is the placebo effect - or herd mentality - then Oppo sure did one on us all.

Will I get video in my Krell using HDMI 2? If using Split A/V, can the HDMI sound be turned OFF? While using Split A/V on the 105, is the HDMI output still using the QDEO processor? What is used for HDMI 2 in Split A/V mode? The manual says on page 55,
"possible video in HDMI 2".

Thank you

Yes you can turn HDMI Audio OFF even when using Split A/V. Audio will still be available on the Analog outputs.

With Split A/V, HDMI 1 gets priority for video. That means there may be cases where HDMI 2 will get a black screen because the video can not be formatted on both outputs without reducing the quality of what would otherwise go out HDMI 1. But all you should have to do is turn off your Projector when you want to use the Krell video path. Since only one HDMI cable will then be "live" -- the one to the Krell -- the Split A/V vs Dual Display choice will simply be ignored and the best possible video (as well as audio if you haven't turned off HDMI Audio) will go to the Krell.

If you use Dual Display you WILL get usable video on both outputs even with both the Krell and the Projector turned on, but it will be limited to what both devices at the other end of the cable can handle.
--Bob

Darn it. I didn't want to reply to this HDMI SQ issue, but I have to chime in. I did a stupid thing a few weeks back and bought an expensive HDMI cable, knowing that HDMI is just 1s and 0s. I hate to say it, but video-wise it made little/no appreciable difference compared to my old HDMI cable, but somehow with audio it did make a difference. The sound is better: highs are a bit crisper, and bass response is fuller and more extended. Sound was more dynamic, smoother, and warmer. I know, I know, 1's and 0's, but it did make a difference. Some people may not be sensitive enough to hear the difference. I will say it was not night and day, but definitely noticeable to me. I'm not returning my cable. I agree this may need to be handled in another forum.

Btw, the 105 is sounding sweet. Liking the analog sound I'm getting. Just got a pair of Sky ICs... another pleasant surprise. Stupid upgraditis... but it's starting to sound good. Now I've got to try plugging straight from the Oppo to the amp. Still a little worried about doing that. Anything I need to be cautious of? I guess I will make sure the volume is turned down... I'm using regular RCAs.

Does anyone know if the XLR outputs on the Oppo are superior to the RCA outputs? Is it worth changing to an AVR or amp that can do XLR?

That's inevitably going to lead to another theory discussion...HEADS UP.

The purpose of a balanced cable connection is to reject noise. But the connections add transformers to the path. Hard-core purists like short single-ended connections because they worry about transformer coloration. Those who have to use long cable runs like balanced connections because noise pickup becomes problematic, especially with 50-foot or longer runs. And there is the problem of pseudo-balanced gear.
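The noise-rejection mechanism is easy to sketch (illustrative numbers only): a balanced link carries the signal and its inverse on two legs, noise picked up along the run lands on both legs equally, and the receiver's difference operation cancels it.

```python
signal = [0.0, 0.7, 1.0, 0.7, 0.0, -0.7, -1.0, -0.7]
noise  = [0.05, -0.12, 0.08, 0.02, -0.09, 0.11, -0.03, 0.06]  # common-mode pickup

hot  = [s + n for s, n in zip(signal, noise)]    # signal + noise
cold = [-s + n for s, n in zip(signal, noise)]   # inverted signal + same noise

received = [h - c for h, c in zip(hot, cold)]    # difference: noise cancels
print(received)  # twice the original signal, with the noise gone
```

The recovered values come out at twice the original amplitude with the common-mode noise removed, which is why balanced runs shrug off pickup that would ride straight into a single-ended input.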

I sort of like the sound of the Jensen transformers for audio recording.

I recently switched my pre-to-amp connections from RCA to XLR just to hear the difference (2-foot connects), and I can't say that I could hear anything really, but I like the physical layout better, as the RCA connectors seemed all cluttered up around the other inputs. So I'm sticking with the XLR - it certainly didn't sound any worse, and who knows - maybe those transformers warmed it up just ever so slightly, subliminally.

Streaming using JRiver MC to the 105. I have some 24/192 files that play as they should via regular usb hook-up of a hard drive. However, even though I have JRMC set up to output 24/192, 48 kHz is all I get. PC with Windows 7. Does anyone know why it is being down-sampled?

check your streaming options?

Quote:

Originally Posted by 1soupmeister

close to pulling the trigger on the 105. I have a Motorola STB VIP 2262 and Apple TV. What is the best way to hook these up to the 105 and back to the TV which has 2 HDMI inputs. Audio will run from the 105 direct to the amps using the analogue outputs. No audio from the TV. What's best? The STB into the HDMI in the back of the 105 and the Apple to the HDMI in the front? Will the 105 then process the audio based on which box is in use and will the audio be processed automatically from the HDMI inputs or should I use the optical in from the STB? If this has already been covered please refer me to the post.
Thanks

You can also go optical from the Apple TV to the Oppo and HDMI to the TV. I personally would never use the front HDMI because I don't want cables sticking out the front of my components (except for headphones).

Quote:

Originally Posted by FlatRocky

Not sure if Plex has been tried by anyone or not? The Plex server and Plex player are supposed to be running on the PC, using the Oppo as a USB DAC sound card, or HDMI, or whatever other way you can feed the sound output of the PC to the Oppo; then the Plex client/player app on the iPad controls the PC Plex player. The Plex app has the option of choosing which player to use, so choose the PC player instead of the iPad or iPhone. To me, Plex is better than everything else; the cover art display alone makes it look beautiful. It also gives you the ability to adjust audio/video controls while the content is being played, including lip sync. It has never failed me on the Oppo.
I would ask Oppo to include Plex as an app along with the existing Netflix, Pandora, etc. It would take the Oppo to a whole new level. The Oppo is already my favorite in terms of cable box picture quality. With great sound quality, this is a keeper.

Yeah, the control issue gets pretty simple when you're using the 105 as a USB DAC. It's when you're using it as a renderer and want to control it from a phone/tablet that it gets complicated/doesn't always work right.

Quote:

Originally Posted by ghostchili

Hi everyone, I just turned on my BDP-105 for the first time and upgraded the firmware. I upgraded from the BDP-93. I just got into the headphone world (thanks to my 2 children, ages 2 1/2 and 14 mo., the Emotiva and my Klipsch THX Ultra 2 speakers are off most of the time:mad:). I was wondering if Oppo upped the output of the headphone amp in any of the firmware updates. I know they have it set pretty conservatively currently. I am going to get a pair of Audeze LCD-2 or LCD-3's. I like it loud and was going to hook up a Schiit amp directly to the XLRs, use the DAC in the Oppo, and use the Oppo as the direct source with my CDs and SACDs.
Here is the amp I was looking at: http://schiit.com/cart/index.php?main_page=product_info&cPath=0&products_id=10

Lastly, I'm letting my Oppo run a DVD-A on repeat for about 20 hours to burn it in. Is this a waste of time?

Thanks,
Lance

You can search this thread for all the LCD-2 discussions and see what people have thought. I'd start with the LCD-2 only, then move up to the amp if needed. You may want it, especially if your LCD-2 has balanced connectors, but that's up to you. No volume firmware update. I really wouldn't worry about an external DAC, especially since the 105 already has a great set of XLR outs that can connect to that amp. There have been a few people who got rid of the external DACs they already had, so I'm not sure I'd recommend buying one after already having the 105.

Well, I finally went all out. I sold my AVR and traded up my XPA-200 amp for an XPA-5, so now the 105 handles everything for me. I've been using the 105 direct to the amp for music but not for TV or movies.

The sound is pretty amazing. Though I think most of that is from skipping my AVR which wasn't up to the level of the 105. I'm really liking these emotiva amps too - I'm a new customer to them.

I've never had any lip synch issues. I had a very slight one when watching OTA TV connected through ARC so I tried optical which had no issues then went back to ARC which suddenly didn't really have any lip synch issues so... not sure. And I guess I just got lucky.

I never liked my system for music that much. That's why I finally broke down and bought some Ascend Sierra speakers, which I absolutely can't wait for. BUT last night I just sat and listened to music on my system for hours for the first time. I think it was my AVR that I just didn't like all this time. Or the XPA amps are really that good. Hopefully I'll get blown away when my Polks finally get replaced by my Sierra Towers (which won't be here for at least another month yet).

Well, my most recent issue is with DVD-Audio & SACD multichannel playback. It seems that each time I first put in a multichannel disc in either of these formats, I initially get only stereo output. I have to turn the player off, then back on; then I get the multichannel output. No other adjustments are made before or after. Anyone else experience this?

Yep, I hit this on the latest official FW release playing Nickel Creek "This Side" MCH SACD. Had to reboot the Oppo. No config changes.

It's quite alright and I'm quite capable of admitting I was wrong. I just wish there was some way to always watch with "placebo-vision". LOL

My wish is that Oppo would make the placebo effect even greater! How? Well, after reading a healthy dose of this thread it's clear the way a product looks has a very strong correlation with how people rate it. So to get an even greater placebo effect, my wish is for Oppo to improve the exterior design of the product to fit in with other high end audio products.

The BDP-105 looks (not sounds, looks) like a fish out of water in my audio system. I would surely enjoy it more if it looked the part it can play. Yes, it would cost more, but it would sound better, so I'd be willing to pay more.

Often you can fix that by adjusting settings in the AVR. The most common reason for a device to keep an otherwise unused HDMI socket "live" is in support of HDMI CEC (remote control over the HDMI cable). Turn off settings in the AVR related to automatically powering On/Off other devices over the HDMI connections, or related to automatically switching inputs if another device is turned On.
--Bob

Thanks, Bob. Like Stevepow, I use a Marantz AVR (in my case, the 8801), and I am finding handshake issues with both HDMI outputs from the Oppo connected to my AVR. I have HDMI 1 out to the Blu-ray input on my AVR (which I use for movies to take advantage of Audyssey) and HDMI 2 out to the DVD input with Analog audio (which I use for DVD-A and SACD to take advantage of the 105's DACs). I use Dual mode in the 105 and get handshake issues (intermittent video or audio interruption). I tried using Split mode, but when I do I get no audio from HDMI 1 and no picture from HDMI 2. I assume this is because both inputs are live at all times. I'll see if there is a setting in the Marantz that will disable this.