Calibrating an LCD is a complete waste of time and money! Calibrating a plasma reaps significant benefits in performance. The TV will run cooler and consume up to 30% less energy! It will look better: more accurate, more detailed, and the colors will just jump out at you. Different plasmas respond differently to a full calibration. Don't expect to see anything better from an LCD calibration. Most likely it will look worse if you calibrate an LCD!

And the world is flat as well....don't go too far away or you will fall off the edges....

HDMI is a digital solution: 1's and 0's. With digital, there is no degradation of the signal like there is with analog. You either get it or you don't. If you've ever had a bad HDMI cable, you know what I'm talking about. A $10 cable you can get from Monoprice will give you the exact same quality picture as a $100 cable from BB. Don't let anyone tell you any different. And if you don't believe me, just do a quick search and you'll find plenty of articles on this subject to back me up.

Also, I'm a computer engineer in my day job, so I know a little about 1's and 0's. ;-)

The digital signal can get extremely corrupted (bits can even invert) as it accumulates noise over a long cable run. Just buy an HDMI-certified logo cable; better safe than sorry. Buying a $$$$ PJ and coupling it with a cheapo HDMI cable just to save a few $$ does not make sense, does it? A Sony 1.3a cable in a 5m run length comes in at just about $60, so why expose and frustrate oneself with the consequences of Murphy's Law just to save, what, $30?

I was surfing through the AVS forums after a long time away, and just had to respond to this and other similar posts.
HDMI is a digital solution, yes, but thinking it suffers no degradation as long as there is a picture is absolutely incorrect.
Just like Ethernet, which uses Differential Manchester encoding, the cable medium actually carries an encoded signal: dips and peaks in the electrical level at a set oscillation. These are more of a logical dip and peak, though; the actual waveform has quite a bit of sloping and other non-uniform structure. The encoder and decoder then use a particular tolerance level for judging the underlying signal.
Thing is, wires are not made of superconductive material (unless you live in liquid nitrogen), so they suffer from resistance. Even more, they still suffer from crosstalk. Examining the HDMI cable structure, it seems they have attempted to minimize crosstalk, but outside influences can still affect it. The result? Periodic fluctuations in the signal and, at times, incorrect decoding, if the resistance of the wires over a certain distance starts to shrink the difference between the peaks and dips in the encoded signal.
The signal may be digital, but it still uses electricity to carry it between devices. If the signal gets degraded as described above, you'll still get a picture (assuming the degradation doesn't mess with the security handshaking), but it can very well affect the information at the pixel level. This may show up as incorrect color, flickering pixels, etc., while you still receive a full image. Some of which I have experienced personally.
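If it helps, the tolerance idea above can be put into toy code. This is my own sketch of plain Manchester encoding with made-up voltage levels (HDMI actually uses TMDS, not Manchester), just to show how a decoder with a threshold still recovers bits from a sloped, attenuated waveform:

```python
# Toy sketch of a transition-based line code (Manchester style).
# Each bit becomes a high->low or low->high pair of half-periods;
# the decoder only cares which side of a threshold each half lands on.

def manchester_encode(bits):
    """0 -> (high, low) ; 1 -> (low, high). Levels are idealized volts."""
    out = []
    for b in bits:
        out.extend((1.0, 0.0) if b == 0 else (0.0, 1.0))
    return out

def manchester_decode(levels, threshold=0.5):
    """Compare the first half-period of each bit against the threshold.
    Noisy, sloped levels still decode correctly as long as the halves
    stay on opposite sides of the threshold."""
    bits = []
    for i in range(0, len(levels), 2):
        bits.append(0 if levels[i] > threshold else 1)
    return bits

data = [1, 0, 1, 1, 0]
wire = manchester_encode(data)
# Mild analog imperfection: attenuate and offset the whole signal.
degraded = [0.7 * v + 0.1 for v in wire]
assert manchester_decode(degraded) == data  # still decodes cleanly
```

The point of the sketch: the "logical" dips and peaks survive quite a lot of analog sloppiness, until the swing shrinks toward the threshold.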

Even in liquid nitrogen, the cable would still be afflicted with random noise. Only at 0 Kelvin [a theoretical ideal temperature at which all atomic motion is supposed to cease] can the noise be zero. So you could have random parts of the image affected with any signal, digital or analog. Those random parts would show up as noise [snow, flicker, etc.] or as image deformities. No cable is free from noise, though digital [HDMI] can take a noise bashing at moderate levels with no distortion; up the noise, and the information carried flips!! On a low-noise medium/cable, digital will have minimal imperfections. With high noise levels, digital will be totally grotesque, and at somewhat median noise levels some bits [the picture/sound-carrying bits] will be in error and some error-free. Hope this clears up the digital mythos.
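To illustrate the "moderate noise is fine, more noise flips bits" point, here is a toy simulation (my own made-up levels and noise model, nothing HDMI-specific): below half the decision threshold, every bit decodes perfectly; past it, errors pile up.

```python
# Toy sketch of the "digital cliff": a binary signal tolerates noise
# up to a point with zero bit errors, then errors rise sharply.
# Fixed seed and simple uniform noise; numbers are illustrative only.
import random

def bit_errors(noise_amplitude, n_bits=10_000, seed=1):
    rng = random.Random(seed)
    errors = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        level = float(bit)                          # ideal 0 V / 1 V levels
        noisy = level + rng.uniform(-noise_amplitude, noise_amplitude)
        if (noisy > 0.5) != bool(bit):              # threshold decision
            errors += 1
    return errors

assert bit_errors(0.3) == 0   # noise below half the swing: perfect bits
assert bit_errors(0.8) > 0    # past the cliff: corrupted bits appear
```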

In my district, there are two ISF techs, Brian and Gregg. Both are excellent. They calibrate the demo models in the store from time to time, and they look superb. Post-calibration color delta readings are below 3% on higher-end sets. So, if you expect your TV to be in top PQ form, get a calibration.
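For anyone curious what a "color delta reading" boils down to: it is a color-difference measurement against a reference. Here is a minimal sketch using the simple CIE76 Delta E formula with made-up patch values (ISF techs use meters and often fancier metrics, so treat this as illustration only):

```python
# Toy sketch of a post-calibration color check: CIE76 Delta E is the
# Euclidean distance between a measured patch and its target in CIELAB.
# The sample numbers below are invented for illustration.
import math

def delta_e_76(lab1, lab2):
    return math.dist(lab1, lab2)   # sqrt of summed squared differences

target   = (50.0, 20.0, -10.0)     # reference patch in L*a*b*
measured = (50.5, 19.0, -9.2)      # hypothetical meter reading after cal

assert delta_e_76(target, measured) < 3.0   # within a common "good" tolerance
```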

On Monster:

I've written on this forum a number of times that the superiority of Monster cables is a very well-sold lie. Anyone with a shred of A/V knowledge knows that when "120Hz Certified" is marked on a cable, you're dealing with outright dishonesty. Monster hosts training meetings now and then, and I'm telling you, it's like sitting in Sunday service with a cult.

On random and unnecessary insults:

Quote:

One of the idiots at a Magnolia told me...and I kid you not.....that front projectors cannot produce real HD because the light spreads out as soon as it leaves the lens. 1080P is impossible with a front projector. Best Buy store on Airport HGY, Toledo Ohio.

Seriously...he argued with me. Why would you even consider using these fools to do anything in your home?

So, let's look at the reasoning, here.

Premise: One (presumably green) Magnolia Pro made a false assertion about front projection. That is, ∃x∃y(Fxy & Mx) (read: there exists some x, the salesman, who made a false statement (F) about y, front projection technology; and that same x is a Magnolia employee (M)).

Therefore: All Best Buy/Geek Squad employees are inept. Id est, ∀x(Bx → Ix) (read: for any x, if x is a BB/GS employee, then x is inept).

This inference doesn't look like it is going to be accommodated by a consistent decision procedure, so if you don't mind, please either retract your rash and rude generalization or deliver a full proof for the rule of inference to which you've appealed.

If you'd like me to affix semantic values to each node of the omitted construction trees (assume MIT school syntactical analysis), I'd be happy to render a Montague-school currying of the predicate calculus expressions, with full recursive definitions in set theoretic terms.
__

The argument is constructed this way in an attempt to demonstrate that Best Buy employees are very often articulate, well-informed, and capable of offering logically/scientifically up-to-par advice and analysis.

I am a Magnolia Pro, and I am not going to be shy about this: in most respects, I am damned clever. Calling me and all of the other sharp BB employees "fools" is stupid.

Quote:

Even in liquid nitrogen, the cable would still be afflicted with random noise. Only at 0 Kelvin [a theoretical ideal temperature at which all atomic motion is supposed to cease] can the noise be zero.

To go a bit more in depth: I brought up liquid nitrogen because I was speaking of superconductors. I did not bring up absolute zero because it was not relevant. The point of the superconductors was to talk about a theoretical zero-resistance cable. That removes cable distance from the equation, as resistance increases with distance and slowly reduces the amplitude of a signal. However, since we're on the topic of absolute zero: even if we could reach it (though that is impossible), it affects the molecular level, not the atomic level, or else electricity would cease at or near absolute zero. But the opposite is true; the closer we get to absolute zero, the better a material conducts.
But, back on topic: while it is impossible to make a completely noise-free cable, it is highly possible to create a cable that is 100% accurate at transferring its data. Digital transmission, while still using electricity, uses a threshold. If you make the cable superconductive, you then only have to worry about crosstalk, and there are methods to deal with that. Then you have outside induced interference, but that can be dealt with by differential signaling methods. So, with these things added together, you can make a very accurate transfer cable.
This is not practical, though, so we have our regular HDMI and other digital cables. Here, we make cables that meet a minimum of standards for practicality's sake. Now, because we have no superconductive wires, we must take distance into consideration; because of this, digital signals CAN and WILL succumb to interference as their signal voltage level drops. Digital signals are not immune to interference, as I have stated before. However, there is some merit in buying the better (read "better," not more expensive) cables, as they often have better shielding. Still, the cheaper cables will more than likely work for most people.
My point is that you shouldn't sway too far one way or the other. Don't tell people to avoid cheap cables like the plague, always go for the expensive cables, and claim those 1's and 0's are immune to being altered (the annoying "either it gets there or it doesn't" argument). But don't go around telling people that the expensive cables are useless and to always go the cheap route, either. I say it depends on your needs: look at your cable run length, look at how many and what sources of possible interference you may have, etc., then make your choice.
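To put rough numbers on the "signal voltage drops with distance" point, here is a toy model (the attenuation constant and noise figure are invented purely for illustration): the same cable that has plenty of margin on a short run falls below the decoding threshold on a long one.

```python
# Toy sketch: resistance/attenuation shrinks the gap between high and
# low levels as cable length grows, eating into the noise margin.
# alpha (per-meter attenuation) and the noise level are made up.
import math

def received_swing(length_m, alpha=0.03):
    """Peak-to-peak swing after exponential attenuation over length_m."""
    return 1.0 * math.exp(-alpha * length_m)

def link_ok(length_m, noise=0.2, alpha=0.03):
    """Decodable iff half the remaining swing still exceeds the noise."""
    return received_swing(length_m, alpha) / 2 > noise

assert link_ok(2)        # short run: plenty of margin
assert not link_ok(50)   # long run: swing too small, bits at risk
```

Same cable, same noise; only the run length changed. That is why distance matters for "immune" digital signals.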

Quote:

I am a Magnolia Pro, and I am not going to be shy about this: in most respects, I am damned clever. Calling me and all of the other sharp BB employees "fools" is stupid.

I agree that stereotyping all employees of BB as "fools" is not right, logically or otherwise. However, the great majority of them seem to be misinformed and illogical (from personal experience). Sure, there are some bright ones here and there, but IMO, if they don't know their stuff, why the hell do you have them out there? Maybe it's just hard to find good people, but why would you place someone who obviously does not know what they're talking about in that section at BB? I feel it's as much a BB management problem as it is an employee one. And while you may take offense at what is said about BB and its employees, you cannot fault someone for sharing their experience. It just so happens that the majority of the BB employees that we here run into are inexperienced and ill-informed or uninformed.

Quote:

To go a bit more in depth: I brought up liquid nitrogen because I was speaking of superconductors. I did not bring up absolute zero because it was not relevant. The point of the superconductors was to talk about a theoretical zero-resistance cable. That removes cable distance from the equation, as resistance increases with distance and slowly reduces the amplitude of a signal. However, since we're on the topic of absolute zero: even if we could reach it (though that is impossible), it affects the molecular level, not the atomic level, or else electricity would cease at or near absolute zero. But the opposite is true; the closer we get to absolute zero, the better a material conducts.
But, back on topic: while it is impossible to make a completely noise-free cable, it is highly possible to create a cable that is 100% accurate at transferring its data. Digital transmission, while still using electricity, uses a threshold. If you make the cable superconductive, you then only have to worry about crosstalk, and there are methods to deal with that. Then you have outside induced interference, but that can be dealt with by differential signaling methods. So, with these things added together, you can make a very accurate transfer cable.
This is not practical, though, so we have our regular HDMI and other digital cables. Here, we make cables that meet a minimum of standards for practicality's sake. Now, because we have no superconductive wires, we must take distance into consideration; because of this, digital signals CAN and WILL succumb to interference as their signal voltage level drops. Digital signals are not immune to interference, as I have stated before. However, there is some merit in buying the better (read "better," not more expensive) cables, as they often have better shielding. Still, the cheaper cables will more than likely work for most people.
My point is that you shouldn't sway too far one way or the other. Don't tell people to avoid cheap cables like the plague, always go for the expensive cables, and claim those 1's and 0's are immune to being altered (the annoying "either it gets there or it doesn't" argument). But don't go around telling people that the expensive cables are useless and to always go the cheap route, either. I say it depends on your needs: look at your cable run length, look at how many and what sources of possible interference you may have, etc., then make your choice.

Are you kidding me? Crosstalk and interference are structured info; they can be eliminated completely through training [remember those bulky noise-cancelling headphones]. Random noise you cannot do away with; there is no way a digital cable can transmit 100% correct bits over any length of wire without sufficient coding. A perfectly transferable cable cannot be guaranteed, as the display source/receiver do not incorporate error correction capabilities.

Quote:

Are you kidding me? Crosstalk and interference are structured info; they can be eliminated completely through training [remember those bulky noise-cancelling headphones]. Random noise you cannot do away with; there is no way a digital cable can transmit 100% correct bits over any length of wire without sufficient coding. A perfectly transferable cable cannot be guaranteed, as the display source/receiver do not incorporate error correction capabilities.

You need to do some learning in electrical engineering; especially, look up differential signaling. My point is that if you use an electrical encoding scheme (like Manchester encoding), combined with sufficient shielding applied in the right way, combined with a differential signaling method, and without distance as part of the equation, you can make a really accurate cable, short of some catastrophic failure. Heck, you could even do high-voltage differential signaling on a superconductive cable, over a short distance, with good shielding... it would be near impossible to induce any meaningful noise into the signal unless you really tried. The point is that you don't need to get rid of all the noise; what you can do is raise the noise immunity the cable system has.

"A perfectly transferable cable cannot be guaranteed as the display source/receiver do not incorporate error correction capabilities."
That is not entirely true either. You could easily have something like a buffer for the signal feed. This would allow plenty of time for some ECC to be practical. Streaming video does this.
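The buffer-plus-ECC idea can be sketched with the simplest possible code, a 3x repetition code with majority voting (real systems use much stronger codes like Reed-Solomon; this just shows the principle that a bit corrupted on the wire can be corrected at the receiver):

```python
# Toy sketch of buffered error correction: expand each bit to three
# copies before the link, majority-vote after it, so any single
# flipped copy per bit is corrected.

def ecc_encode(bits):
    return [b for b in bits for _ in range(3)]    # send each bit 3 times

def ecc_decode(coded):
    out = []
    for i in range(0, len(coded), 3):
        triple = coded[i:i + 3]
        out.append(1 if sum(triple) >= 2 else 0)  # majority vote
    return out

frame = [1, 0, 1, 1]
coded = ecc_encode(frame)
coded[4] ^= 1                      # one corrupted bit on the wire
assert ecc_decode(coded) == frame  # corrected at the receiver
```

The cost is bandwidth (3x here), which is exactly why a buffer helps: it buys the time and headroom to afford the redundancy.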

Quote:

You need to do some learning in electrical engineering; especially, look up differential signaling. My point is that if you use an electrical encoding scheme (like Manchester encoding), combined with sufficient shielding applied in the right way, combined with a differential signaling method, and without distance as part of the equation, you can make a really accurate cable, short of some catastrophic failure. Heck, you could even do high-voltage differential signaling on a superconductive cable, over a short distance, with good shielding... it would be near impossible to induce any meaningful noise into the signal unless you really tried. The point is that you don't need to get rid of all the noise; what you can do is raise the noise immunity the cable system has.

"A perfectly transferable cable cannot be guaranteed as the display source/receiver do not incorporate error correction capabilities."
That is not entirely true either. You could easily have something like a buffer for the signal feed. This would allow plenty of time for some ECC to be practical. Streaming video does this.

Well, I would consider a BB calibration if I knew FOR SURE they could correct the color decoder on my Panny 42PX80U. I've adjusted the settings the best I can with my DVE disc and it looks great, but I can tell the decoder is off. The reason I'm skeptical that they'd be able to correct the issue is that the service menu doesn't seem to have an option to adjust the reds or greens.

Are we speaking of differential encoding or differential signaling? They're not the same, as far as my understanding goes. I was speaking of differential signaling. With differential signaling, you take a signal and transmit it through two wires (generally twisted together): one wire carries the signal in one phase, and the other wire carries it in the opposite phase. When the signal gets to the device, the difference is taken to recover the actual signal. This allows a large rise in noise immunity, as any noise that enters one wire also enters the other (for the most part), and when the difference is taken, the noise is pretty much subtracted out.
Although, I admit I might have exaggerated when I called it a "perfect" cable, as perfect is hard to prove. I would like to restate that as "almost perfect" under the above-stated conditions.

"over any cable then"
But I never stated it would be possible with "any" cable, only the theoretical one I described in my post above.
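The subtraction trick is easy to show in toy code. Here the same made-up interference lands on both wires of the pair, and taking the difference at the receiver cancels it exactly (real coupling is only approximately common-mode, so cancellation is merely very good, not perfect):

```python
# Toy sketch of differential signaling: the signal is driven in opposite
# phases on two wires; the receiver subtracts them. Noise coupled onto
# both wires (common-mode) cancels in the subtraction.

signal = [0.0, 1.0, 1.0, 0.0, 1.0]       # logical levels to send
noise  = [0.4, -0.3, 0.2, 0.5, -0.1]     # interference hitting both wires

wire_pos = [s + n for s, n in zip(signal, noise)]    # in-phase wire
wire_neg = [-s + n for s, n in zip(signal, noise)]   # opposite-phase wire

# Receiver takes half the difference: noise subtracts out, signal remains.
recovered = [(p - n) / 2 for p, n in zip(wire_pos, wire_neg)]

assert all(abs(r - s) < 1e-9 for r, s in zip(recovered, signal))
```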

Quote:

Are we speaking of differential encoding or differential signaling? They're not the same, as far as my understanding goes. I was speaking of differential signaling. With differential signaling, you take a signal and transmit it through two wires (generally twisted together): one wire carries the signal in one phase, and the other wire carries it in the opposite phase. When the signal gets to the device, the difference is taken to recover the actual signal. This allows a large rise in noise immunity, as any noise that enters one wire also enters the other (for the most part), and when the difference is taken, the noise is pretty much subtracted out.
Although, I admit I might have exaggerated when I called it a "perfect" cable, as perfect is hard to prove. I would like to restate that as "almost perfect" under the above-stated conditions.

"over any cable then"
But I never stated it would be possible with "any" cable, only the theoretical one I described in my post above.

You don't need differential signaling for this; you just need a balanced line. A balanced line need not be differential, but it can be.

So this may be off topic (it's about calibration in this anti-BB HDMI cabling thread).

From what I've been able to ascertain, a calibrated TV actually uses more power than an uncalibrated TV.

It's inaccurate to make a blanket statement that a display will use less/more energy after calibration.
On many lamp-based displays, you can get a great calibration but never touch any lamp power settings. So, in that case, your power draw will be the same before and after. In other cases, with plasma, it may be more or less, but there is no guarantee which it will be, and on some of the plasmas I've seen, power savings will only be realized if you touch the "power savings" options, which puts them in the same place as lamp-based displays. LCD displays, purely the panel-based ones, will only see a power difference if you boost or lower the backlight... and again, there is no guarantee that setting will be altered at all during a calibration.
The whole "power savings" thing is just a selling point; it's only a possibility, not a guarantee.

I would say that, given the large majority of owners who currently watch their display in "torch" mode, power savings will be achieved through calibration. Especially with regard to plasma display panels and, to a slightly lesser extent, LCDs.
Not to mention extended operating life.

Quote:

I would say that, given the large majority of owners who currently watch their display in "torch" mode, power savings will be achieved through calibration. Especially with regard to plasma display panels and, to a slightly lesser extent, LCDs.
Not to mention extended operating life.

I went ahead and let BB calibrate my 42PZ700U. I must say I was impressed. The tech was on the job for 3-4 hours and used the service menu to do it. He knew what he was doing, and he even offered to calibrate my smaller LCD (26g40) for only an additional $50. I'm sure that not all BB techs are anywhere near this good, and some might be downright horrible, but I was impressed. In Indianapolis, it was worth it to me to have BB do the job. Now if they could only figure out how to stop BS-ing people on cables and their protection plans.

Many TVs now offer the necessary controls for calibration in the user menus. However, many also still limit such adjustments to a service-level menu. The reliability of information heard at Best Buy is probably slightly less than what can be encountered on public forums or in the media. Cross-check everything to see whether it's opinion, a misinterpretation of the facts, or solidly founded on imaging science and display industry standards and recommended practices.

Best Buy fills a legitimate need in the marketplace. The electronics-consuming masses consider price first and foremost, with convenience a close second. They will suffer the abuse of third-rate advice and third-rate performance for the sake of saving a few bucks and/or getting it at the local big-box store (e.g., a Vizio). The pursuit of excellence is not valued by the general populace.

"Nobody ever went broke underestimating the taste of the American public." H. L. Mencken

Excellence and expertise at Best Buy are the exception rather than the rule, a notch better than Walmart or Costco. Some employees do try harder, particularly in the Magnolia departments. They still don't pay very well, because of the harsh realities of internet pricing and their brick-and-mortar overhead.

Which TVs (plasma) offer the necessary controls for calibration in the user menu? Thanks.

Of course, the freakin' CE manufacturers could easily avoid the whole problem by putting in a menu item called "Standard" and actually having it calibrated to the proper D65 grayscale, gamut, color matrix, et al. But nooooo!

You can't do it at the factory, because a proper calibration has to be done in the room, under the same general conditions in which you watch that TV.


This is not correct. Anyone who knows what they are doing can calibrate a display. I have my Samsung LN-T4661F completely D65-calibrated in the service menu, so I don't have to touch anything but the backlight in the user menu. Granted, that model doesn't have anything beyond grayscale controls, so you can't set the gamut on the TV, but to say that your average Joe can't "completely" calibrate a TV is patently false.

So, if you're leaving settings untouched that may or may not be at D65 ref levels, then how are you 100% calibrating it?


Quote:

I went ahead and let BB calibrate my 42PZ700U. I must say I was impressed. The tech was on the job for 3-4 hours and used the service menu to do it. He knew what he was doing, and he even offered to calibrate my smaller LCD (26g40) for only an additional $50. I'm sure that not all BB techs are anywhere near this good, and some might be downright horrible, but I was impressed. In Indianapolis, it was worth it to me to have BB do the job. Now if they could only figure out how to stop BS-ing people on cables and their protection plans.

Good to hear. I got a deal on a Best Buy calibration but am somewhat worried about how good a job it will be. This somewhat eases my fears.

It completely depends on who you get. Some of their calibrators are likely very good within the limitations of the system and their equipment, which is adequate for doing a basic grayscale on a Panasonic.

Think of Best Buy like fast food. Don't expect too much and you will not be surprised.