bthylafh wrote:LCD panels don't flicker like CRTs do. I can see 60 Hz flicker; I can even see the difference between 75 Hz and 85 Hz. Anything under 85 gives me a headache, especially when the tube is displaying something bright and/or contrasty.

I'm glad that we've gotten rid of almost all of our CRTs at work because of this.

I can see 60 Hz flicker too, I'm just not bothered by any refresh rate between 60 and 85. Progressive 50 Hz or funny things like that start to get really wearying on my eyes, but then again, I don't use stuff like that; I just checked them out on the monitor for the sake of the experience.

My desktop runs at 80 Hz, however (which actually works out okay for all movies too, no vsync), and I normally play games between 104-168 Hz (the monitor is incapable of more). All that with stellar image "correctness" (I also have regular gamma correction set up) and no ugly interpolation even if game X needs a different resolution to run well. The experience couldn't be any better for my uses.

Wtf is with the dude making blanket statements about how "LCDs" are ghosty as hell, etc.? What a load of bollocks. I went from a hi-res CRT to the Samsung 206BW, then to the U2410 I have now, and there's no ghosting playing BF2, whether flying jets at high speed or twitching around during close-in infantry fights. Get freaking current, dude, or post the worn-out FUD elsewhere.

I'd say LCDs have gotten much, much better, but there's still some ghosting involved, particularly in high-contrast and dark scenes. Speaking just for TNs in the 2-5 ms range, though, it's a non-issue. It's a noticeable aberration to be sure, but it won't affect gameplay. Definitely not nearly as bad as my Samsung 204b was when I mirrored it with a LaCie 19" Trinitron (now dead and buried). Blur city there.

Beyond that, I will say that I can notice any refresh rate below 85 Hz on CRTs, and I much prefer 100 Hz+.

I don't know how any of you can seriously suggest anything other than a TN panel for gaming. I've had my hands on all types and many (especially CS surf maps) showed terrible ghosting and/or input lag. TN panels are the only ones that are passable. I eventually settled on Sony's GDM-FW900 flat screen CRT. You won't find a better gaming monitor. 2304x1440 @ 85Hz can't be beat by any of the flat panel varieties.

Dashak wrote:I don't know how any of you can seriously suggest anything other than a TN panel for gaming. I've had my hands on all types and many (especially CS surf maps) showed terrible ghosting and/or input lag. TN panels are the only ones that are passable. I eventually settled on Sony's GDM-FW900 flat screen CRT. You won't find a better gaming monitor. 2304x1440 @ 85Hz can't be beat by any of the flat panel varieties.

You're mad. My E-IPS 2209WA doesn't have any ghosting that I truly notice, while gaming or otherwise, and the color reproduction is worlds better than any TN screen.

Bensam123 wrote:So this may sound like blasphemy, as everyone has been creaming their pants over bigger and bigger LCD displays over the last few years, and two years ago I gave in myself and upgraded to a 23" LG. I had a secondary 19" Samsung as well, and after all this time I'm starting to have my doubts. Originally I didn't upgrade because I'm a hardcore gamer, and the two things I enjoyed most about CRTs over LCDs were the unbeatable response rate and contrast ratios.

Even to this day I'm still seeing these issues (ghosting, crappy contrast, seeing pixels shift). I've used a few different LCDs, and my friends' LCDs, and I've found myself regretting switching. I do enjoy having a widescreen LCD, I will not lie about that, but overall I find myself missing CRT. As expected, the CRT market has dried up and I have no idea where to go to compare new ones or even find out if there have been new ones released. So, I'm curious if anyone still lurks in this market or knows about any new technology?

Higher refresh rates and LED-backlighting are able to counter both of your response rate and contrast ratio complaints. CRTs used to be made that could operate at 120Hz. My Dell Dimension 350V's 17" Trinitron Screen could do 85Hz. It had no ghosting, but because the monitor alternated between an image and no image at a given frequency, it tended to produce headaches, especially at lower refresh rates. The high contrast was because the laser in the back of the monitor would be off for pixels that were supposed to be black. With LED backlighting, individual LEDs can be disabled to enhance contrast, which has a similar effect.

Dude, LCDs still suck for gaming. At best, you just get used to the ghosting, but it's still there (and I had a 2 ms TN panel). TN panels have other issues as well, and IPS is too slow. If you ever have a dark scene, you have to deal with inconsistent backlighting, backlight bleed, and blacks that are dark gray at best. Larger and larger LCDs suck because you need more powerful video cards to render at native resolution and still get decent frame rates (depends on the game, though).

This is why I do research on monitors and televisions before buying them. Everyone seems to like the monitors I buy. The review site I visit is TFT Central.

Bensam123 wrote:Aye, I know there are a lot of used monitors lying around. If I were to get one, it would be a 21", as my last one was a 19". I still have it, an NEC MultiSync FE992. I was just wondering if there is a niche brand still manufacturing better CRT monitors.

Yes, I looked through LCD panels before I bought one. It was a trade-off: speed, or picture clarity and crispness. I, being a gamer, chose a TN panel instead of an IPS. I've looked at lots of different panels and played around with my friends' as well. They all still have crappy contrast ratios compared to a CRT, and if your eye is trained enough you can see the crystals shifting forms; it creates a water-ripple effect that moves across the screen in areas that change. It is especially noticeable when entire 'slabs' of the environment are shifting across it with some sort of noticeable texturing.

All of the LCDs still suffer from those inherent flaws. They seemingly stopped improving response times when they hit about 2 ms (which is just some hyped-up form of GtG and usually a lie), and contrast ratios don't seem to have improved all that much either. They attempted to with dynamic contrast, but I HATE that with a passion, as it messes with your eyes if you have decent eyesight and notice brightness changes. Speed and contrast are the two things I liked most in any sort of monitor. If plasma screens were small enough and had the resolution of an LCD, I'd switch to one of those.

Aperture grille and shadow mask monitors both have their ups and downs.

I was just curious if anyone was still into the technology or knew of any sources. No, I don't want you to dump on me the used monitors you won't take to the recycling plant because it costs money.

My laptop has a TN panel. The color reproduction is awful even with proper calibration; without it, the screen is simply painful to look at. Do your research on internet review sites to find out what panels are being used and how they compare before you buy an LCD screen. My general rule of thumb that you can follow is that any monitor that uses a non-TN panel made by Samsung is good quality. So far all of the reviews I have read support that rule.

Disclaimer: I over-analyze everything, so try not to be offended if I over-analyze something you wrote.

Shining Arcanine wrote:My general rule of thumb that you can follow is that any monitor that uses a non-TN panel made by Samsung is good quality. So far all of the reviews I have read support that rule.

Color reproduction is close to an IPS's, although that's not their strength. S-PVA panels have very good black levels, and when overdriven (which introduces input lag) they exhibit very little ghosting. These wind up in Samsung TVs. Thing is, input lag is a no-no, and so S-PVA panels are a no-go for gaming. Won't do it.

Either live with a TN or a CRT, or find an IPS that has very little or very fast pre-processing. 30" IPS panels have very little, which makes them my personal 'holy grail'.

Shining Arcanine wrote:My general rule of thumb that you can follow is that any monitor that uses a non-TN panel made by Samsung is good quality. So far all of the reviews I have read support that rule.

Color reproduction is close to an IPS's, although that's not their strength. S-PVA panels have very good black levels, and when overdriven (which introduces input lag) they exhibit very little ghosting. These wind up in Samsung TVs. Thing is, input lag is a no-no, and so S-PVA panels are a no-go for gaming. Won't do it.

Either live with a TN or a CRT, or find an IPS that has very little or very fast pre-processing. 30" IPS panels have very little, which makes them my personal 'holy grail'.

When I played World of Warcraft, I was able to do raids on my Dell 2405FPW without any issues. My main concern was having it calibrated properly, because poor color reproduction makes it difficult to identify important details. Failing to recognize details can be very bad when you are raiding.

Disclaimer: I over-analyze everything, so try not to be offended if I over-analyze something you wrote.

Shining Arcanine wrote:Higher refresh rates and LED-backlighting are able to counter both of your response rate and contrast ratio complaints. CRTs used to be made that could operate at 120Hz. My Dell Dimension 350V's 17" Trinitron Screen could do 85Hz. It had no ghosting, but because the monitor alternated between an image and no image at a given frequency, it tended to produce headaches, especially at lower refresh rates.

Refresh rate alone does not determine pixel response time. The LCD elements themselves have a response time, and take a certain number of milliseconds to respond after they are refreshed.
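To put rough numbers on that, here's a toy additive model (the figures and the idea that the two delays simply add are illustrative assumptions, not measurements of any real panel):

```python
# Toy model: refresh interval and liquid-crystal response time are
# independent contributions to how long a transitioning pixel can
# show a stale or intermediate color. All numbers are illustrative.

def frame_interval_ms(refresh_hz):
    """Time between refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

def worst_case_transition_ms(refresh_hz, pixel_response_ms):
    """A pixel can't settle faster than its own response time,
    no matter how often the panel is refreshed."""
    return frame_interval_ms(refresh_hz) + pixel_response_ms

# Even a 120 Hz panel smears if the cells themselves are slow:
print(frame_interval_ms(60))                # ~16.7 ms per frame at 60 Hz
print(worst_case_transition_ms(120, 16.0))  # ~24.3 ms with 16 ms cells
```

So a faster refresh rate shrinks only the first term; the second is a property of the panel itself.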

The high contrast was because the laser in the back of the monitor would be off for pixels that were supposed to be black.

CRTs don't use lasers. In fact, the invention of the CRT predates that of the laser by over half a century.

With LED backlighting, individual LEDs can be disabled to enhance contrast, which has a similar effect.

Some LCDs dim the entire backlight down for darker scenes.

I'm not aware of any that dim or disable individual LEDs to enhance contrast within a single high contrast scene; has someone started doing this? It sounds like it would be very difficult to do this properly, without causing a perception of uneven illumination.

The years just pass like trains. I wave, but they don't slow down.-- Steven Wilson

just brew it! wrote:I'm not aware of any that dim or disable individual LEDs to enhance contrast within a single high contrast scene; has someone started doing this? It sounds like it would be very difficult to do this properly, without causing a perception of uneven illumination.

Look up "local dimming". It was done in more expensive TVs, and the results are pretty good. It got close to plasma, but it made the units more expensive than the plasmas. These days, "LED backlight" mostly means edge-lit.
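For the curious, the basic idea can be sketched like this (a deliberately naive per-zone scheme I'm assuming for illustration; real dimming controllers are far more sophisticated about halos and temporal smoothing):

```python
# Naive zone-based local dimming sketch: drive each backlight zone
# only as bright as its brightest pixel requires, then rescale the
# pixel values so backlight * pixel reproduces the target luminance.

def local_dim(zone_pixels):
    """zone_pixels: list of target luminances in [0.0, 1.0] for one zone.
    Returns (backlight_level, compensated_pixel_values)."""
    backlight = max(zone_pixels) if zone_pixels else 0.0
    if backlight == 0.0:
        # Zone is entirely black: switch its LEDs off for true black.
        return 0.0, [0.0] * len(zone_pixels)
    return backlight, [p / backlight for p in zone_pixels]

# A dark zone gets a dim backlight, deepening the blacks:
level, pixels = local_dim([0.0, 0.05, 0.1])
print(level)  # 0.1 -- backlight driven to 10%
```

The contrast win comes from the off/dim zones; the obvious failure mode is a single bright pixel forcing a whole zone up, which is the "halo" people complain about.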

There was a front-page article about LED local dimming from CES a couple of years back, I think (or three?).

The Model M is not for the faint of heart. You either like them or hate them.

Shining Arcanine wrote:The high contrast was because the laser in the back of the monitor would be off for pixels that were supposed to be black. With LED backlighting, individual LEDs can be disabled to enhance contrast, which has a similar effect.

You must mean the electron gun, which is a standard feature of CRT technology but is definitely not a laser. True enough, it doesn't energize pixels that are intended to be black, although LED local dimming is hardly the only modern technology that offers a similar effect -- DLP illumination comes close (corrupted mainly by some washover effects from adjacent screen area) and plasma does it natively.

Shining Arcanine wrote:My Dell Dimension 350V's 17" Trinitron Screen could do 85Hz. It had no ghosting, but because the monitor alternated between an image and no image at a given frequency, it tended to produce headaches, especially at lower refresh rates.

As others already pointed out, you're full of garbage through and through, but nobody has pointed out this particular quote yet. CRTs don't alternate between "image" and "no image". Not sure where you got that from. As far as their logic goes, image display is continuous. I'll be happy to explain it more if you don't get it, however.

Meadows wrote:As others already pointed out, you're full of garbage through and through, but nobody has pointed out this particular quote yet. CRTs don't alternate between "image" and "no image". Not sure where you got that from. As far as their logic goes, image display is continuous. I'll be happy to explain it more if you don't get it, however.

Easy on the trigger, Tex. He's just describing CRT flicker in layman's terms.

Geez, it seems even the most rabid CRT fans refuse to accept the fact that LCDs have, for all intents and purposes, finally matched their CRT predecessors. They also willfully ignore the nasty drawbacks of CRTs: imperfect geometry, sensitivity to EMI, image quality that depends greatly on the RAMDAC implementation on the video card, shadow mask versus aperture grille trade-offs, and mass, volume, power consumption, and heat dissipation issues.

Shining Arcanine wrote:My Dell Dimension 350V's 17" Trinitron Screen could do 85Hz. It had no ghosting, but because the monitor alternated between an image and no image at a given frequency, it tended to produce headaches, especially at lower refresh rates.

As others already pointed out, you're full of garbage through and through, but nobody has pointed out this particular quote yet. CRTs don't alternate between "image" and "no image". Not sure where you got that from. As far as their logic goes, image display is continuous. I'll be happy to explain it more if you don't get it, however.

After a pixel is hit by the electron gun, its brightness follows a mathematical function: it rises and then decays until the next hit, which occurs exactly 1/λ seconds later (where λ is the refresh rate). During that period, the pixel becomes dim. The entire screen behaves that way, and it occurs like a wave, with all pixels behaving the same way but at different phase angles from one another, depending on how far apart they are in the electron gun's scan order. This causes the screen to behave in a strange way, where it essentially goes from light to dark repeatedly.

If you do not believe me, take a picture of a CRT's image with a digital camera, and a picture of an LCD's image with a digital camera. You will find that the picture of the LCD is consistent, while the picture of the CRT is not. You will likely see a lower dark band and an upper bright band. The upper bright band is the pixels the electron gun recently visited, while the lower dark band is where it is heading. If you look at brightness, you will notice that towards the top of the upper band it is about the same as at the bottom of the lower band. That is the fade-out that occurs, and it is why the screen is basically alternating between dark and bright. It is what causes headaches. Higher refresh rates make the phase angles smaller, while lower refresh rates make them larger. With a large enough phase angle, you get pixels that are not only dim but basically black.

Because of the motion of the electron gun, CRTs display a series of pulses of an image rather than a continuous image. If you disagree, I would be interested in hearing how you think CRT technology works. It should make an interesting conversation topic should I make small-talk with a physicist in the future.
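If you want to see the math, here is a toy model of that camera test (the exponential decay and its time constant are illustrative assumptions on my part, not measured phosphor values):

```python
# Toy model of why a camera snapshot of a CRT shows a bright band:
# each scanline's phosphor is excited once per refresh and then
# decays until the gun comes back around. Decay constant is an
# illustrative assumption, not a real phosphor measurement.
import math

def scanline_brightness(line, total_lines, refresh_hz, decay_ms=3.0):
    """Brightness of scanline `line` at the instant the gun finishes
    the bottom line of the frame (exponential decay model)."""
    frame_ms = 1000.0 / refresh_hz
    # How long ago this line was last excited, by its scan position.
    age_ms = frame_ms * (total_lines - 1 - line) / total_lines
    return math.exp(-age_ms / decay_ms)

# At 60 Hz, the top of the frame has decayed far more than the bottom:
print(scanline_brightness(0, 480, 60))    # top of frame: nearly dark
print(scanline_brightness(479, 480, 60))  # bottom: just refreshed, full brightness
```

Raise the refresh rate and the oldest line's age shrinks, so the dark band fades, which is exactly why higher refresh rates flicker less.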

Disclaimer: I over-analyze everything, so try not to be offended if I over-analyze something you wrote.

Shining Arcanine wrote:My Dell Dimension 350V's 17" Trinitron Screen could do 85Hz. It had no ghosting, but because the monitor alternated between an image and no image at a given frequency, it tended to produce headaches, especially at lower refresh rates.

As others already pointed out, you're full of garbage through and through, but nobody has pointed out this particular quote yet. CRTs don't alternate between "image" and "no image". Not sure where you got that from. As far as their logic goes, image display is continuous. I'll be happy to explain it more if you don't get it, however.

After a pixel is hit by the electron gun, its brightness follows a mathematical function: it rises and then decays until the next hit, which occurs exactly 1/λ seconds later (where λ is the refresh rate). During that period, the pixel becomes dim. The entire screen behaves that way, and it occurs like a wave, with all pixels behaving the same way but at different phase angles from one another, depending on how far apart they are in the electron gun's scan order. This causes the screen to behave in a strange way, where it essentially goes from light to dark repeatedly.

If you do not believe me, take a picture of a CRT's image with a digital camera, and a picture of an LCD's image with a digital camera. You will find that the picture of the LCD is consistent, while the picture of the CRT is not. You will likely see a lower dark band and an upper bright band. The upper bright band is the pixels the electron gun recently visited, while the lower dark band is where it is heading. If you look at brightness, you will notice that towards the top of the upper band it is about the same as at the bottom of the lower band. That is the fade-out that occurs, and it is why the screen is basically alternating between dark and bright. It is what causes headaches. Higher refresh rates make the phase angles smaller, while lower refresh rates make them larger. With a large enough phase angle, you get pixels that are not only dim but basically black.

Because of the motion of the electron gun, CRTs display a series of pulses of an image rather than a continuous image. If you disagree, I would be interested in hearing how you think CRT technology works. It should make an interesting conversation topic should I make small-talk with a physicist in the future.

It's good to see you get it after all, because your initial comment had nothing to do with this truth. Normally, CRTs don't alternate between "bright" and "black" (or "image" and "no image", as I put it first), but rather between "bright" and "not so bright", which becomes virtually unnoticeable above a certain frequency. Nevertheless, your original quote made you look ignorant enough that I had to point it out, much like other people pointed out the different kinds of nonsense you also said.

Meadows wrote:It's good to see you get it after all, because your initial comment had nothing to do with this truth. Normally, CRTs don't alternate between "bright" and "black" (or "image" and "no image", as I put it first), but rather between "bright" and "not so bright", which becomes virtually unnoticeable above a certain frequency. Nevertheless, your original quote made you look ignorant enough that I had to point it out, much like other people pointed out the different kinds of nonsense you also said.

There is no nonsense involved. Go outdoors at night after being in a brightly lit room and try to look at the stars. You will see very few, if any, but if you keep looking for a few minutes while your eyes adjust, you will begin to see many more than you did originally. The human eye is not capable of adjusting to sharp changes in brightness, so if you look at a bright source of light and then look at a dim one, you will likely not see the dim one at all. Given the quick transitions between bright and dark on CRTs, there might as well be no image at all after the pixel dims below a certain threshold.

While this effect might become unnoticeable to you at a certain frequency, there are people who claim to be able to detect it at all frequencies used in practice, even at 120 Hz. Such detection is possible because human nerve synapses operate at a rate of approximately 250 Hz, with the nerves firing at random intervals, allowing for very fine-grained detection, which in many cases manifests itself as a headache.

Disclaimer: I over-analyze everything, so try not to be offended if I over-analyze something you wrote.

Bensam123 wrote:So this may sound like blasphemy, as everyone has been creaming their pants over bigger and bigger LCD displays over the last few years, and two years ago I gave in myself and upgraded to a 23" LG. I had a secondary 19" Samsung as well, and after all this time I'm starting to have my doubts. Originally I didn't upgrade because I'm a hardcore gamer, and the two things I enjoyed most about CRTs over LCDs were the unbeatable response rate and contrast ratios.

Even to this day I'm still seeing these issues (ghosting, crappy contrast, seeing pixels shift). I've used a few different LCDs, and my friends' LCDs, and I've found myself regretting switching. I do enjoy having a widescreen LCD, I will not lie about that, but overall I find myself missing CRT. As expected, the CRT market has dried up and I have no idea where to go to compare new ones or even find out if there have been new ones released. So, I'm curious if anyone still lurks in this market or knows about any new technology?

OP, may I ask what you decided to do, and if you're happy with your decision?