AMD's monthly Catalyst video card driver updates are a little late this month, but there are a few Halloween treats in store for the faithful. The unified desktop Catalyst 11.10 drivers enable support for Eyefinity 5x1 display configurations in portrait and landscape modes. The maximum supported resolution has been increased to 16000 x 16000 pixels across the entire AMD Radeon HD 6000 series, though only in DirectX 11 applications. Additionally, bezel compensation is now possible when using sets of displays with mismatched pixel densities.

Numerous fixes for Battlefield 3 were introduced in the Catalyst 11.10 V3 preview driver made available earlier this month. Those fixes have been incorporated into the final release, along with several fixes for OpenGL games.

CrossFire users should be sure to download the 11.9 CAP4 Catalyst Application Profiles as well. CAP4 resolves stuttering issues in CrossFire mode and improves scaling for Battlefield 3. It also improves CrossFire performance in Deus Ex Missing Link, F1 2011, Dead Island, and the DX9 version of Arkham City.

CRTs have higher refresh rates than LCD panels and lower pixel response (G2G) times, resulting in less motion blur. A few articles out there show that an LCD user can be roughly three frames behind a CRT user. But are your reflexes sharp enough to take advantage of it?

The Sony FW900, if you can still find it, is an incredible display. 24", 16:10, 2304x1440@80Hz (or 1080p@240Hz), plus you can calibrate it beyond what most LCDs are capable of in terms of color accuracy.

Once you experience raw, high-resolution content beyond the 60Hz LCD barrier, it's hard to go back. Sure, it weighs about 100lbs, but it doesn't need to move. Take an LCD to LAN parties.

Geometry and blooming issues are for cheap displays with inadequate power supplies. Get a good-quality flat tube and nothing compares for picture quality. LCD is good for power consumption, depth/weight, generally better antiglare availability, and larger sizes. CRT, however, will always reign supreme, with contrast ratios people drool over someday having.

Grampy's 14" B&W Magnavox might not compare, but apples to apples? Oh, how quickly the CRT has faded from everyone's mind.

Photographers are better off with CRTs as well. Cheap panels only provide 6-bit color, regardless of the million-plus colors they claim on the box. Wide-gamut flat panels are expensive and virtually always require calibration.

You guys are whack... lol. I guess if you "prefer" a bulky and very outdated technology on your "1337" station, then whatever. However, they do make professional LCD panels with much higher bit depth than 6 bits; 6-bit is what's in your average cheap monitor. You're forgetting that professional displays have been in production for years for the medical industry, scientific work, and movie production. They are made by pretty much every major PC company (Dell, NEC, and HP, to name a few) and are now pretty cheap.

In fact, just buying an Apple IPS Cinema Display will give you better color reproduction than pretty much any CRT short of an ultra-high-end model. Also of note: CRTs might have had great color reproduction, but don't forget that they are not energy efficient and have pixel jaggies (good LCDs generally don't). Another thing: your visual cortex can't tell the difference beyond 24fps anyway, making any argument about higher refresh rates null and void, unless you're dealing with eye strain, which differs from person to person. So alas, I'm not really buying into any of these arguments. I personally have a Samsung 950 TN panel that is 120Hz, and it's the clearest picture I've ever seen in my life, period. I used to work with high-end NEC CRTs and they don't come close, IMO.

No sir. Check out reviews of these "3D ready" 120Hz-capable monitors and the testers are all raving about how smooth the desktop becomes. I can easily see the difference between 100Hz and 120Hz (with framerates to match) in a blind test (blind as in I don't know what the monitor is set to). At those high speeds not everyone can see a difference, but everyone can at 60 vs. 120 (or at least everyone I've heard about).

quote: I personally have a Samsung 950 TN panel that is 120hz and it's the clearest picture I've ever seen in my life, period.

And you haven't discovered the improved desktop experience and solid-looking gaming that 120Hz can provide? Maybe you haven't set your desktop to 120Hz, or perhaps Windows is rendering the desktop at 60fps.

I ran Fraps and found, using the display's 120Hz mode, that once the framerates were up above 80 there is an amazing solidity and 3D-like quality to the gameplay. (I have no interest in getting 3D glasses at present.) Once the framerate hits 100+, well, the effect has to be experienced to be understood. http://hardforum.com/showthread.php?t=1486357&page...

If the display has a frame interpolation or frame creation feature (MotionFlow, Auto Motion Plus, etc.), it will take 24fps material such as Blu-ray and effectively turn it into 120fps or 240fps video. The effect is so noticeable to humans that it has a name: "the soap opera effect."
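The idea of frame creation can be sketched in a few lines. Real TV interpolation engines estimate motion vectors between frames; the plain linear cross-fade below is only an illustration of synthesizing in-between frames (the function name and list-of-pixels frame format are my own assumptions, not any vendor's API):

```python
def interpolate_frames(frame_a, frame_b, n_intermediate):
    """Generate n_intermediate cross-faded frames between two source frames.

    Frames are flat lists of pixel intensities. Real motion interpolation
    (MotionFlow and the like) uses motion-vector estimation instead of a
    simple blend; this is just a sketch of the concept.
    """
    frames = []
    for i in range(1, n_intermediate + 1):
        t = i / (n_intermediate + 1)  # blend weight, 0 < t < 1
        frames.append([(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)])
    return frames

# 24fps -> 120fps needs 4 synthesized frames between each source pair
dark = [0.0, 0.0, 0.0, 0.0]
bright = [100.0, 100.0, 100.0, 100.0]
mids = interpolate_frames(dark, bright, 4)
print(len(mids))   # 4
print(mids[0][0])  # 20.0 - first step toward the bright frame
```

Because the synthesized frames carry motion the original 24fps capture never had, the result looks hyper-smooth, which is exactly the "soap opera" look.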

Humans can perceive beyond 24Hz, whether the frames are real or interpolated. While the human eye-brain interface may not be able to interpret 240 individual images per second, the effects of a 240Hz display (a more stable image, enhanced clarity, reduced judder) are all noticeable to the human brain. To quote you:

quote: I personally have a Samsung 950 TN panel that is 120hz and it's the clearest picture I've ever seen in my life, period.

However, you should know that while your panel operates at 120Hz like any other 120Hz TV, it is simply refreshing the native 60Hz digital signal from your computer twice as fast. With a CRT over analog VGA, 120fps can actually be realized at 120Hz. HDMI can't transmit more than 60Hz (75Hz in some cases).

24fps was chosen as the standard for film because it was the most effective at conveying motion and sound at the lowest cost. Nearly all film projectors use shutter devices that display each frame two (48Hz) or three (72Hz) times. Of course, modern TVs with 120Hz or 240Hz (or 480Hz, I guess) panels display each frame five or ten times, or synthesize new frames using the frame creation modes I mentioned above.
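The repeat counts above are just integer division of refresh rate by frame rate, and the arithmetic also shows why 24fps on a 60Hz display judders: 60 isn't a multiple of 24, so frames must alternate repeat counts (3:2 pulldown). A small check, with a helper name of my own invention:

```python
def frame_repeats(refresh_hz, fps):
    """How many refreshes each source frame occupies, if it divides evenly.

    Returns None when the refresh rate is not an integer multiple of the
    frame rate, meaning repeat counts must alternate (e.g. 3:2 pulldown),
    which is the source of judder.
    """
    if refresh_hz % fps == 0:
        return refresh_hz // fps
    return None

for hz in (48, 72, 120, 240):
    print(f"{hz}Hz shows each 24fps frame {frame_repeats(hz, 24)} times")
print(frame_repeats(60, 24))  # None: 60Hz can't repeat 24fps frames evenly
```

The even multiples (2x, 3x, 5x, 10x) are why 72Hz projection and 120Hz/240Hz panels can show film without pulldown judder, while a plain 60Hz display cannot.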

Also, jaggies appear on CRTs only because CRTs don't have a fixed resolution. The difference is as clear as digital versus analog. The weakness isn't the CRT, but the device sending the signal. If you push a high enough resolution, the jaggies go away because your eye can no longer see them. If you feed any fixed-pixel display (plasma, LCD, etc.) a resolution below its native resolution, picture quality will be degraded and you will see either jaggies or smudging from the required upconversion. CRTs use an analog-driven continuous electron beam to energize phosphors and draw the image on the screen, which is why CRTs inherently scale so much better than digital displays.

As an owner of several LCDs and plasmas, I'm no hater of digital displays (I love 'em), but I also know from experience most of their strengths and weaknesses.

No. His Samsung 950 is an NVIDIA 3D Vision-ready device that can be used in conjunction with shutter glasses. This means 120Hz, alternating 60Hz for each eye, from the computer over a dual-link DVI-D cable to the display.

By saying "clearest," the poster doesn't make it clear whether he meant sharp colors, good black levels, good contrast, vertical viewing angle, or reduced blur and sample-and-hold smearing thanks to fast crystal switching and the 120Hz refresh rate.

Only if he's using DVI instead of HDMI. We don't know if he is or not, because there are two 950s: both are 120Hz panels, but one has dual-link DVI for NVIDIA 3D Vision (S27A950) while the other only has HDMI (T27A950).

So, depending on which one he has, he could enjoy 120fps @ 120Hz in 2D. However, if he's using it for 3D with shutter glasses, he will still be limited to 60Hz (per eye), and thus 60fps of actual gameplay.

Dead wrong. Your visual cortex can make a smooth image out of 24fps, but it can definitely benefit from more. Besides, someone trained on and used to 100fps will see 24fps as a slide show. To me, movies look very jittery.

OK, so do this if you don't believe me: install XBMC and turn the visualization from 24 to 60fps.

I learned that lesson the hard way when I bought a 27-inch LG monitor. I thought the picture was sharp, bright, and vibrant, but for the first three weeks of owning it I felt something was odd.

This all changed when I opened a .jpeg in paint.net on it. I observed something striking: when I zoomed into the image all the way, to where each pixel was a large box on my screen, the monitor was dithering the box! Individual pixels shouldn't be dithered, but they were.

I then promptly took the monitor back to the store, just in time for Fry's to honor their return policy. This is why I make most of my high-impact hardware purchases locally: just in case I find something silly like this with the things I buy.

quote: Photographers are better off with CRTs as well. Cheap panels only provide 6 bit color, regardless of the million+ colors they claim on the box. Wide gamut flat panels are expensive and virtually always require calibration.

It may get you close, but still not quite there. My 24" BenQ LCD is still way behind my old ViewSonic CRT. But that's all rather small compared to the difference my crappy internet makes here; my old cable internet made a much, much bigger difference.

Input delay CERTAINLY is still an issue; it is just LESS of an issue. When I decided to get my Panasonic plasma TV last year, a large determining factor in my skipping a brand-new LG plasma on sale was its at-times nearly 200ms input lag. While many good TVs have fast, quality displays, you should certainly do your research if you are a gamer at all.

Back when I used to use CRTs, I would constantly get eye strain. Sometimes it only took an hour or two. Once I moved to LCDs (even my first one, which was a crappy Acer model), the eye strain never returned. Even at work, when I stare at my 27 inch iMac for 8-12 hours each day, it never bothers me.