
At about the same time, two readers sent in questions about those expensive LCD panels that, if you aren't fortunate enough to own one, leave you envious of anyone who does (I know I am!). These two questions raise some interesting issues which those of you looking to buy one may be asking as soon as you make that purchase. One issue is longevity: how long do those LCD pixels last? The other is cost: why don't LCD manufacturers make lower-cost monochrome LCD screens available for those who don't need to work in full-color glory?


Jack Frost IV asks: "Since higher resolution LCD panels have started to become a lot more common, many people have been complaining about dead pixels. I just received two SGI flat panels direct from the factory, and each shipped with a single dead pixel. In fact, the second display was a replacement for the first (for an issue unrelated to the dead pixel). While I understand the difficulties in manufacturing the displays, the single dead pixel doesn't concern me right now, as I don't notice it often. What bothers me is how many more pixels these displays will lose over a lifetime; say over the next three to five years. I want to replace a lot of my CRTs (yes, *some* of us don't do color-critical work, and LCDs are perfectly OK) with LCD panels, but if, say, a dozen pixels are going to die in 3-5 years, it's going to be quite annoying. SGI claims that once the pixel burns out, you'll never notice it was gone... but I don't buy it. Can anyone explain some of the longevity and degradation issues relating to flat panels?"

The rationales that I've heard behind the high costs of these monitors were:

The manufacturing yield is low because of the large number of transistors that need to all work properly to display colours.

The economy of scale currently is aimed at laptop users, for whom a 1600x1200 screen is impractically large (unless you happen to be Andre the Giant)

Ignoring the second part, why doesn't anyone make a decent monochrome LCD monitor for those of us who want a large screen but don't necessarily need color?

In my case, I want to edit multi-channel audio. A color display adds almost nothing to the information that I extract from the screen. I can select, cut, copy, paste, apply effects, and otherwise mangle the sounds as well on a 1-bit-per-pixel display as I can on a 32-bpp monster.

I am also a technical writer. The documents I write are produced on a B&W laser printer, mostly. Certainly, on-line documents (and even most printed ones) can benefit from intelligent use of colour for various reasons, but most printed documents end up in black and white (mostly for cost reasons). Again, colour adds little to the experience.

Although I can't help with some parts of the article, I can share some experience. I have a very old laptop, about 10 years old. It's monochrome, 16 grey scales (the laptop is a 386/25, BTW). It still works fine with no dead pixels, though the backlighting isn't the best in the world either. But it still works. So they can last around 10 years, if not longer. The backlight bulbs are probably the major weak point, but I've never had to replace them either.
My own advice for anyone buying an LCD is to buy from a brand that will probably still be in business in several years, like IBM. It might cost more, but do you think I can get backlight bulbs for this laptop anymore? Not likely.

It may be happening more slowly than we would
like, but LCD prices are dropping. I found a
Princeton LCD17 for $699 "while supplies last" at
the local Fry's earlier this week. 15-inch LCD
panels are easy to find for less than $500 now.

When comparing CRT and LCD displays, I think that
a lot of people fall into the "equal size trap".
I.e. they think that they need a 17" LCD to
replace a 17" monitor, etc. This fails to account
for LCD's superior clarity, lack of flicker, and
visible area. Most people could happily replace a
17" CRT with a 15", etc.

Remember, what was it, about 4 years back, when monitor manufacturers began inflating monitor sizes? What was once advertised as a 16" monitor became a 17" monitor, despite having a viewable size of only 15.8". As you said, not far off a 15.1" LCD.

I've had 12 of them for 4 years now without any dead pixels. They are Apple 15" LCD displays (first and second generation, with VGA connectors), used mostly on Linux stations (PCs) or Apple G3/G4s. They have the viewable size of a Taiwanese 17" CRT, a longer lifespan than most of our Sony Trinitrons, and are easier on users' eyes.
They were not much more expensive than a good-quality CRT (about a third more), but what a gain of room on our desktops!

Too bad Apple doesn't manufacture them anymore (the new ones don't have a VGA connector), but at least they have proven that it is possible to sell good-quality LCDs at a reasonable price.

Well, NTSC is 29.97 frames a second. With drop-frame timecode, no actual frames are dropped; instead, two timecode numbers are skipped each minute (except every tenth minute) so that the timecode stays in step with the 29.97 fps rate.
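To make that bookkeeping concrete, here is a minimal Python sketch (my own illustration, not anything from the poster) that converts a frame count into drop-frame timecode; the two skipped numbers per minute, except every tenth minute, are what keep the timecode tracking 29.97 fps:

```python
# Drop-frame timecode sketch: two timecode numbers are skipped each
# minute, except minutes divisible by 10, so over 10 minutes there
# are 30*600 - 2*9 = 17982 real frames.

def df_timecode(frame):
    """Convert a frame count into drop-frame timecode HH:MM:SS;FF."""
    tens, rest = divmod(frame, 17982)        # complete 10-minute blocks
    if rest > 1:
        frame += 18 * tens + 2 * ((rest - 2) // 1798)
    else:
        frame += 18 * tens
    ff = frame % 30                          # frame number
    ss = (frame // 30) % 60                  # seconds
    mm = (frame // 1800) % 60                # minutes
    hh = frame // 108000                     # hours
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

df_timecode(1800)    # one minute in: timecode jumps to 00:01:00;02
```

The semicolon before the frame number is the usual convention for marking drop-frame timecode.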

Analogue resolution is measured in horizontal scan lines; you can't say it's "640x480", it doesn't work that way in the analogue world. A BetacamSP deck can record upwards of 500 lines of resolution, while a DV deck (consumer mini-DV, DVCAM, DVCPRO) records 480 lines. (Because it's based on D1 and uses a differently shaped pixel than computer monitors, the resolution is actually 720x486, which would come out to 3:2 on a computer monitor but is 4:3 on a TV.)

Every frame is divided into two fields, upper and lower. If you numbered the scan lines starting at 1, the odd-numbered lines would be displayed first, followed by the even ones. So you have approximately 30 frames times 2 fields: effectively 60 fields per second at a "resolution" of roughly 240 scan lines each. This trades spatial resolution for temporal resolution, and is called interlacing. It's also why, when you pause video during movement, the image jumps back and forth between the odd and even fields.
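A toy Python sketch of that field split (illustrative, not from the post): odd-numbered scan lines form one field, even-numbered lines the other.

```python
# Interlacing sketch: one frame's scan lines split into two fields,
# odd-numbered lines first, then even-numbered lines (1-based).

def split_fields(frame_lines):
    """Return (odd_field, even_field) for a list of scan lines."""
    odd  = frame_lines[0::2]   # lines 1, 3, 5, ... (1-based)
    even = frame_lines[1::2]   # lines 2, 4, 6, ...
    return odd, even

lines = [f"line{n}" for n in range(1, 9)]
odd, even = split_fields(lines)
# odd  -> ['line1', 'line3', 'line5', 'line7']
# even -> ['line2', 'line4', 'line6', 'line8']
```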

PAL is 25 fps, with two fields per frame, so it has slightly lower temporal resolution (only 50 fields per second), but the spatial resolution is higher: there are more scan lines per frame.

Most 35mm film is projected at 24 fps; however, the shutter opens and closes a multiple of that every second. If it only opened and closed 24 times a second, the flicker would be very noticeable. It still is if you sit very close to a large screen.

In video this is called "progressive": the entire frame is displayed at once rather than in fields like interlaced video. There are a few compatible HD formats: 24p (p=progressive, i=interlaced), 30p, and 30i (effectively 60 fields per second). 24p is for movies and material that originated on film or a 24p HD camera; 30p is mostly for video content; and 30i is for sports, where the higher field rate keeps movement sharper.

It's interesting to note that most DVDs with content from film actually store it as 24 frames per second, progressive. Your DVD player does what's called a 3:2 pulldown in hardware to turn it into ~30 fps; that's also why you can freeze a DVD without the interlacing artifact of the picture jumping back and forth. I'm not sure how PAL DVD players handle this, as the preferred method for transferring and distributing PAL VHS tapes was simply to run the film at 25 fps, which sped it up. I think the PAL version of Titanic was actually something like 10 minutes shorter than the NTSC version.
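A rough Python sketch of the 3:2 pulldown cadence (an illustration under the usual 3,2,3,2 field pattern, not the poster's code): four film frames become ten fields, i.e. five interlaced video frames, which is how 24 fps becomes roughly 30 fps.

```python
# 3:2 pulldown sketch: film frames alternately contribute 3 fields
# and 2 fields, so 4 film frames -> 10 fields -> 5 video frames.

def pulldown_32(film_frames):
    """Expand a list of film frames into interlaced video fields."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

fields = pulldown_32(["A", "B", "C", "D"])
# -> ['A','A','A','B','B','C','C','C','D','D']  (10 fields = 5 frames)
```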

But film can be shot and played at virtually any speed mechanically possible.

Hmm... I wonder why TV uses 25 fps and film 24 fps, then. They don't look too bad to me, but when you move to lower frame rates the inter-frame flicker becomes noticeable. There is a very real limit here, set by the integration time of our rods and cones.

The reason it looks okay is that each frame is motion-blurred. It is not a perfect snapshot like a computer rendering; it represents the complete range of motion that occurs during the 1/24th of a second of a frame. Look at a still from a scene with a lot of motion to get the idea.

The inter-frame time is a very small fraction of the frame time, which is why it isn't noticeable at 24 fps.

Two problems with that. One: if the electrons were not being absorbed by the coating on the inside of the screen, there would be no light.
Second: according to modern electrical theory, the electrons actually travel from the negatively charged screen to the positively charged electron "gun".

I was about to say the same thing. I have a CRT that has a dead pixel almost in the center of the screen, and it has been there since the monitor was new. I have also seen parts of LCD screen backlights go out.
It's like all electronics: it's either going to work forever or start going bad early.

A pen did that much damage? Wow. I dropped an IRON, which was also full of water, onto the top of my closed Gateway Solo 9300 laptop a few months after I purchased it. It's been over a year now and still no dead/stuck pixels. I guess maybe I'm just lucky? The iron did no damage other than cracking the plastic on the back of the LCD screen; luckily the water didn't go anywhere (though it did make the iron heavier).

Well, actually, it just comes with a 21" CRT monitor on the Sun UltraSPARC that drives it... no LCD screen... I don't know what kind of printer you're talking about, but the dt6180 doesn't come with an LCD...

Well, OK, you might not need more than 30 depending on what you are doing, but you can certainly see far higher than 30.

Hmm... I wonder why TV uses 25 fps and film 24 fps, then. They don't look too bad to me, but when you move to lower frame rates the inter-frame flicker becomes noticeable. There is a very real limit here, set by the integration time of our rods and cones.

You're probably being confused by motion blur.

Motion blur is certainly the key issue here. In graphics rendering you don't have motion blur, each frame is perfectly sharp. For perceptually pleasing results, moving objects should be smeared an amount that depends on the amount of inter-frame movement. This happens as a side effect in cameras, so film captured this way looks OK at 24 fps. In graphics rendering on the other hand you will have what is known as temporal aliasing due to moving objects being too sharp. By increasing the frame-rate, an acceptable blurring will instead occur in the retina.
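As an illustration of that point, motion blur can be approximated in a renderer by averaging several sub-frame samples spread across the frame's exposure interval. A minimal sketch, with a made-up one-value "render" function standing in for a real renderer:

```python
# Temporal anti-aliasing sketch: average several sub-frame samples
# so a moving object is smeared across its range of motion, as a
# film camera's open shutter would do.

def blurred_frame(render, t0, t1, samples=8):
    """Average `samples` renders spread across the frame interval."""
    vals = [render(t0 + (t1 - t0) * (i + 0.5) / samples)
            for i in range(samples)]
    return sum(vals) / samples

# toy 1-pixel "render": brightness equals object position at time t
pos = lambda t: t
blurred_frame(pos, 0.0, 1.0)   # mean position over the frame, 0.5
```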

I've been looking for 8- to 12-inch color LCDs, and I simply can't find them for sale. There are products (net appliances) that incorporate them, but I don't want to shell out that much money.

I'd like to spend no more than ~$170 on a 10-inch, 800x600 LCD screen.

Why?

Most people stick their brand-new LCD screens right up where the CRT used to be: up high on the desk, where people get cricks in their necks, because you're not designed to sit and look straight ahead like that... I'd like to position the LCD on a tray about 16 inches in front of my face at about a 45-degree downward angle, and have the keyboard under the tray at level-forearm height (I'm a touch-typist).

Like a book. When do you see people holding a book at arms' length, level with their eyes, while sitting ramrod-straight in a chair? You don't, because people hold the book in a more *comfortable* position; hands low, book low (but not too low).

But I can't find any 10-inch color LCDs with a connector that I can just plug in to my computer! Any info? Somewhere? Someone?

Actually, power consumption is much lower with monochrome LCD displays. Note that I did not say the LCD panel itself consumes more power, but a color display as a whole does. The reason is simple: a color display has a color filter which lets through only the desired color of light to each sub-pixel: red, green, or blue. Fully two thirds of the backlight's output is absorbed by this filter, necessitating a much brighter backlight just to get a display as bright as a comparable mono display.

The answer to the second question is extremely easy. Just how many people do you think would want a monochrome LCD? "Economy of scale" answers it completely. The process for making a normal screen would have to be redeveloped; even across different sizes there are certain things that stay similar between panels, whereas monochrome would be quite different altogether. Therefore, due to "economy of scale", as you yourself mentioned, a monochrome LCD would be more expensive than a color LCD. You're competing with all the normal uses of an LCD. I wouldn't think the manufacturing cost of a monochrome panel would really be all that much less, either...

Your logic is flawed. "B&W LCD's exist for things like PDA's but nobody makes them big." First you state that 8 dead pixels might not be a problem on a 1600x1200 display. Then you say that on a small screen even one bad pixel can be a problem. This contradicts the logic of your initial statement: if bad pixels (an unavoidable part of the manufacturing process) are less of a problem on large displays, then manufacturers would focus on those rather than on smaller displays, which you say need to be perfect.

Oh, and learn to spell and punctuate properly. A few errors usually aren't a big deal, but your post was almost incomprehensible.

In the RF lab where I work, where we test GSM phones, LCDs are used instead of CRTs because the 60 Hz signal that comes from the CRTs degrades the signals we test. So, as you see, LCDs have another use besides the obvious "cool" look!

I work in a hospital and we use large monochrome LCD displays. They are expensive, but a cost effective solution for us.

Many hospital X-ray departments are going to PACS systems, i.e. digital X-rays, with no film used. The images are displayed on very high-resolution monitors. A radiologist looks at the image on CRT screens (mono, very large, hi-res up to 3000x3000, high contrast).

In operating theatres, emergency rooms, intensive care units, etc., space and portability are a necessity. In these places we use mono LCDs. These have reasonable resolution and brightness, but can be hung on an arm over the operating table so the surgeon can see while operating.

Given how much money is spent on health technology this could be a big market as more hospitals move to filmless systems.

1600x1200 pixels are already on lots of laptops; they just managed to fit all those pixels into a 15" format, and that's it. And oddly enough, it's very readable once you get used to the unusual smallness of letters and icons.

No more dead pixels are appearing. I've owned an LCD for 2.5 years; it arrived perfect and is still perfect, with all pixels 100% functional.

I mean, it used to be 9 dead pixels per line for the first Mac laptop, but dead pixels are BAD. Why would you invest several hundred or a thousand dollars in a really good LCD, get 2 dead pixels, and shrug it off? I mean, people are complaining about the G4 Cube's cracks.

Oh, and a TIP: if you moisten a tissue and slowly rub around the dead LCD pixel, it will hopefully change back or get unstuck.

It saves you only a few dollars for the increased complexity. In the price range to which you refer, we are talking inkjet technology. It is the consumables market which makes the money, and the consumables are the expensive bit to manufacture.

> It's not that they are from Sony; that is an
> effect of aperture grille monitors, which need
> damper wires to keep the vertical grille wires straight.

For the duration of the patent, "aperture grille monitor" == "Sony Trinitron" display. That was at least 17 years, perhaps 20, and it only expired maybe 8 years or so ago. I'm sure Sony had licensed producers, but the bulk of non-shadow-mask displays carried the Sony brand.

I Seem To Recall (tm) that Microsoft requires manufacturers to use color-capable hardware if they want to install Windows on their products. Probably for that wonderful user experience. And beyond Windows, the market gets small real fast.

Um, not wanting to sound like a complete idiot here, and having actually had some EE theory, I must still ask the question: if your second assertion is true, then what is it that excites the phosphor coating? Holes? Can energy be transferred to the coating by electrons leaving it? Wow.

And a throng of people would flock to purchase a large mono LCD monitor, just as they would flock to buy a large mono CRT monitor.

Mono LCD monitors, despite a cheaper price, will always be a niche market product. And being a niche market product means that the monitors will eventually be MORE expensive than their colour LCD counterparts once the laws of supply and demand kick in.

I would trust the original poster to know their own needs. Some of us are old enough to remember working with b/w displays and remember what they're good for. The story submitter described uses that 1-bit displays are perfectly suited for.

I'm a little concerned about whether OS's and applications are still designed to work with 1-bit displays, though. I haven't seen one in use for the past four years or so, which suggests that application developers might have stopped checking to make sure that their programs work well in b/w mode. (As an example, you wouldn't be able to distinguish between the red underlines vs green underlines that MSWord uses to mark words with bad spelling vs words with bad grammar.)

Ever hear of a delta shadow mask or a Trinitron aperture grille? If you look very closely, you can see vertical lines down your screen. The maximum resolution depends on the mask/grille. You're in the same boat as TFT/LCD screens.

It's not that they are from Sony; that is an effect of aperture grille monitors, which need damper wires to keep the vertical grille wires straight. They aren't completely invisible and can be seen on white backgrounds. Everyone who makes aperture grille monitors uses this technique.

I would trust the original poster to know their own needs. Some of us are old enough to remember working with b/w displays and remember what they're good for. The story submitter described uses that 1-bit displays are perfectly suited for.

You need at least 4 bits of grayscale to get a decent-looking spectral display of audio, that is, the energy at each (time, frequency) pair.
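For illustration, here is one way to quantize spectral energy in dB onto 16 grey levels (4 bits); the -96 dB floor and 0 dB ceiling are my own assumptions, not figures from the comment:

```python
# Sketch: map spectral energy (in dB) onto a 4-bit grey scale,
# i.e. 16 levels, roughly the minimum suggested above for a
# usable audio spectrogram display.

def to_grey4(db, floor_db=-96.0, ceil_db=0.0):
    """Map a dB value onto 16 grey levels (0 = black, 15 = white)."""
    span = ceil_db - floor_db
    x = max(floor_db, min(ceil_db, db))      # clamp to the dB range
    return min(15, int((x - floor_db) / span * 16))

to_grey4(0.0)    # full scale  -> level 15
to_grey4(-96.0)  # noise floor -> level 0
to_grey4(-48.0)  # mid grey    -> level 8
```

Each (time, frequency) cell of the spectrogram would get one such level, which a 4-bit greyscale panel could show directly.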

"That's the minimum speed, below which we lose the impression of continuous motion and start to see separate images (actually, 18fps is roughly the min"

Actually, it's 12 fps. You can still watch something at 12 fps and have your eyes tricked into thinking it's something moving, as opposed to just a whole lot of pictures changing.

Another thing I might as well add: while you may not notice the difference between 30 fps and 150 fps, if one were to rotate the camera fast, or film a car driving past, there would be more motion blur at the lower speed.

Even with all the fancy technology today, there are still shots that you can't shoot on film (24 fps) because of motion blur and the like.

"With a CRT, an electron gun sprays electrons on to a screen, and they are moved magnetically to the right position. So there is no physical object that corresponds to a display pixel. This means that you can theoretically have an unlimited number of resolutions on a CRT."

Sort of... don't forget the little bits of phosphor, or whatever it's called. Spray a few water droplets on your monitor and you will see lots of little 'pixels', each with an R, G, B part.

I have heard that Sampo makes a 12" LCD; I can't find it around, though.
The New Internet Computer Company has 12" 800x600 LCDs which you may have seen, but it's more like 480 dollars rather than 200.
Then there's Sony's 13" LCD, apparently the only 13" LCD made in the world, which is very popular in hospitals and similar places since it's small but still does 1024x768. That puppy is 900 dollars, even though larger screens are below 600 dollars.

I'm fairly sure that wasn't Apple bashing; it was owner bashing. People who buy overpowered computers based on looks are in the same boat as a lot of people who buy SUVs and sports cars, and to a large extent, LCD displays. They 'want' one, they don't 'need' one.

You know, I was thinking about that as I was typing it, and it didn't quite seem right. I just saw the figures briefly yesterday, so I'm going from my ever-fallible memory. I know Fred's looking at getting their research released so they can post it on their webpage. In any case, the point was that LCDs represent a significant cost savings after only a couple of years of average (5 days/wk, 8 hr/day) use.

With a CRT, an electron gun sprays electrons on to a screen, and they are moved magnetically to the right position. So there is no physical object that corresponds to a display pixel. This means that you can theoretically have an unlimited number of resolutions on a CRT.

With an LCD, each pixel is a physical dot, so there is really only one "correct" resolution for an LCD. Some LCDs can "fake" other resolutions better than others; my (well, the company bought it for me) spiffy new SGI 1600SW LCD does a pretty good job at 800x600 and 1280x1024, even though its native resolution is 1600x1024. But try a cheap laptop screen sometime: look at the console text and you'll see that it's often difficult to read due to poor scaling quality at lower resolutions.

Other than games (which are not within my area of interest), I see no reason to want to change an LCD monitor to a lower resolution.

There is more of a demand for color displays than for B&W. They probably don't see a very lucrative market for B&W displays. If you have a choice between a color CRT and a B&W LCD, many will choose the color; it makes their pictures and games look better.

The prices on color displays are coming down. In another 2 years they will probably be less than $500 for a 17-inch, and of very good quality.

Personally, I think this will be the next technology push: flat panel display devices. Similar to laptops, but more along the lines of the super-thin Sony PC; basically like the Mac Cube and its display, only instead of being $3k it will be $800 for the whole deal. Wait, and in a year or two you will see these come down.

The eye only sees one small detail at once and shifts to another one approximately 70 times a second. I guess that's why the ergonomic frequency barrier in monitor refresh rates is usually 72Hz.

There is a 70 Hz tremor in the eye, called nystagmus tremor, which is probably what you are thinking of. This only affects refresh rates on CRT displays, though, and only when your eyes are close to the screen. The problem occurs due to interference between the high-frequency screen flicker and the eye tremor.

AFAIK the pixels stay lit on LCDs in-between frames, so there is no screen flicker that could interfere with the eye tremor. For this reason the refresh rate is better called an update rate on an LCD.

I agree completely, but the **real** problem is that we now live with an industry that resorts to snap, crackle and pop rather than intelligent functionality. The move to "pretty" is overwhelming when motivated by the need to impress reviewers and salesmen devoted to plying their wares to the mass-buying and usually unsuspecting public.

Supporting cost-effective monochrome displays, with their resulting lower prices, also means lower commissions and less profit. It's ironic that so many people I know who would comparison-shop for the cheapest soap and shampoo seem all too reliant on a salesman for higher-priced goods, one who is usually more interested in furthering his own ends than in the needs of his clients. I agree that monochrome displays would likely fit the needs of most people most of the time, but tell them that they might need color 1% of the time and they will not give up the option.

There wouldn't be any point in a B/W LCD because it probably wouldn't be any less expensive to produce. You'd have to develop and build a separate process and manufacturing line, the materials cost wouldn't be much lower, and for all that you might as well be making color displays at practically the same cost but getting higher margins.

It would make a good deal of sense if you followed common UI design criteria. A reasonable number of computer users have some sort of colorblindness, and as such it is usually a good idea to design applications not to depend solely on color.

True -- if you aim for cost savings, you need to set up a new process and a manufacturing line for producing LCD panels with 1/3 of the sub-pixel density of a color screen.
But what if you went for resolution instead of savings? You could use more or less the same process, only with monochrome elements instead of R, G and B -- and have a display with three times the horizontal resolution of a color display.
4800x1200 resolution in a greyscale laptop screen? I could exchange color for that..

We don't have monochrome displays now because the resolution is too low. Just wait: when we get over 300 dpi, greyscale will become more interesting (as halftone), and at higher resolutions you won't be worrying about the gradations of even RGB pixels, just how many you turn on next to each other. Think about the resolution of 150-line-per-inch magazine glossies.

The other reason they don't sell now is no demand, and not enough volume for the reduction in margins that the price point we want necessitates. Used Mac B&W laptops are still sitting in the second-hand stores.

Both Dell and Compaq make a 1U foldaway rack drawer that has a keyboard and a flat LCD screen. They're not exactly cheap, but with a KVM you would only need one of them for a whole room full of servers.

It's a sad fact that in the world of mainstream computing (Microsoft Windows), everything is judged a success or failure depending on how mainstream it is. In mainstream computing, the trend is towards complete graphical interfaces over which the user has little control, a trend that has its roots in the media industry rather than in the traditional computational history of computing.

Although the notion of a monochrome display is noble (why do I need more than one colour to read text?), the problem remains that more people are likely to go for devices like the all-singing proprietary NVidia GeForce cards. It's all because people don't understand that there is an application of computers beyond that which has been popularised by the mainstream press and "EasyPC" initiatives that aim to reduce the computer to the level of other consumer devices like the TV or washing machine. As long as there are geeks around, the true importance of computers will not be forgotten, but as time progresses we will find it more difficult to find hardware that meets our needs without exceeding our needs, or our budget.

All in all, of course, mainstream computing has done more to bring down the cost of hardware for geeks who are actually interested in the technology, and for this we must be thankful. But the simple, one-purpose devices analogous to the *NIX tools designed to do one thing right, will be the casualties of this new trend. Our best hope is to get over it and enjoy cheap, powerful computers while they last before embedded devices take over the world and put the PC back where it was two decades ago.

I'm afraid I completely disagree; there are any number of _real_ advantages to LCDs, and they more than outweigh the cost as far as I'm concerned.
My top three (without even mentioning power saving) are as follows...
Space: desk space is EXPENSIVE! A 17" or larger screen is purely impractical on my desk. I have a printer, scanner, keyboard, desk lamp and working space to fit into an area about 2' by 4'. Losing the massive bulk of the CRT has been a life-changing experience.
Ergonomics: kind of related to the space issue, but still important. Something that is about 6" deep can be positioned a lot more comfortably than a 25"+ deep monitor. My previous monitor had to be positioned at the side of my desk, overhanging the edge, to make room for all the rest of my junk; not at all comfortable to use for a long stretch of time. My lovely new LCD is slap bang central, right in front of my keyboard.
Clarity: there is _no_ distortion on an LCD panel. This makes a huge difference when you're editing text, code or whatever.
(And *yes* I know that some of these points have been made elsewhere...)
There will always be a place for CRTs (well, for the next five years or so at least), but once you've used a quality LCD panel for an hour you will never want to go back to a CRT display.
As for the cost, I paid about the same (give or take £50) for my 15" LCD panel two months ago as I did for a 17" CRT about four years ago.
Cheers
Chris

There's not enough market for monochrome LCD displays. If you're going to shell out for an LCD display, you'll shell out for color, and most home users wouldn't DREAM of monochrome (even if it does look cool).

It's much the same as with television: the Web is next to useless these days without color. Seeing that almost everyone wants to run Web stuff, the market for monochrome is really small.

For the longest time, I had a 1280x1024 black-and-white screen on my desk. It was just easier on the eyes, and perfect for coding and e-mail. I eventually replaced it with a color screen because of all the web sites I had to use that abuse color.

What would be an interesting market, once someone uncovers it, is cheap 640x480 B&W LCDs to use as consoles in colo facilities. I hate wasting 6U of rack space on a monitor, and I'm not the only one.

Much more expensive than a comparable CRT, but overall not bad.
They recently changed their inventory, it seems, so I can't find any of the more fitting LCDs I once thought they had, but it may be useful nonetheless.

B&W LCDs exist for things like PDAs, but nobody makes them big...
The dead pixel issue: 8 might be an acceptable number of dead pixels on a 1600x1200 screen, but on a 160x160 screen a la a Visor Prism, one dead pixel can be quite annoying. As I take it, smaller screens have a lot fewer dead pixels in total than the big screens.

Every LCD monitor has dead pixels. LCD manufacturing at large sizes is *very very* difficult (which is why they cost so much). I'll take a couple of dead pixels that I don't even notice over the blurriness of a CRT any day...

Having worked in a computer store once, I can tell you the reason you can't get a great monochrome LCD. Simply put, 99% of the demographic that wants to buy one is an older person wanting to "get on that internet thing." Unfortunately, while I'd buy one of these for every LAN room I manage (dozens), techs are rarely the target audience for technical devices anymore. Sure, we have a good amount of buying power, but compared with the AOLers we're not much in the eyes of a marketing dept. I can only imagine an engineer at company X trying to sell the idea of a monochrome LCD to the higher-ups. The conversation would go something to the effect of: {Engineer} "There are many uses for these devices and they can be built cheaply." {Marketing guy} "Yeah, but why would anyone want a screen that was only black and white? Windows would look terrible."
So, as you all can see, while it's a great idea it probably won't come about. Of course, that's just my opinion; I could be wrong.

Then there's all the static electricity around a CRT. This causes dust particles to attack you, and fill your pores. Combine that with hot weather and perspiration, and you understand why hax0rs have such bad skin. It's a dirty job, but someone's gotta do it.

I could have sworn there was a recent article on Slashdot about how CRT monitors are better than LCDs for games like Quake, because LCD monitors cannot display as high an FPS and will start to blur after 40 or so. Not that the human eye needs more than 30, but still....

We've seen how you can use sub-pixel positioning to get an apparent increase in horizontal resolution for monochromatic text (Cleartype, etc.) on full-color LCDs.
Why not produce, for example, a 15" 1024x768 LCD with each sub-pixel the same color? Sure, the display would be monochrome, but horizontal resolution would increase by 3x.
As a bonus, you get hardware support from any truecolor video card, since 8-bit monochrome has the same layout as 24-bit RGB.
It's so simple that someone must have already done it.
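The byte-layout claim above is easy to demonstrate. This is a minimal sketch (the panel itself is hypothetical): three consecutive 8-bit monochrome samples occupy exactly the R, G and B bytes of one truecolor pixel, so a stock 24-bit framebuffer could drive a same-color-subpixel panel at 3x horizontal resolution with no special hardware support.

```python
def pack_mono_scanline(mono):
    """Pack an 8-bit monochrome scanline into 24-bit RGB pixels.

    Each group of three consecutive mono samples becomes the R, G, B
    bytes of one color pixel -- so a 3072-sample mono line fits in a
    1024-pixel truecolor line, exactly as a stock video card lays it out.
    """
    assert len(mono) % 3 == 0, "scanline length must be a multiple of 3"
    return [tuple(mono[i:i + 3]) for i in range(0, len(mono), 3)]

# Six mono samples -> two "RGB" pixels on the hypothetical same-color panel.
line = [0, 64, 128, 192, 255, 255]
print(pack_mono_scanline(line))  # [(0, 64, 128), (192, 255, 255)]
```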

The company I work for, a large multinational telecommunications outfit, uses LCD panels primarily in the 'back office', not the front. The receptionists have CRTs, while the operations centers use LCDs.

This is because the receptionist needs only one monitor, while even a small operations center needs, at a minimum, about 40.

Even with 17" LCDs going for four or five times the price of CRTs, it saves money in the long run due solely to the fact that the operations consoles can be shortened by about two feet each.

This might not hold true if you're building a facility in the middle of nowhere, or if all those LCDs wind up having a maximum life of about five years; but where real estate is even moderately expensive, the rental on the floor space CRTs would take up would make their total cost of ownership higher than that of LCDs in about five years.

I expect that the break-even point would come even sooner if one was inclined to figure in the cost differential of powering and cooling the things. CRTs use a lot more power and generate a lot more heat than LCDs. I used to use the back of a Sun 20" monitor to keep my breakfast bagel warm in the mornings until I was ready to eat it. The vents there were so well suited to this task that I think they must have been designed for this purpose.

Of course, the potential problem with this reasoning is that CRTs last almost forever, while all these LCDs might well sputter and die within that five-year period.

I've heard that the limit to an LCD display's life is usually its backlighting; supposedly the backlighting dims gradually over time, making the display harder to read. This is why LCD display manuals recommend you turn the display off or put it to sleep when it's not in use, even though it won't suffer burn-in problems like a CRT. I haven't ever seen an LCD display with backlight problems, however (though most LCD displays seem to take a few minutes after power-on until they reach their normal brightness).

As for dead pixels: Sometimes you can 'un-stick' them by massaging the screen over a dead pixel very gently.

The only problem I've had with my Apple Cinema Display is some mild 'burn-in,' believe it or not. Apparently with any LCD display, if you leave a static image on the screen for a while, the LCD hardware will 'remember' that image and you'll continue to see a faint ghost of it on the screen. I see this most often when I've been in Mac OS X for a few hours, and then I reboot into LinuxPPC -- I can still see the ghost of my Mac OS menu bar at the top of the screen! The ghost stays even if I power down the computer and display for a while. I've been told that the ghost will go away after as much time as it was on the screen to begin with (if the menu bar was there for eight hours, its ghost will fade after eight hours), and that's been borne out by my experience.

I see your point, but it depends on what type of printing you do. I work with black-and-white "laser" printers that cost hundreds of thousands of dollars (I don't really know exactly how much; you'd have to talk to Xerox [xerox.com] about the price, though the bean counters could tell you). What, that much? Well, yeah, but they rip and print 180 pages a minute. They are variable-data printing monsters, and if you've got several million unique B&W pages to print, that's what you need....
It's all about finding the right piece of equipment for the job.

Basically I see two major reasons why you don't see companies manufacturing a lot of B&W displays (especially of the size you are describing):

1) Minimal demand
2) Lower per unit profit margins

The first problem is that ultimately most people want to buy color LCD panels. It's not worth it to most LCD manufacturers to bother with the small segment of people who would be happy with large black and white LCD panels.

The other issue is that black and white, being a simpler technology, ultimately carries a lower profit margin per unit. That means manufacturers have to sell that many more units to make a profit. And if they charged enough to recoup their investment, it wouldn't cost people much more to just shell out for the color display.

To see how these economics work, look at the price of processors in the market. There's a certain optimum point where you get a significant amount of power for a low price. If you reduce the power of the chip, the price doesn't drop much, because the fixed costs of making any chip stay the same. So you end up in the bizarre situation where you could pick up a K6-300 for just slightly more (or maybe even less) than an old Pentium 60. It's all economics.

During a trip to the Far East a couple of years back, I was very surprised to see the number of laptops that appeared on office desks. I commented on it to my guide, who then pointed out the obvious: laptops represented a **HUGE** proportion of the office desktop market for one very good reason -- the amount of desk space occupied is much smaller (primarily because of the screen), and (at that time) there was no other option than to get a "space-saving computer," the proper name for laptops over there (which, by the way, rarely leave the office or even get unplugged).

Further, a friend in Japan tells me that the number of relatively new tube monitors appearing on the street on garbage day has risen dramatically over the last year, as everyone tries to recover a couple of extra square feet and moves in droves to LCDs.

Something else to consider is that most applications are not optimized to work in a B/W environment. I know very few developers who have considered, let alone tested, the appearance of their applications in a B/W environment. For example, how many developers have a monochrome monitor on their desk (or even in their department) to test against?

Colors, especially fully saturated reds, greens, and blues, appear nearly identical in B/W, yet these are often used together as background/foreground colors in dialogs and become largely incomprehensible in a monochrome environment, appearing as dark gray lettering on a darker gray or black background.

My first laptop 10 years ago was monochrome, and the number of times I had problems even then (when monochrome displays were everywhere) was daunting. Yes, if an application is "monochrome-aware" you won't have a problem, but the number of apps that fall into that category has seemingly all but disappeared.
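The collapse of saturated primaries can be sketched with a couple of conversion formulas. This is only an illustration; which formula a given display or OS actually uses is not something the posts above specify.

```python
def average_gray(r, g, b):
    """Naive grayscale conversion: plain average of the channels."""
    return (r + g + b) // 3

def luma_gray(r, g, b):
    """ITU-R BT.601 luma weighting, closer to perceived brightness."""
    return int(0.299 * r + 0.587 * g + 0.114 * b)

for name, rgb in [("red", (255, 0, 0)),
                  ("green", (0, 255, 0)),
                  ("blue", (0, 0, 255))]:
    print(name, average_gray(*rgb), luma_gray(*rgb))

# With the plain average, pure red, green, and blue all land on gray 85 --
# completely indistinguishable. Luma weighting separates them somewhat
# (76, 149, 29), but red and blue still render as similar dark grays,
# which is exactly the dark-on-darker dialog problem described above.
```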

Actually, there are some 640x480 B&W LCD panels, but without backlight.
One place that sells them, that I mention only because my brother was looking at them recently, is All Electronics http://www.allelectronics.com
The 640x480 LCD panel is $25, not counting material and labor needed to connect it.

That's the minimum speed below which we lose the impression of continuous motion and start to see separate images (actually, 18 fps is roughly the minimum -- film was sped up to 24 fps so that a soundtrack could run along the edge of the same piece of film without sounding too crappy). The maximum fps we can discern is much higher (somewhere between 70 fps and 150 fps, depending on who you ask), beyond which point most people could not tell the difference made by additional frames per second.

Douglas Trumbull did a lot of research into this area, and created a process called Showscan that uses 70mm film projected at 60fps, which is supposed to look incredible!

Just look away from your TV or computer screen, look out the window and watch the real world, and see how crappy our screens look in comparison. They could be (and will be) a lot better.

I think this is very true. Go to any trade show and the companies are trying to look good. This means all of the slickest hardware they can find. Optical/wireless mice, small keyboards, lighting, and flat screens everywhere.

In my experience the pixels last a long time. I have a laptop that is over five years old now. It has only one dead pixel which was there from the beginning, and the screen is left on basically all the time. As for my new laptop, it too has one dead pixel which appeared the second time I turned on the computer. I can live with one as long as no new ones appear.

It seems that one or two dead pixels are commonplace, but once the bad ones die off while the monitor is new, the rest seem to last a long time from what I have seen.

Why would you need a console monitor on a server? Run a serial cable from your laptop or pda into the server's serial port / alternate console port and chug away. Here in my office all of my servers are headless and have their serial ports plugged into a portmaster.

We ordered 40 flat-panel ViewSonics, and a quarter of them had to be sent back because of what we called "hay screen," which looks like someone took a fistful of hay and smashed it up against the screen from the inside. With those stats, we expect the rest to fail within the next year or two, after they are out of warranty. At that point, all we can do is throw the $1000-plus monitors away.

I suspect it's because the whole 'LCD on the desktop' deal is still very much a specialty thing for people who want a slick front office. Yes, I know people (more than average around here) use them for solving real problems when a big CRT won't do, but I'd say at least 80% of units sold go to vanity applications. Vanity means color.

A "light pixel" is a pixel that is more than 75% lighter than it should be.
A "dark pixel" is one that is more than 75% darker (or "dead," I presume).
An "other fault" is, well, any other fault (e.g. a defective subpixel giving the pixel a distorted hue -- think color-stuck pixels).

Class II is considered acceptable for office use. Classes III and IV are not.

For a 1,024*768 panel, which is 786,432 pixels, that makes 1.57 light pixels (rounds up to 2), 1.57 dark pixels (rounds up to 2), and 3.93 other faults (rounds up to 4).
This gives a maximum of 8 defective pixels, which is about 0.001% of the screen surface.

This data is very useful when you're a tech in the field being pestered by a customer who keeps asking for a new monitor because his or hers has one or two dead pixels. You can point out that the monitor still meets the industry standard and therefore will not be replaced.
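The figures above imply fixed per-million-pixel allowances for this class (2 light, 2 dark, and 5 "other" per million -- inferred from the 1.57/1.57/3.93 numbers, so treat the constants as an assumption). A quick sketch of the calculation for any resolution:

```python
import math

# Class II allowances per million pixels (inferred from the figures above).
LIGHT_PER_M, DARK_PER_M, OTHER_PER_M = 2, 2, 5

def allowed_defects(width, height):
    """Return (light, dark, other) defect counts the class permits,
    scaling the per-million rates by the panel's pixel count and
    rounding each allowance up."""
    millions = width * height / 1_000_000
    return (math.ceil(LIGHT_PER_M * millions),
            math.ceil(DARK_PER_M * millions),
            math.ceil(OTHER_PER_M * millions))

light, dark, other = allowed_defects(1024, 768)
print(light, dark, other, "total:", light + dark + other)  # 2 2 4 total: 8
```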

As for the ViewSonic monitor, I suppose ViewSonic was pretty nice to you when they replaced your "defective" monitor.

Well, OK, you might not need more than 30 depending on what you are doing, but you can certainly see far higher than 30. You're probably being confused by motion blur. You should also ask yourself whether the concept of 'frames per second' is really applicable to how the human eye works.

In my case, I want to edit multi-channel audio. A color display adds almost nothing to the information I extract from the screen. I can select, cut, copy, paste, apply effects, and otherwise mangle the sounds as well on a 1-bit-per-pixel display as I can on a 32-bpp monster.

Actually, what you want is grayscale, not 1-bit mono.

You can see the effect by taking any B&W photo and converting it to 1-bit color.

You also see this in printing. Laser print IS 1-bit color, more or less, but you only get true photo grade at about 1500 dpi. Contrast this with grayscale, say on a screen, where 70 - 100 dpi is adequate for a photo. 100 dpi in 1-bit color is horrible.
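The difference is easy to see in miniature. A minimal sketch: hard 1-bit thresholding destroys every intermediate tone, while error diffusion (the family of dithering tricks high-dpi laser printers lean on) preserves the average tone by interleaving black and white -- which only looks smooth when the dots are far smaller than the eye can resolve.

```python
def threshold(row):
    """Hard 1-bit conversion: every sample snaps to pure black or white."""
    return [255 if v >= 128 else 0 for v in row]

def dither(row):
    """1-D error diffusion (Floyd-Steinberg style): push each sample's
    rounding error onto the next sample so the average tone survives."""
    out, err = [], 0.0
    for v in row:
        v += err
        q = 255 if v >= 128 else 0
        err = v - q
        out.append(q)
    return out

ramp = list(range(0, 256, 16))   # a smooth 16-step gray ramp
print(threshold(ramp))           # half black, half white: all gradation gone
print(dither(ramp))              # black and white interleaved in proportion
```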

The medical industry uses high-end LCDs in radiology departments. Resolution and brightness are the most important factors, and you pay for them.
Check out these puppies:
http://www.dome.com/products/cx/cxdisplays.html

While your point is well taken, another major advantage of LCD displays (beyond the space savings noted by another poster) is the power and cooling savings. Fred Cohen [all.net] had his students in the CCD [sandia.gov] do a power and heat analysis of all their equipment in the wake of the CA power crisis. They found that a 17" LCD monitor drew only 1/10 the power and generated 1/4 the heat of a 17" CRT monitor, meaning the higher cost of the LCD would pay for itself after just a couple of years of use.
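A back-of-the-envelope version of that savings, under stated assumptions (the 100 W CRT figure, always-on duty cycle, and $0.10/kWh price are assumptions; only the "1/10 the power" ratio comes from the post above):

```python
# All constants here are assumptions for illustration:
CRT_WATTS, LCD_WATTS = 100.0, 10.0   # applies the post's "1/10 the power" ratio
HOURS_PER_YEAR = 24 * 365            # always-on, as in an operations center
PRICE_PER_KWH = 0.10                 # assumed electricity price, USD

saved_kwh = (CRT_WATTS - LCD_WATTS) * HOURS_PER_YEAR / 1000
saved_usd = saved_kwh * PRICE_PER_KWH
print(f"{saved_kwh:.0f} kWh/yr saved, about ${saved_usd:.0f}/yr per monitor")
# 90 W * 8760 h = 788 kWh, roughly $79/yr -- before counting the reduced
# air-conditioning load, which shortens the payback period further.
```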

I'm not sure how today's displays differ from those of older laptops, but I have a PowerBook 170 from almost 10 years ago that still looks great. At the time we were ecstatic to have found one with no bad pixels (they were few and far between back then), and I can say for sure that the same is true of that screen today. It should be noted, however, that it was one of the earliest shipping active-matrix screens, and monochrome; it would be hard to find a 10-year-old Apple Cinema Display (drool) to check against.

Dead pixels are usually there from the beginning, so shop carefully. Later, there is the more likely possibility of having a dead streak of pixels.

These LCDs are produced on big assembly lines that need a lot of capital and long runs to be economical. Monochrome LCDs are being produced for small screens such as for cellular phones. Color LCD assembly lines mostly produce larger screens for such uses as portable computers and now desktop displays. Since manufacturers can make more money producing the big screens, and since the factories were built to produce big color screens, they are mostly color now. The price varies by supply and demand--both now favor color screens, which are now going down in retail price as the supply has recently been increased.

If you are considering an LCD display for use by a technical writer, look at the digital LCDs and not the cheaper analog variety--type will look better. The price of a 19-inch CRT monitor has gone down dramatically in the last year and might be more cost-effective for such a purpose than a digital LCD screen.