Posted by CmdrTaco on Wednesday January 13, 2010 @10:15AM
from the i'll-believe-it-when-i-see-it dept.

waderoush writes "It's not every day you hear about a brand new display technology, but San Jose, CA-based Prysm came out of stealth mode yesterday to talk about its plans for manufacturing laser phosphor displays, or LPDs. The new devices, which the company will show off at the Integrated Systems Europe trade show in Amsterdam next month, reportedly use 25 percent as much electricity as equivalently-sized LCD screens. And they should be easier to manufacture too, since they don't have a backplane of transistors like LCD screens: the image is generated by a laser beam that sweeps across phosphor stripes under the control of a scanning mirror. The venture-funded startup, which plans to build and sell LPD screens under its own brand, is promoting them as a low-cost, low-maintenance way to display information in lobbies, airports, broadcast studios, command centers, and the like."

And they should be easier to manufacture too, since they don't have a backplane of transistors like LCD screens: the image is generated by a laser beam that sweeps across phosphor stripes under the control of a scanning mirror.

In all the information I can find [prysm.com], no one addresses the thickness of the display unit. I'm not saying it can't be done in close quarters, but I'm basically asking: how thick does the unit have to be for a laser beam to sweep across the phosphor stripes that comprise the screen? Are we talking about moving back towards the sizes of rear-projection displays? Because to the consumer, it might not matter how efficient or awesome the picture is if the box is huge.

I guess that might explain why they're targeting airports and malls and not your living room.

Prysm's founders (Amit Jain and Roger Hajjar) have had their names on quite a few display-related patents since 2005. I'm excited that a small startup can enter this market, but I'm skeptical of the marketability due to the one drawback: a step backwards in compactness and style.

My first thought was that I'd need to care about VSYNC and HSYNC again. It wouldn't need to be as big as old CRTs - those use an electron gun and electromagnets to move the beam - all they need is some way for the laser to target each pixel on the phosphor at a suitable speed.
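For scale, here's a quick back-of-the-envelope sketch (Python); the 1080p/60 Hz figures are my assumption, since nothing in TFA gives resolutions:

```python
# Back-of-the-envelope pixel rate for a scanned-beam 1080p display.
# Resolution and refresh rate are assumptions, not Prysm's specs.
width, height, refresh_hz = 1920, 1080, 60

pixels_per_frame = width * height
pixel_rate_hz = pixels_per_frame * refresh_hz   # pixels the beam must address per second
dwell_ns = 1e9 / pixel_rate_hz                  # time the beam spends on each pixel

print(f"pixel rate: {pixel_rate_hz / 1e6:.0f} MHz, dwell time: {dwell_ns:.1f} ns")
# -> pixel rate: 124 MHz, dwell time: 8.0 ns
```

An 8 ns dwell is demanding but not crazy - it's the same order as the video bandwidth a high-resolution CRT gun had to handle.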

No, it just has to have some mirror arrangement that allows it to reach the whole screen. I don't see any reason why the laser has to strike the phosphor at anything close to a perpendicular alignment.

I don't see any reason why the laser has to strike the phosphor at anything close to a perpendicular alignment.

The angle at which the beam strikes the phosphor would determine the shape of the intersecting region, which may be difficult to correct for. However, a small mirror near each "pixel" that redirected the beam straight at the phosphor would likely correct the situation without taking up too much extra space.

At the end of TFA, they claim that conceptually it would work for a laptop display, so it must be pretty thin. The reason to target big displays before worrying about home TVs seems to be that the cost of manufacture is less of an issue there. Until they can do relatively cheap mass production, they won't be able to address the TV market.

Also, the headline notwithstanding, this may face tough competition in the TV market from advances in LED-type displays.

Strikes me that we're back to the scanning electron gun, but this time it's photons... there will be a "whirr" from the scanning mirrors, otherwise it sounds like a reasonable idea - just need to put the phosphor on a reasonably rigid substrate and you're good to go. They'll never be as thin (or flexible) as my current LED backlit notebook screen.

It does sound like a fairly sensible idea, although the article doesn't appear (from my admittedly quick glance) to mention refresh rates or resolutions, which both become more significant issues in computer monitors than airport info displays.

DLP sets use moving components, often including a rotating "color wheel". I've never heard of an audible "whir" being a problem there, so I'll hold off on speculating whether there would be one here.

I also know of no reason the screen couldn't be as thin as a notebook LED. I would think the laser's beam thickness would be the limiting factor (since it would govern how shallow an angle the beam could use to approach the screen without spilling across pixels).
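To put rough numbers on that (a minimal sketch in Python; the beam width and stripe pitch below are made-up values, not anything from Prysm):

```python
import math

# How shallow can the beam come in before its footprint spills across pixels?
# A beam of diameter d hitting the screen at angle alpha above the screen
# plane smears into a stripe roughly d / sin(alpha) long.
beam_diameter_mm = 0.2   # assumed focused beam width
pixel_pitch_mm = 0.5     # assumed phosphor stripe pitch

min_alpha = math.degrees(math.asin(beam_diameter_mm / pixel_pitch_mm))
print(f"beam must stay at least {min_alpha:.0f} degrees above the screen plane")
# -> beam must stay at least 24 degrees above the screen plane
```

So the tighter the laser focuses relative to the pixel pitch, the shallower the approach angle can be, and the thinner the cabinet.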

If they can find a market where energy efficiency is more important than thickness and durability (another issue that I would be concerned about in anything with moving parts, mirrors, etc.), then they should be able to do alright with their product. I am just not sure I can think of a market where durability is less important than energy efficiency.

Yeah, but cost is also a factor in a lot of cases, and this could well be an acceptable compromise for a lot of people.

These are supposedly a lot cheaper to manufacture and draw a lot less power, so if you are willing to put up with something that has some depth, you may be able to skip the 55 inch screen and go straight to 70 inches for the same money, and lower long-term costs of operation. Or get that 55-inch screen and have $800 left to buy a whole lot of movies to play on it.

Moreover, Prysm's LPD screens--which the startup plans to manufacture at a plant in Concord, MA--can be built in any size or shape, from square tiles to long, thin ribbons, meaning they could turn up almost anywhere someone...

It's a bigger step backwards than you might think. There were big screen TV systems attempted long before color TV that used essentially the same setup, but using beams of ordinary light instead of lasers. They actually worked surprisingly well for the electromechanical kludges that they were, aside from the size issue. In fact, the degree of similarity is enough, IMHO, to count as prior art.

"Ordinary" light in this case is incoherent light, i.e., light in which the individual waves are not synchronized with each other, either by being out of phase or being of different wavelengths, usually both. This is the kind of light that comes from most light sources. Laser light is coherent: it's (mostly) all one wavelength and the peaks and troughs of the waves are all synchronized.

Well - I designed what would be portion 320 in the diagram, the image modulation system for a scanning LED TV. The first problem was that LEDs were too dim at the time; the lasers in this system against a phosphor take care of that issue. The second issue is what is called the pincushion effect: as you scan the laser over the surface of the rotating polygon, it tends to modulate the length of the scanline, making the picture look like a pincushion. I had a way to fix this in the modulation controller - can't talk about HOW to fix it ;-) Just know that it is a pretty big problem to overcome.
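For anyone who hasn't run into it: with a flat screen, a beam swept at a constant angular rate lands at x = f * tan(theta), so equal mirror steps stretch toward the edges. A toy illustration (Python, arbitrary units):

```python
import math

# Why a constant angular sweep distorts on a flat screen: the spot lands at
# x = f * tan(theta), so equal mirror-angle steps spread out toward the edges.
# (This is the distortion that f-theta lenses, or clever modulation timing, fix.)
f = 1.0  # assumed mirror-to-screen distance, arbitrary units

for theta_deg in (0, 10, 20, 30):
    theta = math.radians(theta_deg)
    x = f * math.tan(theta)      # where the beam actually lands
    ideal = f * theta            # where an undistorted (f-theta) scan would land
    print(f"theta={theta_deg:2d} deg  actual={x:.3f}  ideal={ideal:.3f}  error={x - ideal:+.3f}")
```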

Once you have a method to overcome the pincushion effect, you need a way to align the TVs in production (another REAL headache I didn't come up with a solution for... but then we only got to the prototype stage, so we didn't have to face that issue).

Finally - there is the issue of NOISE. Rotating mirrors can be REALLY loud. Our prototype sounded like a jet engine when we spooled up the motors. The precision optics are also expensive. The mechanical engineers believed they could build a much quieter mirror assembly - maybe with air bearings.

So there are a lot of real - practical - tough design problems with this approach.

Finally - I expect it to be a relatively BIG TV.

It's a neat technology - but I don't believe there is any market for it.

Precision optics seem like overkill to me... all that shit with f-theta lenses and optical correction of pin-cushions seems so... archaic.

As long as the distortion is static and a sufficient maximum distance between lines is maintained, you can just correct it digitally, can't you? Transistors are cheap nowadays, really really cheap; hardware to perform an image warp on an HD signal is pennies' worth of die space on an ASIC (in volume - the million-dollar mask costs have to be earned back first, of course).
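Something like this, as a minimal sketch (Python/numpy); the warp function here is a hypothetical calibration, and real hardware would do the same lookup per pixel in silicon:

```python
import numpy as np

# Minimal sketch of fixing a static scan distortion digitally: build a
# per-column lookup table once (from calibration), then resample every frame
# through it. The warp function below is hypothetical, just for illustration.
H, W = 1080, 1920

u = (np.arange(W) - W / 2) / (W / 2)                     # screen position, -1 .. 1
src = (np.arctan(u * np.tan(0.5)) / 0.5 + 1) * (W / 2)   # undo a tan()-style stretch
lut = np.clip(src.astype(np.int32), 0, W - 1)            # nearest-neighbor lookup table

def correct(frame: np.ndarray) -> np.ndarray:
    """Resample one H x W frame through the precomputed lookup table."""
    return frame[:, lut]

fixed = correct(np.random.rand(H, W))
```

Per frame that's just a gather operation, which is exactly the kind of thing a few pennies of ASIC die area handles.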

Never mind - when I get my idea for a warp drive going, it should be simple. I just have to come up with an antigravity device, a tractor beam, and a zero-point energy device to power it, and then sourcing neutronium will be easy.

Alternatively CERN probably have a few containers of that, right next to their antimatter containers. Give them a call and they'll probably pop over in their scramjet spaceplane and let you have a few tonnes.

guaranteed to be thicker than LED or LCD, and with phosphor delay; I want LED so that I can have [effectively] instant transitions. we can get back the delay effect with processing, but you can't eliminate phosphor delays when you've got phosphors.

guaranteed to be thicker than LED or LCD, and with phosphor delay; I want LED so that I can have [effectively] instant transitions. we can get back the delay effect with processing, but you can't eliminate phosphor delays when you've got phosphors.

Unfortunately, it doesn't seem like we'll see OLED any time soon. It still has longevity issues and burn-in is nasty. We'll see what happens 3-4 years from now, but as far as computer monitors are concerned, I don't think OLED is viable... So we're stuck with poor LCDs.

You're right that OLED won't be arriving any time soon - but only because it's already here on the market. OLED can now be found in cell phones, MP3 players, in-dash displays, and televisions already being sold to the public. While the price is high, the quality of these displays is definitely worth it for the early-adopter type. Just google 'OLED display' and you'll see a few of the better options already on the market (though for the TVs the price is steep).

I meant for computer monitors. It's not likely to happen, unless miracles happen with the technology itself.

If you take a look at existing OLED screens, they are all used in devices that don't display a static picture 8+ hours a day. Buy an OLED TV, use it as a computer monitor, and watch it get ruined quickly. According to some people, the burn-in is worse than with early plasma screens.

WTF... there was a time when people didn't want to move to LCD because of motion blur issues, problems that CRTs, a phosphor-based technology, didn't have. Now you're saying the exact opposite is the case? *boggle*

What? Of course CRTs have motion blur issues; they just don't look as bad because there's no fixed pixel grid. I dare you to take any CRT out there, set the background to black and the mouse cursor to white, and then move the cursor around quickly. You will always see shadows of where the mouse was the last couple of times. (No, that is not the mouse trail function.)

What? Of course CRTs have motion blur issues; they just don't look as bad because there's no fixed pixel grid.

And is the effect sufficiently noticeable to disqualify this tech? I highly doubt it, given how prized CRTs were amongst the gaming crowd, among others, until LCDs improved sufficiently to displace them.

And my point is that with modern CRT, you *still* don't see any blur (or so little as it doesn't matter). See the other poster at the same level as mine for a little information about phosphor decay rates. It's simply not an issue. The OP is, IMHO, just inventing something to complain about.

As a company, they're targeting the deep pocket markets (big displays - really big from the sound of the article). I don't read anything particularly expensive in their description, maybe the high power laser (or the fact that they're manufacturing in Massachusetts), for now they're touting low energy to operate and component longevity as their value-adds.

In other words, the investors don't give a damn about selling you an inexpensive display for your peasant self.

OTOH, with time LPDs may just mean the return of the affordable TrueColor screen. With current non-CRT displays you need to shell out quite a bit of money if you want a monitor that actually supports 24 bpp.

guaranteed to be thicker than LED or LCD, and with phosphor delay; I want LED so that I can have [effectively] instant transitions. we can get back the delay effect with processing, but you can't eliminate phosphor delays when you've got phosphors.

There is essentially zero phosphor delay (I defy you to measure it... I am a visual neuroscientist and have, so yes, it is possible, and no, it is not easy) on the scale of perceptual latencies. I believe the latency from excitation to phosphor emission is on the nanosecond scale. The typical perceptual delays in the early visual system (retina and the first few stages of processing in the brain) are on order of 30 milliseconds, going from the time photons enter the eye to the first wave of activity in primary visual cortex. Different orders of magnitude. Like 6. Phosphor delay is irrelevant.

What you are perhaps thinking of is the phosphor DECAY which is another thing entirely. When phosphors are excited (such as by an impinging electron or photon beam) the emitted brightness steps up almost instantaneously and then decays down through an exponential relaxation curve. That decay time can tend to blur images when too long, or induce eye bleed (to use the vernacular) when the update rate is too low. The thing is that phosphor decays can be adjusted by reformulating the compounds, and are determined ultimately at time of manufacturing. Very fast phosphors are available to support KHz updates, but also very slow ones (some older oscilloscopes have phosphor decay constants measured in seconds).
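The decay curve is just an exponential, so "fast" versus "slow" phosphor comes down to one time constant. A rough sketch (Python; the tau values are illustrative, not measured):

```python
import math

# Phosphor brightness relaxes as B(t) = B0 * exp(-t / tau); "fast" vs "slow"
# phosphor is just the time constant tau. The taus below are illustrative.
def time_to_fraction(tau_ms: float, fraction: float) -> float:
    """Time for emission to fall to `fraction` of its initial brightness."""
    return -tau_ms * math.log(fraction)

for tau_ms in (0.05, 1.0, 500.0):  # fast TV phosphor ... slow scope phosphor
    print(f"tau = {tau_ms:g} ms -> down to 10% after {time_to_fraction(tau_ms, 0.10):.2f} ms")
```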

Contemporary LCD monitors have typically 2 or 3 frames of latency because of the push to get faster transition times. Those 5 ms response time LCDs get fast specs by overdriving the pixels in a highly controlled fashion, but one that requires knowing what is on the next handful of frames. Since we live in a causal world, that means introducing a 2-3 frame latency for processing within the display. Since the update rates on LCDs are typically 60 Hz, that's on order of 45 ms latency, a non-trivial fraction of the loop from visual perception to motor action (known in the gaming vernacular as twitch response). If you're watching a movie, that latency is irrelevant and wholly, entirely unperceived. If you're playing a game, then it is very important.
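Spelled out, the arithmetic in that last paragraph:

```python
# The overdrive-latency arithmetic from the paragraph above, spelled out.
refresh_hz = 60
frame_ms = 1000 / refresh_hz   # 16.7 ms per frame at 60 Hz

for buffered_frames in (2, 3):
    print(f"{buffered_frames} frames of buffering -> {buffered_frames * frame_ms:.0f} ms added latency")
# -> 2 frames: 33 ms, 3 frames: 50 ms (hence "on order of 45 ms")
```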

I had a similar idea once, except using electrons instead of lasers. It also required a vacuum tube for the electrons to travel through. I called it the Fluorescent Electron Cathode Konduit, or FECK for short. After considering it a while, I thought the concept was rather ludicrous and without merit, so I abandoned it.

I didn't see any mention in the article - will it have this horrible weakness that CRTs had?

Well, the HV supply for my old CRTs is a couple watts, my LCD backlight displays are a couple watts, I'm guessing this thing will require a couple watt laser for equal brightness. So if the scanning mirror jams in one spot, a couple dozen focused watts will burn a hole clean thru the screen, not just discolor the phosphor. That'll be exciting.

I don't have time to do the math at the moment, but don't forget you have to scan the thing, which means you need a brighter laser. On the other hand, I can't think of a reason why the energy deposition should be any higher than it would be for the electron beam in a CRT.
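The duty-cycle math makes the failure mode vivid (assumed numbers - nobody has published the actual beam power):

```python
# Duty-cycle math for a jammed mirror. Beam power and resolution are
# assumptions, not published figures.
beam_power_w = 2.0
width, height = 1920, 1080

duty_cycle = 1 / (width * height)             # fraction of time one spot is normally lit
avg_w_per_pixel = beam_power_w * duty_cycle   # time-averaged dose while scanning
print(f"scanning: ~{avg_w_per_pixel * 1e6:.1f} uW average per pixel")
print(f"jammed:   {beam_power_w} W continuous on one spot, {1 / duty_cycle:,.0f}x the normal dose")
```

A watt-class beam parked on one spot is laser-cutter territory, so presumably there'd be a watchdog that kills the laser if the mirror stalls.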

Phosphor burn is massively overstated. I'm still using the same 19" Hitachi I got in 1997 as my regular computer monitor and there's zero evidence of phosphor burn. It's also still bright enough to use under bright fluorescent lighting (and more than bright enough to use in a dungeon environment).

Depends, if phosphor burn was due to constant electron pummeling and photons are gentler, then maybe not. I suspect they will have some effect, though - especially if they're achieving high brightness on static high contrast displays.

I didn't RTFA, but I thought of a laser-driven TV quite a while back. I don't understand why they would use phosphors at all. It seems that with three different colored lasers they could simply use translucent glass.

Thinking about it, though, it's probably because the lasers don't produce light of the right colors to combine realistically.

However, I've seen phosphor burn on computer displays, particularly ATMs, but I've never seen a TV affected like that, except really old ones, and in that case the pict...

How is this better than Mitsubishi's LaserVue [mitsubishi-tv.com] technology? It's basically a laser DLP aimed at phosphor, as opposed to whatever material is used by Mitsubishi for a standard DLP screen. It even looks like the LaserVue uses less power than this.

I suspect your answer lies in the respective patent portfolios - and my guess is that the new guys have a phosphor formulation that works at high brightness levels, which would not be suited to living room use, or consumer price points.

Lasers+moving mirror == great reliability! Have a feeling these are going to make DLP or LCD lamp replacement look downright economical. Still prefer Plasma, personally, but the LED/LCD my SO's dad bought isn't horrible. Even at 240Hz, I did still notice some streaking, though (watching a football game).

One commenter asked the same question I am asking -- "How thick is this?" The notion of a beam or beams scanning over a phosphor surface that is treated with cells and filters? Sounds like a CRT in most respects. But to have scanning beams, there should be some distance travelled which implies some thickness issues.

One commenter asked the same question I am asking -- "How thick is this?" The notion of a beam or beams scanning over a phosphor surface that is treated with cells and filters? Sounds like a CRT in most respects. But to have scanning beams, there should be some distance travelled which implies some thickness issues.

It also implies bringing back all of the alignment issues of CRTs and rear-projection TVs. This really sounds like a step backwards, regardless of any power savings (which in an LCD or LED monitor is mostly from the backlight anyway).

> It also implies bringing back all of the alignment issues of CRTs and rear-projection TVs.

Indexed beam technology should take care of that (though it was never commercialized for CRTs). Alternatively, one could use three lasers operating at three different wavelengths and three phosphors each sensitive to one of the lasers. Still seems like a CRT with moving parts, though.

In any case, the Trinitron I'm using right now has never had any alignment problems.

So they took the basic idea of a CRT and replaced the electron beam with a laser and a moving mirror?

Sounds interesting, but I guess this will bring back all of the problems of a CRT (sharpness isn't guaranteed, image may flicker depending on the refresh rate, etc), plus a few new problems (mechanical parts that might be subject to wear, etc).

Yeah, they want to go back to phosphor-and-sweeping-beam technology? Thickness wasn't the only thing I liked about LCDs over CRTs - radiation and flickering were on the list too.

I like the idea of a laser taking the place of the traditional electron beam, and I can see how it would be far more efficient, but I have to wonder if this is going to bring back the flicker problem that we always had with CRTs. One of the things I really like about LCDs and LEDs is the fact that the whole raster is lit all the time.

FEDs (Field emission displays) [wikipedia.org] are superior to CRTs, LCDs and these new LPDs in every way. FEDs have the same thin 2-4 mm profile as LCDs, but unlike LCDs produce very bright and clear images even in direct sunlight (which is why they were used as HUDs in airplanes) while consuming up to 10 times less power. Sony had a 36" FED prototype that consumed only 14 W, which is 1/8 of what a typical LCD and 1/2 of what an LPD of that size would consume.
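Working backward from those fractions (restating the quoted claims as numbers, not independent measurements):

```python
# Restating the quoted Sony FED comparison as numbers (claims, not measurements):
fed_w = 14
lcd_w = fed_w * 8   # "1/8 of what a typical LCD ... would consume" -> ~112 W
lpd_w = fed_w * 2   # "1/2 of what an LPD of that size would consume" -> ~28 W
print(f"FED: {fed_w} W, LCD: ~{lcd_w} W, LPD: ~{lpd_w} W")
```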

I still have a CRT television - backlit displays are rubbish for my preferred viewing habits, which tend to involve lots of darkness: sci-fi, fantasy, etc. Any genre where significant screen time is spent in the dark just isn't as good on a display that can't achieve 100% blackness (in a darkened viewing room).

The colour response of CRTs is better also.

For picture quality this is on a direct footing with OLED displays, which are going to be using the same optically-excited phosphor compounds...

How powerful are the lasers being used? If the phosphor wears thin over time, would you have laser radiation burning out your eyes? Perhaps the technology will bring some truth to that old parental adage about sitting in front of the TV too long.

I'm actually in the process of hacking together something similar with a 405nm violet laser pointer, a sheet of glow-in-the-dark material, and a moving mirror. The laser pointer leaves a bright trace on the phosphorescent sheet. My notion was to build a small robot that could write glowing messages as it moved across the glow-in-the-dark sheet.

Anyhow, these guys are apparently working on a full-color version. I think what makes this possible now is the cheap availability of Blu-ray laser diodes with sufficient...

Red and green didn't work? Wrong phosphor. I would expect these guys to be using IR, but whatever it is, the right spectrum will excite the right phosphor.

PS: Display thickness (or thinness) is overrated. DLP is a similar concept, so scanning the beam is not so hard. Sony showed some right-angle CRTs a while back that were 3/4" thick, and I expect not too much angle is needed if you focus the beam carefully, which can be done, as we do in various optical players.

Yay, sounds exactly like a CRT, just with a laser instead of an electron beam. But the flickering would obviously still be there. Or if not, the refresh rate would be bad.

I don't see this becoming a hit with me. I can still use my CRT. That one at least has a flexible resolution... And the colors also still blow any LCD I have seen out of the water. (I have to note that this is not your average CRT, though. It cost about $7200 when it was new; I bought it cheap on eBay.)

If it is new, it is unfortunate not only to reuse an acronym, but reusing one in the same domain.

There are only 17,576 three letter acronyms. We've been warning people for years of the need to upgrade to TLAv6, which allows for a wider range of three letter acronyms, including punctuation and numbers as well as unicode support. But many major buzzword providers have refused to upgrade. The last unique TLAs will be depleted within 18 months in our field. Thanks to AAT (Acronym Address Translation), there are already far more TLAs than there are available spaces -- we've been using CIAR (Classless Inter-Acronym Routing) to separate namespaces based on subject matter and field, but it's only a matter of time before even that fails.
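For the record, the math checks out (a quick Python sanity check; "TLAv6" is, of course, hypothetical):

```python
from string import ascii_uppercase

# 26 letters, 3 positions: the legacy TLA namespace.
print(len(ascii_uppercase) ** 3)   # 17576

# A hypothetical TLAv6 alphabet with digits bolted on is barely better:
print((26 + 10) ** 3)              # 46656
```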
