spoco2 writes, "As reported in major news outlets yesterday in Australia (The Age, the Herald Sun), a new television technology has been developed which is touted (by the developers) as far and away superior to both plasma and LCD. From The Age: 'With a worldwide launch date scheduled for Christmas 2007, under recognisable brands like Mitsubishi and Samsung, Novalux chief executive Jean-Michel Pelaprat is so bold as to predict the death of plasma. "If you look at any screen today, the color content is roughly about 30-35 per cent of what the eye can see," he said. "But for the very first time with a laser TV we'll be able to see 90 per cent of what the eye can see. All of a sudden what you see is a lifelike image on display."' The developing company, Arasor International, is said to be listing on the Australian stock exchange shortly."

Hype, shmype - I saw this on last night's news, and watching the plasma vs laser demo on a standard-def TV, I could see a noticeable improvement in colour and clarity. They've got a definitely promising product, and the manufacturers getting behind them aren't the idiots who buy shares in free, clean, unlimited plasma/fusion/dark-matter energy providers, for instance.

If this TV can demonstrate such a massively wider colour gamut than normal TVs, what were you watching the demo on?

While I'm posting, I'll also call bull on the "quarter of the electricity of conventional plasma and LCD TVs" claim, simply because my LCD already uses a third of the power of my friend's plasma. I'm guessing they're just picking the numbers that make them look good, or they would have said a tenth or more.

The previous poster did in fact respond to what you said. Picture quality is the noticeable difference in the technology; I am sure the manufacturer wasn't comparing energy usage between the two TVs.

The poster's argument was that it may not have been a laser TV beside a plasma. It may have been a poorly configured plasma beside a new plasma, giving the appearance of a new TV technology. If the latter was the case, then one could argue that no such TV exists and hence we have vapourware.

It may have been a poorly configured plasma beside a new plasma, giving the appearance of a new TV technology.

Yes, some entrepreneurs will push the envelope when trying to introduce something new. I used to work at Mitel Corp, which made business telephone systems. After much pre-announcement, we were supposed to roll out our SX-200 at a major trade show. Unfortunately, the software wasn't fully debugged, and so the thing didn't work properly. So Terry Matthews (that's Sir Terry now, of course) went out, bought a NorTel SL-1, and installed it at the back of the booth behind a curtain. They ran cables out to the SX-200, which was to all intents and purposes an empty shell. Everyone thought the SX-200 was fantastic, we got a lot of pre-orders, and when the software was debugged just a few months later, the SX-200 became one of the most successful PBXs of all time.

So there's certainly precedent for the idea of presenting something as a "done deal" while it's still in development. The question is, will the Laser TV actually appear in the market, as the SX-200 did?

I dunno... It sounds pretty reasonable to me. The only difference between Laser and DLP technology is the source of colored light. DLP uses white light through a color wheel to produce the RGB colors. Lasers produce the colors directly, and lasers in all three colors are now commercially available, although expensive (been to ThinkGeek lately?).

Laser TV technology is definitely NOT vaporware. The technology is already here. Now, the claims of quality may be a bit hyped at this moment, but given the intensity possible with laser light, I fully expect the laser TV to be an amazing display when all the bugs get worked out.

That would be true only in the absence of competition. If Mitsubishi and Samsung are both making it, the price should decrease until both companies make what they consider to be an acceptable profit margin. If these displays really do come to market, and really are better than plasma (2 big ifs, I know), it probably will kill plasma.

Unless there is healthy competition driving the price down towards the manufacturing cost. And consumer electronics is a pretty competitive market - although early adopter stuff will probably still have a premium.

Apparently, this guy [com.com] already saw the TV in action and was pretty impressed:

The laser TV made the plasma look like an old console colour TV. It was so good, the only way I could describe it was that it looked like a wet photo in a developer tray - if you haven't done photography, that may not mean a lot. But the colour depth and contrast, especially the space shuttle shots where space was REALLY black, and you could see the gold foil crinkles in the cargo bay, was amazing.

His post is a comment on another news story [com.com] about the technology. Of course, take it with a grain of salt since nothing stops a company's marketing guy from posting as Joe Internet.

Traditional displays can't properly reproduce shiny objects. It has to do with colour reproduction; no amount of resolution will help. Hence TFA mentions that traditional displays are only capable of displaying 30 to 35% of the colours our eyes can see, while the laser display is capable of closer to 90%. Plasmas are better than most in this department, which is why one was chosen for comparison.

If they were completely phony, I doubt they'd be presenting at all the major display technology industry conferences (http://www.novalux.com/company/events.php [novalux.com]), because their exposure to hype-killing doubters would open them to a lot of attacks. And Mitsubishi is really big in projection TV, so it is a clear choice of manufacturing partner to use the laser modules Novalux produces. As for the cost issues, clearly the quickest way to market is to replace conventional display components with this optical front end and modify existing electronics - i.e., a Mitsubishi chassis - to handle the increased bandwidth. It all sounds feasible. Note they are demoing at the SMPTE conference next week; it's not like some Gizmondo handwaving. SMPTE attendees would smell phony a mile off.

Plasma is way overrated. It's expensive on a cost/year basis over the lifetime of the unit, and it's temperature sensitive and pressure sensitive. Where I live, that matters. I live in a mountainous state, and if I wanted to buy a plasma to take into the mountains to a relative who lives there, it ain't happening. I have to buy a plasma rated differently for the altitude (so say Best Buy, Circuit City, and Frys Electronics in the metropolitan area, which have dealt with returns because of people doing exactly that).

I must say I'm not too impressed with the picture quality of the plasma- and LCD TV's we can buy here in the Netherlands. Especially if you take the price into account. I'm glad I bought one of the last CRT widescreen TVs a few years back. My old iiyama CRT monitor is also better than most LCD flat monitors you can buy today. Hopefully this new technology will deliver the colours and the viewing angles we have become accustomed to from CRTs!

Have you looked at LCD TVs or computer monitors lately? The only reason I can think of to choose a CRT monitor is outstanding color accuracy (which I don't need...), or high resolution (personally I much prefer to have 2 lower-res monitors side by side). As for price... TVs are fast coming down in price and computer monitors are already dirt cheap. I paid only slightly more for my new LCD TV (a Sharp) than I did for my last CRT TV, both 28" widescreen ones. Picture sharpness and color quality are similar.

CRTs still have quite a large niche - young/poor people, and have many advantages over LCDs/Plasmas/these new fangled Laser TVs. I am sitting in front of 2 21" CRTs which I picked up off eBay for less than $100AU each; that's much less than a poor quality 15" LCD (~$150AU). The only disadvantage is they are bigger (but who uses the space behind their screens anyway?), heavier (harder to steal, an important factor if you live in a poor area), and use more power (altho

I must say I'm not too impressed with the picture quality of the plasma- and LCD TV's we can buy here in the Netherlands. Especially if you take the price into account.

Same here. I've looked at many lcd and plasma TVs, but none of them look good enough to justify their cost. I'd rather stick with a CRT for now. Plus the CRT I have (non-HD) doesn't have that annoying high pitch coming from it.

Same here. I've looked at many lcd and plasma TVs, but none of them look good enough to justify their cost. I'd rather stick with a CRT for now. Plus the CRT I have (non-HD) doesn't have that annoying high pitch coming from it.

My first question would be: what was the source? Because if the source was non-HD, then certainly no advantage will be evident. My second question is where you checked them out. Usually, in the stores, either the sales staff doesn't know how to set the picture, or they set it on "n

I agree. The best CRTs are very very good, at least until the CRT starts to have problems. However it's rare to see a good CRT these days. I have some old Apple CRT monitors that are exceptionally good, but for every one of those, there were probably a thousand ghastly low end monitors with 60Hz refresh rate, greenish tint, and a convex surface guaranteed to turn any light source into glare no matter how you position them.

The thing about LCDs and plasma is that they are consistent. There's less art to making a decent one or scaling it up in size; it's simply a matter of cost.

Cheap but consistent mediocrity is usually an engineering win. If it can be marketed as "high end", it spells big margins. Think SUV.

I have some old Apple CRT monitors that are exceptionally good, but for every one of those, there were probably a thousand ghastly low end monitors with 60Hz refresh rate, greenish tint, and a convex surface guaranteed to turn any light source into glare no matter how you position them.

Then, of course, there's all those top quality CRT monitors with a 85+ Hz refresh rate, reasonably accurate colors (but not necessarily between identical models from the same manufacturer), and a surface so flat that straight

I purchased one of the last CRT WEGA TVs that Sony made (before the rootkit fiasco). It has a 4:3 aspect tube but it also displays 1080i. The picture doesn't wash out during the time of day when the sun shines into the room where the TV is located -- unlike my LCD computer monitor. The only problems are that it is HUGE, it weighs over one hundred kilograms (most of which is in the front glass of the CRT), and it is made by Sony.

You must have missed the other press release from their sister company stating they have genetically modified sharks with freakin lasers on their heads to be small enough to fit millions of them inside the TV.

Red lasers are the easiest to create of all. The issue is probably due more to the fact that red lasers don't have the same intensity as a similarly powered blue laser, and also to focal differences between wavelengths.

I know you're being sarcastic, but this actually is what I want in a monitor. All the current drivers for LCDs have DACs with only 8bpc, which makes them pretty much unsuitable for doing critical color-correction work. As for frame rate, I'm happy with 24 - though response time of the screen is a serious issue with LCDs -- not so much for my professional work, but as a consumer the lag really bothers me.
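The 8bpc complaint is easy to quantify. A quick sketch (illustrative only, not tied to any particular panel or driver): quantize a smooth gradient at different DAC depths and count the distinct levels that survive.

```python
# Toy illustration: quantize a smooth gradient to different DAC bit depths
# and measure how coarse the resulting steps are.

def quantize(value, bits):
    """Round a 0..1 value to the nearest representable level at `bits` depth."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

def distinct_levels(bits, samples=4096):
    """Count how many distinct output values a full-scale ramp produces."""
    gradient = [i / (samples - 1) for i in range(samples)]
    return len({quantize(v, bits) for v in gradient})

for bits in (6, 8, 10):
    n = distinct_levels(bits)
    print(f"{bits}-bit channel: {n} distinct levels, "
          f"step ~{1 / (n - 1):.4%} of full scale")
```

At 8 bits per channel a full-scale ramp has only 256 steps, each about 0.4% of full scale, which is why banding can show up on smooth gradients in critical work.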

As for content -- I agree, but I think that discussion is orthogonal to this one.

I'd actually prefer that things DID blur when they moved. I find that old clips (news coverage of the Kennedy assassination comes to mind) look MUCH more fluid and natural simply because they have motion blur built in. The strobe effect of the current stuff doesn't cut it.

They plan for this next year; SED has been planning to enter the market for several years, too. The problem for all of them is that some companies like Panasonic are able, through mass production and new factories, to really push the price down for plasma displays.

If they can make screens even flatter and brighter and at a low price, it might have a chance to succeed.

If it is just an expensive, better looking device, it can only survive in a fringe market.

Even if laser tech allows one to see an amazing 99.99% of what their eyes can see... it just won't make a lot of difference.

We have an enormous amount of content in digital RGB, YUV, PAL, NTSC, and movie reel formats. These formats contain only what you can see on an existing TV. Hence a DVD would look as vibrant on a normal plasma as on this laser.

Now of course things are not as simple, since for advertising purposes they'll scale the range up to demo the colors. If they overdo it, though, they'll just skew the picture too much and end up with grotesque results.

There's a point where a tech is just "good enough" and color representation of a *modern* TFT (notice the stress) or plasma is sufficient.

Laser TVs may succeed if one or more of the following are met, though:

- longer life, more durable
- less power consumption
- more portable (?)
- cheaper

I have to disagree. Their claim is that images will be more 'real looking' than ever before. When was the last time you went to a TV store, were walking around, and thought an image on a screen was a real person for a moment? It never happens, even from a distance or under the most confusing conditions, because the colors are just slightly off. If they can do this and this alone, it'll sell the TVs.

They also claim less power consumption and less depth, so it's 'more portable' as well. And cheaper.

But then, they've made a lot of claims without a lot of proof. We'll know if it's vaporware sometime before Duke Nukem Forever is released.

They look kinda suspicious to me. Their page is nothing more than 3-4 template pages touting proud statements like "Industry sources estimate will be huge in 2009".

Their domain doesn't reflect their company name. Worst branding example yet? No sane company would use "lightbit.com" for their official company domain when their name is "arasor".

A normal company might register a promotional domain but won't make that their main domain.

Last but not least, they try to pull it off as if they have a monopoly on laser TV technology, but they actually have a lot of competitors with actual products to show, such as Novalux and Mitsubishi.

You are missing the point entirely. When 99.9% of the existing content is designed with a smaller colorspace in mind, being *able* to display more colors won't make anything look better. We'd need new content to do that.

And why exactly would anyone produce content in more colors with nothing to display it on? It sounds 'chicken or egg', but I think if you look back, the ability to do something always came before the content for that something.

One of the major problems with using lasers for displays is speckle, the random interference patterns that develop as the highly coherent laser beam hits the display screen (whose surface is far from smooth when compared to the wavelengths of laser light used). This greatly diminishes the quality of the display and, more importantly, anyone sitting in front of it for an extended period is likely to get headaches and temporary vision problems.

Extended field trials of the psychophysical effects are needed before such technology is approved by the FDA or an equivalent regulatory organization.

The FDA has control of 21 CFR 1040, which is the US law that regulates lasers. The basic test assumes that the laser emits its light out of a single small aperture and that the collimated beam expands. The cop speed lasers found a trivial way around that test, even though optics that give an equivalent beam at 100 meters wouldn't be allowed. Some lasers are allowed for use in public, but only for about 20 minutes according to that finely worded law.

Web 2.0 emphasises pastel [wordpress.com], deliberately limiting the color content to even less of what the eye can see, so presumably it's doomed. Also Slashdot after its new design.
But I'd love to see this guy's original press release. Did he follow his own theory that people like more color, or was the text black-and-white?

The problem with the extended colour gamut of the new system is that existing source material is based on the sRGB colour space, which encompasses roughly 35% of the eye's gamut. Anything of shorter wavelength than blue (such as spectral violet), many saturated greens and oranges, and most cyans are not available, so the nearest colour is used.

We're all used to this, so when a violet flower is shown as purple (red + blue) on our displays, we don't question it. But try putting a vase of violets next to your TV and you'll see the difference.

Some proper digital photography setups try to improve on the situation using colour profiles, which are simply lookup tables that transform the RGB colours in the file to absolute colour values.

Digital cameras can record colours outside sRGB, so if you ensure your workflow never enforces that constraint, you can end up with a file that can be printed using colours your monitor can't see.

Typically, the input file (usually a raw camera file) is transformed via a device profile (representing the camera's actual spectral response) into a working space (a device-independent space for editing). Whilst editing, the image is viewed using a transform to sRGB (or your display's output profile, if you've calibrated it), but this restriction is for viewing only and doesn't change the file. Then, when you print, the image is converted via a device profile for your printer to print to the extremes of its capabilities - which may exceed sRGB in some colours (e.g. cyan), and be even worse in others (e.g. pure blue).

To make use of this new TV system, we'd need something similar - wide-gamut source material, and device profiles for each set (or simply assume sRGB as default, for backwards-compatibility). Otherwise, it's like listening to music mixed for cheap portable radios (i.e. most current CDs) on a real hi-fi system.

Digital cameras can record colours outside sRGB, so if you ensure your workflow never enforces that constraint, you can end up with a file that can be printed using colours your monitor can't see.

Typically, the input file (usually a raw camera file) is transformed via a device profile (representing the camera's actual spectral response) into a working space (a device-independent space for editing). Whilst editing, the image is viewed using a transform to sRGB (or your display's output profile, if you've calibrated it), but this restriction is for viewing only and doesn't change the file. Then, when you print, the image is converted via a device profile for your printer to print to the extremes of its capabilities - which may exceed sRGB in some colours (e.g. cyan), and be even worse in others (e.g. pure blue).

Most 6 or 7 component inkjets can go well beyond sRGB gamut.

Life stops being simple and nice once you take that step, though. With Adobe RGB, for example, you cannot share any of your images with your friends or print them in commercial shops unless the recipient can handle color profiles properly. XP image preview actually can, but none of the browsers do.

True, you can change the profile, but unless you've got the full Photoshop it means more conversion steps, as the freeware utilities that I'm aware of can only do TIFF and JPG.

The 2nd hurdle is actually getting the photos to print. You have to be able to bypass all Windows color management (which uses sRGB) and use Photoshop (or Photoshop Elements) to print, which needs to have the profiles for your printer AND photo paper for things to work right.

As an end result, you *may* get images of a lagoon or something with deeper hues than your commercial print shop would otherwise print. But how many images like that do "ordinary" people have in the 1st place?

There are even wider gamuts, as Adobe RGB still doesn't surpass what you can see. I think ProPhoto RGB will show all the colors the (reference) eye can see, and in fact quite a lot it can't, since color vision is not nice and linear.

Bottom line is, unless you're absolutely sure what you're doing, stick with sRGB! Going with Adobe RGB or similar will make your photos look WORSE unless the rest of the chain supports it.
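That "looks WORSE" failure mode can be sketched numerically. The matrices below are the published sRGB and Adobe RGB (1998) primaries (to four decimals; treat the exact output values as approximate): a saturated sRGB red re-encoded into Adobe RGB comes out well below 255 in the red channel, so a viewer that ignores the profile and reads the numbers as sRGB shows a visibly duller red.

```python
# Sketch of the "untagged Adobe RGB looks dull" effect, using pure Python.

SRGB_TO_XYZ = [[0.4124, 0.3576, 0.1805],
               [0.2126, 0.7152, 0.0722],
               [0.0193, 0.1192, 0.9505]]
ADOBE_TO_XYZ = [[0.5767, 0.1856, 0.1882],
                [0.2973, 0.6274, 0.0753],
                [0.0270, 0.0707, 0.9911]]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def inverse3(m):
    # Adjugate / Cramer's rule for a 3x3 matrix.
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    return [[(e*i - f*h)/det, (c*h - b*i)/det, (b*f - c*e)/det],
            [(f*g - d*i)/det, (a*i - c*g)/det, (c*d - a*f)/det],
            [(d*h - e*g)/det, (b*g - a*h)/det, (a*e - b*d)/det]]

def srgb_decode(u):     # sRGB transfer function, input 0..1
    return u / 12.92 if u <= 0.04045 else ((u + 0.055) / 1.055) ** 2.4

def adobe_encode(u):    # Adobe RGB uses a plain ~2.2 gamma
    return max(u, 0.0) ** (1 / 2.2)

srgb_red = [1.0, 0.0, 0.0]
linear = [srgb_decode(u) for u in srgb_red]
xyz = mat_vec(SRGB_TO_XYZ, linear)
adobe_linear = mat_vec(inverse3(ADOBE_TO_XYZ), xyz)
adobe_encoded = [round(255 * adobe_encode(u)) for u in adobe_linear]
print(adobe_encoded)  # red channel well below 255: read as sRGB, a duller red
```

Run in reverse, the same math explains the parent's point: correct Adobe RGB data dropped into an sRGB-only chain shifts every saturated colour toward grey.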

Windows has color management support that goes far beyond sRGB. It is capable of doing color space conversions, and its printing subsystem does support this. It's up to the application to specify the source profiles of artwork and to invoke the ICC support to do the conversion. All that said, Windows color management is crap. That's why all the commercial print products, such as Adobe's stuff, disable it. As to the browsers, you are correct. However, this is more a problem of a lack of web standards than a browser problem.

At least I didn't see a pic! I hate when they put up a screenshot of some amazing futuristic HD quality for me to see on my old CRT monitor or in a commercial on TV. I obviously can't view those pictures in their amazing futuristic HD quality... so what do they do? Blur and mute the comparisons.

It's nice that TVs will be able to display 90% of the colors the eye can see, but what about video cameras? Will the technology extend to recording 90% of the color we can see? Currently SD NTSC is, what, under a million colors? What's the point of having a TV that can display more colors if the devices we use to shoot with record fewer?

Now I know HD has a larger color spectrum, but is it 90% of what the human eye can see?

The linked article talks about Arasor International. If you read carefully, the real company behind this innovation is US company Novalux [novalux.com]. Arasor just makes one of the chips.

Novalux has an interesting history [fastcompany.com]. They first wanted to target long-haul telecom with their technology (laser on a chip). As of 2002, they were developing lower-powered lasers for short-haul markets. Their web site [novalux.com] also claims a foray into bioinstrumentation.

I can't believe people still spew that bunk. Current plasmas are good for 60k+ hours (first gen was worse, maybe 15k?). Anyway, 60k hours is 7 years of 24/7 viewing. Another way of looking at it is that a $3k plasma TV costs about $0.05 per hour to watch.

Wow! A company's bigwig claiming their product, not yet shown to anyone, is somehow better than an existing product, that's been out for years, looks great to the average eye, but that somehow, although everybody wants it, has several fatal flaws! And before an IPO!

Seriously, isn't there some restriction on making "forward looking statements" before a stock offering?

There's not much info in the articles... What about SED-TVs [wikipedia.org]? (Surface-conduction Electron-emitter Displays.) They've been on their way for a long time, and now it looks like they're about ready... 100,000:1 contrast ratio, 1ms response time, 450 nits. :-) Check out SED-TV-reviews [sed-tv-reviews.com] and some info from HDTV-solutions [hdtvsolutions.com]... It's interesting stuff - I've heard the image described as very lifelike and just floating in the air. :) Using less power than LCDs and with only 10% degradation in 60,000 hours. It's basically a flat

In the first place, I seriously doubt that there's any meaningful way of measuring the "percentage coverage" of a gamut of colors, since the mapping of colors onto a plane is somewhat arbitrary and there are two very different systems in wide use. I notice that this comparison of Adobe RGB vs. sRGB [cambridgeincolour.com] doesn't try to estimate any "percentages."

Second, if we're talking about something like "area included in the CIE xy plane by thus and such system of reproduction" as a percentage of "area included by the entire spectrum," I seriously doubt that you can get a number anything like 90% with only three primaries. You're still trying to approximate a blobby blunt shape with an inscribed triangle.
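For what it's worth, the 30-35% figure is at least plausible if you read it as an area ratio in CIE 1931 xy. A back-of-envelope check (the spectral-locus sample below is deliberately coarse, so treat the result as a rough estimate, not a perceptually meaningful measure):

```python
# Shoelace area of the sRGB primaries' triangle in CIE 1931 xy, divided by
# the area of a coarse polygon through standard spectral-locus points.

def shoelace(points):
    """Area of a simple polygon given as (x, y) vertices."""
    n = len(points)
    s = sum(points[i][0] * points[(i + 1) % n][1]
            - points[(i + 1) % n][0] * points[i][1] for i in range(n))
    return abs(s) / 2.0

# CIE 1931 xy chromaticities of monochromatic light, sampled coarsely;
# closing the polygon back to 380 nm approximates the purple line.
locus = [(0.1741, 0.0050),   # 380 nm
         (0.1440, 0.0297),   # 460 nm
         (0.0913, 0.1327),   # 480 nm
         (0.0082, 0.5384),   # 500 nm
         (0.0743, 0.8338),   # 520 nm
         (0.2296, 0.7543),   # 540 nm
         (0.3731, 0.6245),   # 560 nm
         (0.5125, 0.4866),   # 580 nm
         (0.6270, 0.3725),   # 600 nm
         (0.6915, 0.3083),   # 620 nm
         (0.7347, 0.2653)]   # 700 nm

srgb = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]  # sRGB/Rec.709 primaries

coverage = shoelace(srgb) / shoelace(locus)
print(f"sRGB triangle covers ~{coverage:.0%} of the visible xy area")  # ~a third
```

Since xy area is not perceptually uniform, the skepticism about "percentage" claims still stands; this only shows where the marketing number likely comes from.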

The article is so vague on details that it's not clear how many primary colors are used. If it uses six primaries instead of three, I'm prepared to believe it could give meaningfully better color than traditional systems. How important that is remains to be seen. HDTV gives obviously, dramatically better picture quality (in terms of resolution) than traditional TV, but it doesn't seem to be setting the world on fire.

The big question, of course, is where one would find program material encoded with more than three primaries; it would need to be specially recorded for this system (requiring new video, broadcast, and optical disk standards).

As usual, the news stories didn't contain any technical info and could preferably have been (almost losslessly) compressed into a headline. So, how do they make the colours? Are there several laser beams of different colours that blend on the screen? Or are the beams exciting some material (like CRT screens) that then show colours?

Heh, a simple laser projector, as I think of it, with a single beam sweeping over the wall would use something like 0% of the visible color spectrum. :-) You know, laser is narr

So we have to wait a year until we can get the "latest and greatest" in picture technology, hm? How will videophiles looking for something to plug their PS3 or their X360 into possibly pass their time until then?

Laser TV has existed for a long time, using argon (blue, green) and krypton (red) lasers as a white light source (either mixed gas or two lasers). The color is chosen using an AOM or a PCAOM (see a patent for laser TV at: http://www.freepatentsonline.com/6426781.html [freepatentsonline.com] ).

The new breakthrough is that we have diode-pumped solid-state (DPSS) lasers, specifically high-power DPSS; you should be familiar with the 532nm green laser pointers. The green is achieved through frequency-doubling a 1064nm infrared DPSS laser. Red lasers need not be frequency doubled, because diode lasers can be manufactured at that wavelength directly and are available in higher power ranges. Blue DPSS lasers were also developed (pumped with 808nm infrared diodes and frequency-doubling the 946nm line); the power available is still really low (and I can't wait to rip apart a Blu-ray drive to get the laser out!) and the lasers are extremely expensive. Hopefully, with greater production of blue lasers, the prices will go down.
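The frequency-doubling arithmetic above is simple to sketch: second-harmonic generation (SHG) doubles the optical frequency, which halves the wavelength of the lasing line (the 808nm diode is the pump, not the doubled line).

```python
# Second-harmonic generation arithmetic for common DPSS lasing lines.

C = 299_792_458  # speed of light, m/s

def freq_thz(wavelength_nm):
    """Optical frequency in THz for a wavelength in nm."""
    return C / (wavelength_nm * 1e-9) / 1e12

def shg_nm(wavelength_nm):
    """SHG doubles the frequency, i.e. halves the wavelength."""
    return wavelength_nm / 2

print(shg_nm(1064))              # Nd 1064 nm line -> 532 nm green
print(shg_nm(946))               # Nd 946 nm line -> 473 nm blue
print(f"{freq_thz(1064):.0f} THz doubles to {freq_thz(532):.0f} THz")
```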

The next issue to deal with: in the U.S. (I don't know Australian law), lasers are regulated by the FDA, and any laser over 5mW that exposes the public to radiation has to have an FDA variance to legally operate. I am wondering how this TV would be classified. I really would prefer a solid-state DPSS laser projector to replace easily broken, expensive-to-maintain LCD projectors. If you need more information about this technology, Sam's Laser FAQ and the guys at alt.lasers are nice and answer questions.

CRTs are traditionally analogue, and as such are capable of reproducing many more shades of certain colours than are perceptible by the human eye. LCD/Plasma displays traditionally have at *least* 18-bit DACs which is not enough to avoid visible colour banding - granted. And that's got nothing to do with the display technology (LCD/Plasma/CRT/etc) - as I understand it, that is simply a limitation of the DAC. I don't know what current standards are but I would be surprised to find that current DACs are gener

I think they are indeed talking about color range (frequency range) rather than intensity. Classic screens only produce 3 discrete colors (red, green and blue) in varying intensities. The sensitivity of the receptors in the eye covers a wider band (that's why you can see laser light that doesn't exactly match the peak sensitivity of your receptors). Maybe this new technology produces light with bandwidths that match the sensitivity of the eye's receptors better?

Yeah, I think it's to do with purity of the component colour frequencies. Maybe current technologies produce, for example, a red which would look like a bell curve on a frequency graph instead of a sharp peak, meaning less faithful representations of those component colours. Maybe the grass really is greener on the other screen:P

It's actually the fact that, at a constant intensity, the color gamut (visible hues) isn't triangular - it's only approximately so, and curved. With any number of colors, all you can get is a linear combination, which, at a constant intensity, ends up being a convex polygon. So with three, you can impose a triangle of color over the sort-of-triangular gamut. The more colors you can combine to make the vertices of the polygon, the better coverage you get. I'm not sure what this has to do with a laser display,

Good link. The main bit of relevant information in there is that lasers are able to produce more saturated (read: pure) colours. Would it seem, rather, that the near 3-fold increase they are talking about is the ratio of the areas of the two shapes in this [wikipedia.org] graph? So it's not all about brightness then...
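A rough version of that area comparison, using three hypothetical laser wavelengths sitting on the spectral locus (illustrative picks, not the wavelengths Novalux actually uses):

```python
# Compare the CIE xy triangle spanned by three locus-bound laser primaries
# against the sRGB/Rec.709 triangle.

def tri_area(p1, p2, p3):
    """Area of a triangle from three (x, y) points (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1*(y2 - y3) + x2*(y3 - y1) + x3*(y1 - y2)) / 2

# CIE 1931 xy chromaticities of the chosen monochromatic primaries
laser = [(0.6915, 0.3083),  # 620 nm red
         (0.0743, 0.8338),  # 520 nm green
         (0.1440, 0.0297)]  # 460 nm blue
srgb = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]

ratio = tri_area(*laser) / tri_area(*srgb)
print(f"laser triangle is ~{ratio:.1f}x the sRGB triangle's xy area")
```

With these picks the ratio comes out around 2x; deeper red and blue lines push it higher, which is presumably where a near 3-fold claim would come from.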

I'd expect that many people, like me, are so used to subconsciously compensating for the inadequacies of normal displays that they hardly see the deficiencies compared to real life. I'm looking forward to se

If you look at a light source consisting of a single wavelength of light (monochromatic light), you will see one colour from the rainbow of visible colours. Interestingly, the human eye can be fooled into seeing the same colour by creating an additive mixture of three different colours of light. You might think the mixture needs to contain the same wavelength as the monochromatic light, but in fact by varying the proportions of the three different colours in the mixture, it is possible to create a mixture that appears identical to the eye.
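That matching trick is just a 3x3 linear solve. A toy sketch (the Gaussian cone curves and laser wavelengths below are made up for illustration, not real LMS data): find primary intensities whose summed cone responses equal the target's.

```python
# Toy metamerism demo: match a monochromatic target with three primaries.
import math

def cone(peak_nm):
    """Hypothetical cone sensitivity: Gaussian centred on peak_nm."""
    return lambda nm: math.exp(-((nm - peak_nm) / 40.0) ** 2)

CONES = [cone(565), cone(535), cone(445)]   # toy L, M, S curves
PRIMARIES = [630.0, 532.0, 465.0]           # toy R, G, B lasers (nm)

def responses(nm):
    """Cone responses to unit-intensity monochromatic light at nm."""
    return [c(nm) for c in CONES]

def solve3(A, b):
    """Cramer's rule for a 3x3 system A x = b."""
    def det(m):
        return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
              - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
              + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))
    d = det(A)
    def col_replaced(k):
        return [[b[i] if j == k else A[i][j] for j in range(3)]
                for i in range(3)]
    return [det(col_replaced(k)) / d for k in range(3)]

target_nm = 580.0                            # monochromatic "yellow"
target = responses(target_nm)

# Matrix column j holds the cone responses to primary j at unit intensity.
A = [[responses(p)[i] for p in PRIMARIES] for i in range(3)]
weights = solve3(A, target)

mix = [sum(w * responses(p)[i] for w, p in zip(weights, PRIMARIES))
       for i in range(3)]
print("target cone responses: ", [round(t, 4) for t in target])
print("mixture cone responses:", [round(m, 4) for m in mix])
```

Note that the solved weights can come out slightly negative, which is the mathematical signature of a spectral colour falling outside the three-primary gamut.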

The new laser TV display is different because each pixel is created by light from a tunable laser.

I was wondering about that! It didn't seem feasible to me (given my limited knowledge of the technology) that they would've been able to "tune" a laser's frequency rapidly enough to scan the entire display. That's many millions of different "frequencies" per second! That's exactly what I was hoping for until I read TFA, which didn't seem to mention that at all.

Interesting. Surely it only requires 1 wavelength per type of cone receptor in the eye? I am aware that the cones really respond to all visible wavelengths, but with different ranges of sensitivities for each of the 3 types. I remember what the sensitivity/response graphs look like for each type of cone, and how they overlap - but it still seems like as long as you're using any small number of discrete frequencies to reproduce an image, it will still be a rough approximation. You need a whole bunch of wavel

Actually, there are up to four frequencies that eye cones can be tuned to: the fourth one is tuned to orange (see here [personales.upv.es]), and appears in about 32% of the population. If you add the rods, tuned to yet another frequency (between blue and green), five frequencies would probably be needed to present colours that efficiently cover the eye's range.

But, presumably, since the signal once decoded contains RGB, not instructions on how to tune the laser, all they have is RGB to work with, unless the TV standard is modified to allow for something better than RGB. The only way they know what colour to make the pixel is from an RGB input anyway - so they are stuck with the limitations of RGB.

But this particular product is a television, not a computer display. The colour of each pixel on a television is controlled by chrominance signals [wikipedia.org]. Chrominance spans the entire u,v (for PAL TV) or i,q (for NTSC TV) colour spaces. This is one reason why chrominance is a useful way of representing colour.
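The luminance/chrominance split described above is easy to sketch. A minimal version using the BT.601-style luma weights and the common PAL-style U/V scaling factors (a simplified textbook form, not a full broadcast encoder):

```python
# Luma/chroma split: grey inputs carry zero chrominance, which is what made
# colour broadcasts backward-compatible with black-and-white sets.

def rgb_to_yuv(r, g, b):
    """R'G'B' in 0..1 -> (Y', U, V) with BT.601 luma weights."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

print(rgb_to_yuv(1.0, 1.0, 1.0))  # white: Y' ~ 1, U = V ~ 0 (no chrominance)
print(rgb_to_yuv(1.0, 0.0, 0.0))  # pure red: strong positive V component
```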

The new laser TV display is different because each pixel is created by light from a tunable laser.

I strongly doubt that. The laser frequency depends mostly on the laser medium. This is why most tunable lasers are dye lasers: the dye (solution) can be replaced with a different one that gives a different laser frequency. And you can't replace the dye within the few ms it takes to light a pixel.

Probably they use 3 laser diodes in primary colors to create an RGB image on a white phosphor screen. The lasers can be modulated in an analogue way, so it will have better intensity dynamics than LCD.

Also, the pixels will be sharper, because you don't need 3 phosphor colors and a mask (one pixel instead of RGB sub-pixels). Using mirrors, they can fold the optical path and create thin TVs.
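
If the lasers really are modulated in an analogue, linear-light fashion, the gamma-encoded video component would need decoding to linear intensity first. Assuming sRGB-style encoding (an assumption on my part; the article says nothing about this), the decode step looks like:

```python
def srgb_to_linear(c):
    """Decode an sRGB-encoded component (0..1) to linear light intensity,
    e.g. for driving an analogue-modulated laser diode.

    Standard sRGB transfer function: a linear toe below 0.04045, a
    2.4-exponent power curve above it.
    """
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4
```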

The quote is vague but I can only assume they're referring to the sRGB color space [wikipedia.org] which is what TVs today will display. So yes, it's very possible that this new TV can actually display over twice as many colors as common ones (actual hues, not just intensities). Unfortunately, from what I understand, if this TV used a different standard, then it wouldn't be backward compatible (e.g. if you plugged your cable into a laser TV the colors would be very distorted, because the signal provider expects the display to use the sRGB gamut).
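
As a rough sanity check on the "twice as many colors" figure, you can compare the area of the chromaticity triangle spanned by the sRGB primaries with that spanned by hypothetical monochromatic laser primaries (the coordinates below correspond to 630 nm, 532 nm and 467 nm lasers). Triangle area in CIE xy space is a crude proxy - xy isn't perceptually uniform - but it gives the right order of magnitude:

```python
def triangle_area(p1, p2, p3):
    """Shoelace area of the triangle spanned by three (x, y) chromaticities."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

# sRGB primaries in CIE 1931 xy chromaticity coordinates.
srgb = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
# Monochromatic (laser-like) primaries at roughly 630/532/467 nm.
laser = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

ratio = triangle_area(*laser) / triangle_area(*srgb)
# ratio is roughly 1.9: the laser primaries span nearly twice the xy area.
```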

Why not try forming an opinion on it based on things they've actually confirmed and denied?

Half the weight and size of a plasma TV. Uses a quarter of the power to the same effect. Increases the range of colours displayed from 30% of what we are able to perceive to 90%. Costs half the price of a plasma screen.

"Oh, but they never said whether or not they support these three completely random display connectors so obviously it's a waste of time."

"Half the weight and size of a plasma TV. Uses a quarter of the power to the same effect. Increases the range of colours displayed from 30% of what we are able to perceive to 90%. Costs half the price of a plasma screen."

What, and you believe that?

It costs half the price of a plasma? Yeah, I'll believe that when I see it. You really think if this tech actually works they'll sell it at that level? No. Better picture - more expensive. Smaller/lighter - more expensive. Combine the two... get ready to mortgage your house.

Manufacturing cost has nothing to do with it - things are *not* sold for what they cost to produce. They are sold for what people are prepared to pay.

Incorrect. Things are sold at a price to maximize profits. As price goes up, you'll attract fewer people to buy your product. These guys don't have a monopoly on televisions, so people will just buy something else if it's too expensive. I just bought a new TV and didn't even consider the HDTV sets because they were just too expensive. I could have afforded it, but it wasn't worth the price to me.

HDRI generally involves more bits per pixel, or a different breakdown of the color and intensity to effectively get the equivalent. Since this TV will get the same input as everything else, at least as far as the technology is concerned I don't think they're talking about HDRI per se. Instead, I suspect it's more an issue of contrast ratios: the brightness level between the brightest white and the darkest black. There are plenty of TVs that can't show dark scenes well because they wash out in bright or dark areas.
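
Back-of-envelope: the gamma coding of the signal already stretches the representable contrast well past the naive 255:1 of linear 8-bit coding, so the display's own contrast ratio, not the signal, tends to be the limiting factor. A sketch, assuming a plain gamma-2.2 transfer (an assumed simplification):

```python
GAMMA = 2.2
BITS = 8
levels = 2 ** BITS - 1          # 255 code values above black

# Linear-light intensity of the darkest nonzero code, relative to full white.
darkest = (1 / levels) ** GAMMA

contrast_ratio = 1 / darkest    # brightest representable / darkest nonzero
# roughly 200,000:1 with gamma coding, versus only 255:1 for linear coding
```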

This TV will use most of the same technology that already exists. Check out http://en.wikipedia.org/wiki/Lcos [wikipedia.org] and http://en.wikipedia.org/wiki/DLP [wikipedia.org]. I haven't seen a major revolt against DLP due to lorry traffic yet. All they are changing is the light source from a lamp to a laser. Now, you can assume that in order to generate the same image brightness, the same amount of energy has to hit the screen whether it comes from a laser or a lamp. However, ALL of the laser's energy is used on the screen, as opposed to a regular lamp, which loses a lot of energy as heat through radiation in directions other than towards the screen. With all that, I'd argue that a laser based TV would generate a lot less heat than one with a lamp.
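
The heat argument is easy to put numbers on, though the efficiencies below are assumptions for illustration only, not measured figures for any real product:

```python
# Back-of-envelope comparison with ASSUMED efficiencies: suppose a
# projection lamp delivers ~10% of its input power as usable on-screen
# light, while a directional laser source delivers ~30%.
screen_watts = 5.0          # optical power wanted on the screen (assumed)

lamp_efficiency = 0.10      # assumption, not a measured figure
laser_efficiency = 0.30     # assumption, not a measured figure

lamp_input = screen_watts / lamp_efficiency     # 50 W in
laser_input = screen_watts / laser_efficiency   # ~16.7 W in

lamp_heat = lamp_input - screen_watts           # ~45 W wasted as heat
laser_heat = laser_input - screen_watts         # ~11.7 W wasted as heat
```

Under those assumed numbers the laser source draws roughly a third of the lamp's input power for the same on-screen brightness, and sheds a quarter of the heat.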