Basically they strobe the backlight so it only turns on after the pixels have finished changing state, giving you a film-style experience, only at an extremely fast rate. It's pretty ingenious and I'm surprised I didn't hear about it until I did some investigating on the VG248QE (which I'm still waiting for word on before buying one). Keep in mind this is only a perception trick, but since the refresh is so fast it has almost no side effects.

There's also a high-speed video showing the results of enabling it. This is an Nvidia-only feature, but I'm sure we'll get an AMD version as soon as this takes off and they catch wind of it.

1. If I'm correctly understanding how this feature works, the claim that it does not introduce lag isn't entirely true. Since the backlight is turned off when the pixels start changing, and doesn't turn on until they've settled, there will be a *slight* increase in effective lag (equal to the length of time that the backlight stays off). The whole point is to hide the period of time where the pixels are changing from the user; this means the user doesn't see the pixels change until after they have all *finished* changing; by definition that is going to introduce lag (albeit a very small amount).

2. Since the backlight is off for part of the time, the maximum effective brightness level will be slightly reduced. But unless you're using the display in a very brightly lit environment where you would want to max out the display brightness, this probably won't matter.
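Both side effects can be put into rough numbers. This is a back-of-the-envelope sketch with illustrative assumptions (a 120Hz refresh and a ~2ms strobe), not measured values from any spec:

```python
# Back-of-the-envelope model of a strobed backlight's two side effects:
# added lag (point 1) and reduced maximum brightness (point 2).
# The refresh rate and strobe length below are illustrative assumptions.

def added_lag_ms(refresh_hz, strobe_ms):
    """Worst-case extra latency: the backlight stays dark from the start
    of the refresh until the strobe fires near the end of the frame."""
    frame_ms = 1000.0 / refresh_hz
    return frame_ms - strobe_ms

def max_brightness_fraction(refresh_hz, strobe_ms):
    """With the backlight lit only during the strobe, peak effective
    brightness scales with the lit fraction of each frame (ignoring any
    LED current boost the monitor may apply to compensate)."""
    frame_ms = 1000.0 / refresh_hz
    return strobe_ms / frame_ms

# 120Hz refresh with a ~2ms strobe:
print(round(added_lag_ms(120, 2.0), 2))            # roughly 6.33ms worst case
print(round(max_brightness_fraction(120, 2.0), 2)) # ~0.24 of full brightness
```

So on these assumed numbers the lag penalty is a handful of milliseconds, well under one 60Hz frame.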

Any LightBoost-certified product with an Nvidia card will work with this. I'm unsure about this being talked about for a long time... it only came up in December. LightBoost has been around for a while, but it was just part of Nvidia's 3D package and was never used in this form.

Yeah, having the backlight off will introduce some lag, but people have only started talking about using it this way with really low response time monitors. The VG248QE for instance has a 1ms GTG response time.

Benq did something similar to this with their monitors a few years back (it was something about inserting a blank between frames to improve the perception). Or at least they were talking about it. Not everyone liked the effect IIRC.

It sort of amuses me that it was claimed that CRTs have no motion blur. They have an instant response time, but motion blur could definitely be noticed (although I'd assume really nice phosphors would dim faster).

ChronoReverse wrote:Benq did something similar to this with their monitors a few years back (it was something about inserting a blank between frames to improve the perception). Or at least they were talking about it. Not everyone liked the effect IIRC.

Actually, every LCD TV that advertised "240Hz" a couple of years ago was actually doing that: pulsing the backlight at double (or even quadruple) the actual refresh rate in an effort to fool the viewer into thinking there was more going on than there was (and since the 120Hz frames were really just interpolated from the actual 60Hz data, they were piling fakery atop exaggeration). I don't see 240Hz advertised much anymore so I don't think it was a winning pitch.

Good question... I'm not entirely sure what causes the strobing effect either. I assume it's built into Nvidia's 3D technology, but it has something to do with the monitor too. This particular trick only seems to work at 120Hz, which is in sync with the backlight.

Maybe I missed it, but I've never seen motion blur on a CRT. The only thing that could be in regard to is the low refresh rate... but there are CRTs that went upwards of 100Hz, sometimes into 120 (although those were towards the end of the CRT's heyday and on expensive models). Most would do 75 or 85 at their recommended resolution.

Meadows wrote:How much will this reduce the lifespan of the screen (the backlight at least)?

LEDs don't care if they're pulsed on and off. Shouldn't affect the backlight at all.

UberGerbil wrote:I don't see 240Hz advertised much anymore so I don't think it was a winning pitch.

Yeah, that's because the new hot marketing pitch is 600Hz plasma screens. Not much point saying yours runs at 240Hz when someone else is saying "But mine goes to 600!"

Bensam123 wrote:Maybe I missed it, but I've never seen motion blur on a CRT. The only thing that could be in regard to is the low refresh rate... but there are CRTs that went upwards of 100hz, sometimes into 120 (although those were towards the end of CRTs hay day and on expensive models). Most would do 75 or 85 at their recommended resolution.

CRT displays used phosphors with short persistence, specifically to minimize motion blur. Longer persistence phosphors were used for specialized applications like oscilloscopes and radar screens, where having the image linger for a while was a feature instead of a bug!

Since LCDs rely on a physical process to change the pixels (molecules in the LCD cells physically deform to change their optical properties) it has been difficult for manufacturers to get the response time down into the range of a short persistence CRT phosphor.

CRTs are also scanned, so in a sense they inherently behave like LCDs using this Lightboost trick. By the time the next frame comes around, the image from the previous frame has already faded out. That's why CRTs exhibit pronounced flicker at lower refresh rates.

I actually ended up ordering a VG248QE, as it seems amazing enough to tip the scales towards a new monitor purchase, even though it's still 1080p. I'll have to report back here with my findings. I don't have an Nvidia card though, so I can't test LightBoost.

There is a pretty big thread on the monitor on Overclockers for anyone that is interested in this model. It has garnered a lot of attention from gamers. There is also a BenQ built off the same 1ms TN panel. This monitor was the best item at CES this year IMO.

Kinda sad that I have to 'settle' for a monitor. Ideally I'd want 1440p, glossy, 1ms response, zero input lag, and a 144Hz refresh. You could throw IPS on there too, but that just isn't going to happen. It should be pretty easy for them to offer a glossy model too. I hate matte screens. The only reason I'd ever use one is for mobile applications (such as a laptop for going outside) or in a room where I can't control light sources and just seeing the information is enough.

ChronoReverse wrote:Benq did something similar to this with their monitors a few years back (it was something about inserting a blank between frames to improve the perception). Or at least they were talking about it. Not everyone liked the effect IIRC.

UberGerbil wrote:Actually, every LCD TV that advertised "240Hz" a couple of years ago was actually doing that: pulsing the backlight at double (or even quadruple) the actual refresh rate in an effort to fool the viewer into thinking there was more going on than there was (and since the 120Hz frames were really just interpolated from the actual 60Hz data, they were piling fakery atop exaggeration). I don't see 240Hz advertised much anymore so I don't think it was a winning pitch.

Some corrections.

(1) Proper scanning backlights only flash once per frame. For example, a 1/120sec flash during a 60Hz frame (same perceived motion blur as sample-and-hold 120fps@120Hz), or a 1/960sec flash during a 240Hz frame (same perceived motion blur as a theoretical sample-and-hold 960fps@960Hz). This is explained at Science & References.

(2) They still market them -- see Existing Technology. They've just been told not to advertise them as "Hertz" because it's a misleading term. Instead, they use different terminology such as "Clear Motion Rate" or "Motionflow", which is still common in many higher-end HDTVs.

(3) They actually have real benefit. Download the PixPerAn motion test and try it on a LightBoost monitor with an Nvidia card: a PixPerAn readability test score of 30 on the ASUS VG248QE, and a PixPerAn chase test of 1 pixel apart. These are motion test scores that were unheard of for LCD until LightBoost came along.
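On point (1), the equivalence is simple arithmetic: perceived motion blur tracks how long each frame is lit, so a single flash of length t per frame looks like sample-and-hold running at 1/t. A quick sketch of the two examples quoted above:

```python
def equivalent_sample_and_hold_hz(flash_seconds):
    """A single backlight flash of length t per frame gives roughly the
    same perceived motion blur as a sample-and-hold display at 1/t Hz."""
    return 1.0 / flash_seconds

# The two examples from point (1):
print(equivalent_sample_and_hold_hz(1 / 120))  # 1/120s flash during a 60Hz frame
print(equivalent_sample_and_hold_hz(1 / 960))  # 1/960s flash during a 240Hz frame
```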

ChronoReverse wrote:Benq did something similar to this with their monitors a few years back (it was something about inserting a blank between frames to improve the perception). Or at least they were talking about it. Not everyone liked the effect IIRC.

This is very true; BenQ had this (AMA-Z) back in 2006. It flickered like mad and reduced motion blur by only about 30% or so. However, LightBoost is _vastly_ superior, because:
-- It runs at a 120Hz refresh, so flicker is not a problem for most people
-- It eliminates 90%+ of motion blur compared to a 60Hz LCD
-- Today we have panels that have nearly no crosstalk between refreshes (a requirement for 3D)
-- It's the first strobe backlight that has made some CRT die-hards put away their CRTs

Since people are talking about LightBoost here: it was originally designed for 3D (to reduce crosstalk between eyes), but it also allows elimination of motion blur. Some people have put away their Sony FW900 CRTs now -- because it looks just like a CRT in motion (even if not in color). Quotes from OCN:

Vega wrote:Gaming on this monitor is a pleasure as far as motion clarity is concerned. As a FW900 aficionado, this monitor with the right settings can have just as clear of motion. While the FW900 does have superior image quality, you also have a smaller image (22.5" versus 24"). Using NVIDIA driver 313.96, enabling Lightboost has been a fairly painless experience (although as some others have found out there is a bug in which under certain circumstances your computer will start pausing and behaving extremely sluggishly when adjusting 3D settings). Interestingly enough, the monitor seems to like to stay "stuck" in LB mode, even after adjusting settings in the control panel. This is actually a boon for those of us that bought this monitor for 24/7 LB mode like myself.

Romir wrote:Thanks for the timely review, Vega. I went ahead and opened mine and WOW, it really does feel like my FW900. I haven't tried a game yet, but it's downright eerie seeing 2D text move without going blurry.

Transsive wrote:Then yesterday I, for some reason, disabled the 3D and noticed there was no ghosting to be spotted at all in Titan Quest. It's like playing on my old CRT.

Inu wrote:I can confirm this works on the BenQ XL2420TX. EDIT: And OMG I can play Scout so much better now in TF2, this is borderline cheating.

Terrorhead wrote:Thanks for this, it really works! Just tried it on my VG278H. It's like a CRT now!

Bensam123 wrote:Good question... I'm not entirely sure what causes the strobing effect either. I assume it's Nvidias 3D technology and it's built into that, but it has something to do with the monitor too. This particular trick only seems to work at 120hz, which is in sync with the backlight.

AFAIK the strobing effect is caused by the PWM circuitry used to modulate the brightness of the backlight. The further the monitor is from 100% brightness, the more pronounced the strobing. Nvidia's technology just syncs the backlight strobe to vsync.
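That relationship is easy to sketch. The PWM frequency below is a hypothetical number for illustration (real monitors vary and rarely publish it); the point is that the dark gap per period grows as the brightness setting drops:

```python
def dark_gap_ms(pwm_hz, duty_cycle):
    """PWM dimming sets brightness via the duty cycle (the lit fraction
    of each PWM period). The rest of the period is dark, and that dark
    gap is what reads as strobing."""
    period_ms = 1000.0 / pwm_hz
    return period_ms * (1.0 - duty_cycle)

# Hypothetical 180Hz PWM backlight:
print(round(dark_gap_ms(180, 1.0), 2))  # 100% brightness: no dark gap at all
print(round(dark_gap_ms(180, 0.4), 2))  # 40% brightness: ~3.33ms dark each period
```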

My concern is this potentially takes us back to the days of CRT. 60Hz was totally unacceptable then, because it resulted in lots of flicker and tired eyes, so much so that the minimum refresh rate considered "acceptable" was about 72 or 75Hz. I ran my old Samsung Syncmaster at 1280x1024 and 85Hz but improvements were perceptible all the way up to 120Hz.

On a side note, anyone who says there's no discernible improvement above 60Hz obviously didn't live through the CRT era. Back then, every computer screen was proof that the eye does see way above 60Hz.

jihadjoe wrote:On a side note, anyone who says there's no discernible improvement above 60Hz obviously didn't live through the CRT era. Back then, every computer screen was proof that the eye does see way above 60Hz.

Yup. The only reason 60Hz looks OK *now* is because LCDs remain "on" until the next frame comes in instead of fading out like CRT phosphors do, so there's no flicker. The downside (as we've been discussing in this thread) is smearing/blurring since the pixels don't switch instantaneously.

I have a BenQ XL2420T, the best gaming purchase I've ever made. 120Hz is amazing and makes 60Hz feel like slow motion in comparison (I had to go back to my 60Hz screen for a few weeks while I RMA'd the first one I got). You can really feel a huge increase in fluidity, especially if you're able to get 120fps in the games you play.

jihadjoe wrote:AFAIK the strobing effect is caused by the PWM circuitry used to modulate the brightness of the backlight. The further the monitor is from 100% brightness, the more pronounced the strobing. Nvidia's technology just syncs the backlight strobe to vsync.

Yes, a specially motion-optimized PWM. But it is surprisingly complicated, because the monitor electronics must work hard to erase pixel persistence before the next refresh.

Today we have LCD panels that successfully erase pixel persistence before the next refresh. (Photo credit: NCX review of the VG236H.)

And also this one from a BenQ XL2411T (1ms) with LightBoost. (Photo credit: imgur via overclockers.co.uk; the photo looks the same in repeated attempts!)

You can only see faint remnants of pixel persistence if you look closely; it's not noticeable at arm's-length viewing distance (near-zero 3D crosstalk on 1ms monitors). It only recently became possible to time the backlight strobes to bypass pixel persistence, ghosting, coronas, and RTC overshoots. All of that is now successfully kept in the dark (99.5%+ of the pixel transition complete on the VG248QE and XL2411T -- these two monitors have near-zero 3D stereoscopic crosstalk). The backlight strobes when the image is clear. This is good for both 3D (crosstalk between eyes) and 2D (zero motion blur).

You need to precisely fine-tune the pixel response-time acceleration because you need to transition pixels as fast as possible, but you also need to hide overshoot artifacts (coronas) when the backlight strobe flashes, so you see neither ghosting nor coronas. When I enable LightBoost on my BenQ XL2411T, all ghosting, all blur, and all coronas disappear.

It's been a complex science for LCDs to tune pixel persistence to finally reach this point. The critical "true CRT zero blur" point only occurred around 2012, superseding all past half-hearted attempts at strobed backlight technologies.

Enabling LightBoost at 120Hz looks like a "simulated 500Hz" due to the black frame insertion effect (like a CRT). This is due to the approximately 2 millisecond strobes (1/500sec strobes occurring 120 times per second, with a completely black period between strobes). PixPerAn tests show over 75% less motion blur than regular 120Hz. There's far less motion blur at 120Hz with LightBoost than at 144Hz without it. Some people who said "wow" going from 60Hz to 120Hz have said "WOW" again going from 120Hz to 120Hz (LightBoosted) when playing on a GTX 680 in games like TF2, BF3, and Quake Live. Fortunately, if you don't like 120Hz flicker, you can turn it off. These monitors are still great general-purpose 120Hz monitors, even with LightBoost off.
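The "over 75% less blur than regular 120Hz" figure falls straight out of the sample-length ratio. A sketch of the arithmetic, assuming the ~2ms strobe quoted above:

```python
def blur_ratio(strobe_ms, baseline_hz):
    """Motion blur is proportional to how long each frame stays lit, so
    a strobe of strobe_ms vs full-persistence frames at baseline_hz
    reduces blur by the ratio of the two lit durations."""
    baseline_sample_ms = 1000.0 / baseline_hz
    return strobe_ms / baseline_sample_ms

ratio = blur_ratio(2.0, 120)  # 2ms strobe vs plain 120Hz (8.33ms samples)
print(round(1.0 - ratio, 2))  # fraction of blur eliminated: 0.76
print(round(1000.0 / 2.0))    # "simulated Hz" of a 2ms strobe: 500
```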

morphine wrote:Okay, stupid question: this only ever works in 120Hz monitors, right?

It is a feature only found in 3D monitors, originally designed to make images brighter during 3D operation. However, it has the side effect of eliminating motion blur. You can also use 100Hz, as LightBoost is supported during 100Hz operation too. Currently, it is not supported below 100Hz.

Monitors confirmed to have zero motion blur with the LightBoost strobe backlight: ASUS VG248QE*, BENQ XL2411T*, ASUS VG278H, ASUS VG278HE, BENQ XL2420T, Samsung S27A950D and Acer HN274H. *The best-performing LightBoost monitors are the 1ms monitors, because of reduced crosstalk between refreshes. The 1ms VG248QE has been found for under $300 on some online sites, so that's the one I'd get since it's more easily available in North America than the XL2411T.

It's important to note that this is still a perception trick and it's not 'actually' eliminating all motion blur. We're still quite a ways off from a display that can make an image flow like water, which would be downright awesome.

Some of this stuff TR really should be looking into. They were pushing IPS panels hard for the colors, but stuff like this makes for a super fluid experience. I've heard that even though the Catleaps overclock to 120Hz, you still end up with a garbled mess due to the response time of the pixels themselves (which makes sense). Refresh rate is only part of the equation.

I was actually hoping my next monitor purchase would be a plasma monitor or an SED, but that never happened. I would've most definitely bought a plasma monitor. SED development sadly ceased, and plasmas are prohibitively expensive to make in a smaller form. But monitors in general are more expensive than their TV counterparts, so it makes me wonder if there is actually a market for them, especially when you drop $300 on a gaming monitor that definitely doesn't give you IPS colors. You could get both good colors and a fluid experience out of a plasma.

If I knew anything about monitor development that would be a great idea for a kickstarter.

Bensam123 wrote:It's important to note that this is still a perception trick and it's not 'actually' eliminating all motion blur. We're still quite a ways off from a display that can make a image flow like water, which would be downright awesome.

The point is that the blur occurs while the backlight is off, so you can't actually see it. We're basically trading off blur for flicker; with fast enough frame rates the flicker will not be perceptible. Sounds like we're nearly there.

I would disagree, but I'm sure you know that. A perception trick isn't the same as hardware actually capable of producing fluid motion in such a manner. You could apply the same technology to a 60Hz monitor... It's also why filmmakers were whining about higher-than-24fps coming to theaters... (people can actually see and tell what's happening on the screen, so they need to do a better job of hiding special effects and CG).

I'm not saying LB is a bad thing... it's just not the same as a legitimate display technology that provides fluid motion at all times.

I'm holding out (still) for OLED monitors to come to maturity (/sensible prices). It looked like the new era was about to dawn 6-5-4-3-2 years ago, and it slips back every time, but they're finally becoming commonplace in smartphones, and there are finally some genuine 55" commercial products on the market.

I don't like the idea of LCD trickery to improve motion perception, and I sure don't like the idea of TN panels with their crappy colour reproduction being the only way to get a remotely fast response screen.

Non-facts-backed-up rumblings place OLED at a 0.01ms response time (yes, I just looked at Wikipedia), which pretty much means any frequency required, and the pixels should change almost instantly. I don't know if the limitation would then come down to the control circuitry or some such, but at least it won't need to physically deform crystals anymore.

I've disliked LCD since day one, due to the fact that it always seemed like a technology that was battling relentlessly against physical properties that were inherently detrimental to image quality. It emits white light everywhere, and uses its technology to attempt to selectively block/filter it, which it can never manage completely (contrast ratio) or entirely consistently (viewing angles/uniformity/backlight bleed/etc). It tries to display fast motion by physically deforming a light blocking medium that doesn't want to move quickly, and has to use endless hacks to force it to move quicker than it wants to (overdrive, etc). Lightboost just seems like another hack to add to the pile. An effective one maybe, and I am consistently surprised by new advances manufacturers continue to wring out of LCD tech to this day, but I reeeally think it's time to just leave it to die and put the relentless R&D effort into something better.

It seems to me it would have been so much more conducive if all the last 2 decades of LCD research had been put into OLED instead, a technology that works by applying a charge to a medium, which emits light in response - couldn't be simpler or more natural. Instead the industry took 10 years trying to drag itself back to a point it had already reached with CRTs, and to this day has never quite got there.

just brew it! wrote:We're basically trading off blur for flicker; with fast enough frame rates the flicker will not be perceptible. Sounds like we're nearly there.

Please be careful with how much flicker you introduce though. Perceptible flicker frequency varies greatly from person-to-person, and often reaches several hundred Hz. Imperceptible flicker (i.e. faster than perceptible flicker) can continue to have neurological effects even into the multiple kHz range.

Mainly, I just want to emphasize that while using techniques like Lightboost might reduce LCD blurring effects, there are users who greatly prefer blurring over flickering. If you haven't seen the TFTCentral article here, I'd recommend it (disclaimer: I wrote some of the article).

Just switching from 60Hz to 120Hz won't reduce LCD blur, but it can reduce the effects of some issues like RTC overshoots and input lag.

just brew it! wrote:

Bensam123 wrote:I would disagree, but I'm sure you know that. A perception trick isn't the same as hardware actually capable of producing fluid motion in such a manner.

So you're not going to be happy until we have GPUs and displays capable of infinite frame rate? Not gonna happen.

120Hz would be a good start, though I'd greatly prefer 240Hz (I can dream).

Bensam123 wrote:I'm not saying LB is a bad thing... it's just not the same as a legitimate display technology that provides fluid motion at all times.

Which, as JBI notes, means a display that has infinite frame rate.

There isn't going to be a display technology that provides "fluid motion" as you seem to define it, because so long as computers are making discrete frames we'll always be relying on a "perception trick."

The only relevant question is whether or not it's below the threshold of human perception. What you seem to be saying is essentially magic.

GrimDanfango wrote:It seems to me it would have been so much more conducive if all the last 2 decades of LCD research had been put into OLED instead, a technology that works by applying a charge to a medium, which emits light in response - couldn't be simpler or more natural. Instead the industry took 10 years trying to drag itself back to a point it had already reached with CRTs, and to this day has never quite got there.

Speaking as someone who vividly remembers the weight and desk space required by CRTs, particularly the halfway decent and larger ones, no thanks. I think you forget just how nasty some lower-end CRTs could truly be, i.e., the ones that were actually affordable for general-purpose users. LCD research and production hasn't taken anything away from OLED research; the problem is that OLEDs rather effectively resist the longevity improvements required for all-day-every-day use. The first person to figure out how to get at least a 30,000 hour life out of an OLED display without appreciable color degradation will be filthy rich, so the economic incentive is there.

just brew it! wrote:We're basically trading off blur for flicker; with fast enough frame rates the flicker will not be perceptible. Sounds like we're nearly there.

Please be careful with how much flicker you introduce though. Perceptible flicker frequency varies greatly from person-to-person, and often reaches several hundred Hz. Imperceptible flicker (i.e. faster than perceptible flicker) can continue to have neurological effects even into the multiple kHz range.

Bensam123 wrote:It's important to note that this is still a perception trick

Flicker is no more a perception trick than CRT impulses or SED impulses. It's the same modulation of photons to the human eye, at the end of the day. No difference.

I am an associate member of the Society for Information Display (SID.org) and have been studying its academic papers; I have a good basic understanding of how flicker eliminates motion blur regardless of technology -- CRT, plasma, LCD, SED, or OLED. Every display engineer agrees.

Also, you need to study up on the science of sample-and-hold displays versus impulse-driven displays. See the "Academic References" section in Science & References.

GrimDanfango wrote:I'm holding out (still) for OLED monitors to come to maturity (/sensible prices). It looked like the new era was about to dawn 6-5-4-3-2 years ago, and it slips back every time, but they're finally becoming commonplace in smartphones, and there's finally some genuine 55" commercial products on the market.

I have bad news... OLED is no good for motion blur if it's the sample-and-hold variety (e.g. PS Vita). Notice how motion blur on the PS Vita isn't any better than on a good 2ms TN LCD? (The colors ARE much better, but we're strictly talking about motion blur here.) The good news is that it's possible to impulse-drive (flicker) an OLED or Crystal LED -- which is what the good ones do to eliminate motion blur.

Sorry to be the bearer of bad news. The only way to eliminate flicker *AND* keep sample lengths short (e.g. 1ms) is to have 1000fps@1000Hz -- something that's not going to happen unless motion interpolation is used (and motion interpolation is not game-friendly). This is why people say a CRT at 60fps@60Hz (1ms samples) has less motion blur than an LCD at 120fps@120Hz (8.33ms samples). Sample lengths dictate motion blur. (Note: sample length is not pixel persistence. Sample length is the length of time that a refresh is visible to the human eye. LCD refreshes are generally static for the whole refresh.)

Yes, I prefer OLED over LCD, assuming equal motion blur ability (e.g. equal strobe length). The colors on OLED are so much better.

Last edited by mdrejhon on Fri Feb 08, 2013 4:19 pm, edited 6 times in total.

just brew it! wrote:We're basically trading off blur for flicker; with fast enough frame rates the flicker will not be perceptible. Sounds like we're nearly there.

Please be careful with how much flicker you introduce though. Perceptible flicker frequency varies greatly from person-to-person, and often reaches several hundred Hz. Imperceptible flicker (i.e. faster than perceptible flicker) can continue to have neurological effects even into the multiple kHz range.

Which is why I said "we're nearly there" instead of "we're there".

Correct, sir. Flicker is a tradeoff versus motion blur.

I'll use scientific terminology here: the sample-and-hold effect can only be eliminated by shortening frame sample lengths.

Sample lengths (the length of time a stationary refresh is illuminated) can only be reduced by two methods:
(1) More samples. Higher "fps=Hz"! Extra frames and extra refreshes, either natively or interpolated (e.g. LCD).
and/or
(2) A black period between samples, regardless of how it's done (e.g. CRT phosphor decay, plasma pixel modulation, DLP pixel modulation, backlight control, black frame insertion, SED pulses, OLED pulses). It's _all_ exactly the same perception trick from the human eye's perspective: the black period between refreshes. Note that you need at least a 10:1 ratio (dark 90% of the time) for 90% motion blur elimination.
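Both methods shrink the same quantity. A minimal sketch of the arithmetic (my framing of the two methods above, with illustrative numbers):

```python
def sample_length_ms(refresh_hz, lit_fraction):
    """Sample length = frame period x the fraction of the frame that is
    actually lit. Method (1) raises refresh_hz; method (2) lowers
    lit_fraction by inserting a black period."""
    return (1000.0 / refresh_hz) * lit_fraction

print(round(sample_length_ms(120, 1.0), 2))  # plain 120Hz sample-and-hold: ~8.33ms
print(round(sample_length_ms(240, 1.0), 2))  # method (1), 240Hz: ~4.17ms
print(round(sample_length_ms(120, 0.1), 2))  # method (2), 10:1 dark ratio: ~0.83ms
```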

Example: High Speed 1000fps YouTube Video of CRT Scanning -- a 'point' (pixel) on a CRT is dark roughly 90% of the time. All CRTs do that. Every single television tube ever made. Every single monitor tube. Any tube that does a raster. Can you believe our human eyes turn that scanning mess into a perfectly clear image? It's an amazing perception trick of the human vision system. Yes, CRT was a perception trick. The real world does not flicker like a CRT.

With an oscilloscope and a photodiode, I have measured the sample lengths of a LightBoost LCD to be 2.4 milliseconds (about 1/500sec). It was also confirmed to have 3 times less motion blur than a 144Hz sample-and-hold display. On a CRT, the sample length of a pixel is approximately 1 to 2 milliseconds (the phosphor illuminate-and-decay cycle).

Excluding other variables (eye tracking inaccuracies, pixel persistence which is now bypassable, source-generated motion blur), motion blur has already been scientifically shown to be directly linearly proportional to sample length. E.g. a sample length of 6ms always has twice as much motion blur as a sample length of 3ms.
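That linear relationship is easy to express: during each sample the eye keeps tracking the moving object while the displayed image stays put, so the smear width is just speed times sample length. A sketch with arbitrary example numbers:

```python
def blur_width_px(speed_px_per_s, sample_length_ms):
    """Perceived smear: the eye tracks at speed_px_per_s while the image
    holds still for sample_length_ms, smearing it across that distance."""
    return speed_px_per_s * sample_length_ms / 1000.0

# An object panning at 1000 pixels/second:
print(blur_width_px(1000, 6.0))  # 6ms sample: 6.0px of blur
print(blur_width_px(1000, 3.0))  # 3ms sample: 3.0px, exactly half
```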

Also, here are my measurements from my oscilloscope, with a photodiode pressed against the LCD in LightBoost mode:

PixPerAn (a motion test) also easily shows motion blur elimination directly proportional to the reduction in sample length -- exactly the same motion blur behavior as CRT flicker. To the human eye, it is exactly the same 'perception trick' regardless of how the flicker is accomplished. Even the university and TV manufacturer research agrees (scroll down halfway for the relevant links).

Just as with all display technologies, the CRT is an artificial invention; the real world is not scanned/flickered in a sequential raster scan. It's all a perception trick. Human vision is essentially an analog system that does not have a "frames per second". Dividing moving objects into a flipbook of discrete images is an artificial invention necessary for recording and playing back motion, because there's no way to record motion images without dividing them into separate/discrete frames that are played back in sequence (e.g. 24fps, 60fps, etc). If we call LCD with LightBoost a perception trick, then it is total nonsense not to call CRT a perception trick as well.

That said, it's a great and wonderful perception trick -- an amazing electronic implementation of the zoetrope principle -- and when you truly think about it, it is impressive (from a vision research perspective) that our human eyes can turn a flickery, scanny CRT into a continuous motion image with blur-free results. We're just used to not calling it an amazing parlour trick, because CRT raster scanning is the way television has been done for more than 60 years.

Feel free to ask me any questions!

Last edited by mdrejhon on Fri Feb 08, 2013 4:17 pm, edited 1 time in total.