Yep. The math is simple. You don't need calculus or differential equations or even probability theory. Just simple multiplication and division. Fun? To each his/her own.

Why is 2.35 21/9? 21/9 is 2.33333, etc. Most widescreen stuff is actually not even 2.35 but 2.38 or 2.39. For the Sony discussion, your choices are really 1.78, 1.85, and 1.89 for chip illumination. Those choices on the Sony are labeled Normal, 1.85, and 2.35 (go figure); that last one should be 1.89.

For the math-challenged: setting the Sony aspect to 1.85 gives you a width of 1.85 x 2160 horizontal pixels, or 3996 of the maximum available 4096. This means you will lose only 100 pixels total, or 50 on each side. Many movies are actually 1.85 aspect anyway, and this is a good fit for a 1.85 aspect screen. If you feed it a native 1.78 (HD) aspect and zoom to fill your screen, you will only lose a very small amount of image above and below. Really, it won't be noticeable.
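
If anyone wants to play with these numbers, the whole calculation fits in a few lines of Python (a rough sketch; the 4096 x 2160 panel size is Sony's published spec, and the aspect list is just for illustration):

```python
# Rough panel-illumination math for a 4096 x 2160 chip (Sony's published spec).
PANEL_W, PANEL_H = 4096, 2160

for aspect in (1.78, 1.85, 2.35, 2.39):
    # Width the image needs at full panel height; wider-than-panel aspects
    # simply fill the full width (and lose height instead).
    used_w = min(PANEL_W, round(aspect * PANEL_H))
    lost = PANEL_W - used_w
    print(f"{aspect}:1 -> {used_w} px wide, {lost} px unused ({lost // 2} per side)")
```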

There are lots of choices and small compromises regardless of what one does. The probable best solution involves getting a screen with variable side masking.

Well, that certainly is a poor excuse for continuing their erroneous ways. Treating a fraction as the same thing as a two-decimal figure is wrong except where it gives you the exact aspect ratio. For HD, the aspect is 1920/1080, or 1.7777 etc.; 17/9 is the exact same number. Exact. So feel free to use 17/9 regardless of the continent one is on. It's even OK to round 1.7777 to 1.78, although that is not exact. There is no fraction that will give 1.85 exactly; we just use that and don't approximate it with a fraction. There is no exact fraction that will give one 2.35, so Yanks, or North and South Americans, just use 2.35 and don't approximate it with a fraction. You are trying to defend the indefensible. I even know one European who bet Cubanos that UHD Blu-ray will be out this year, and even he doesn't use 21/9 instead of 2.35.

First thing, Mark, get the arithmetic right: 16/9 = 1.777777... (1.78 for short); 17/9 = 1.888888... (1.89 for short). The former is HD (or UHD), and the latter is 4K.
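
Anyone can verify this at a Python prompt, or on any calculator for that matter:

```python
>>> 16 / 9
1.7777777777777777
>>> 17 / 9
1.8888888888888888
>>> 21 / 9
2.3333333333333335
```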

Any experience or guidance on playing back native 4K video?
I didn't really consider it for a while, but now, with at least some demo videos (like YouTube) and pictures available, I'm wondering: what does a PC need to play 4K content?

To answer your question, it all depends on how you want to decode the video. There are software decoders that can be used, which means you'll need a beefier (modern) CPU. Most quad-core Intel CPUs released within the past two or three years should suffice and have enough IPC performance to handle most 4K video, though that will depend on what encoding system was used (H.264 vs. H.265 or something else) and the bitrate. To stay on the safe side I'd recommend buying an nVidia GPU and using LAV's video decoder. That decoder has the option to do hardware decoding using the nVidia card's CUDA cores. Any $200+ nVidia GPU should have enough horsepower to do the decoding with ease. Plus you'd also want to use madVR as the video renderer, and that also requires a decent GPU for some of its processing. I don't think AMD's DXVA hardware decoder supports 4K video decoding, so that option is off the table. nVidia all the way!
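
If you want to gauge what you're up against before buying hardware, one quick check is the file's codec, resolution, and bitrate. Here's a minimal sketch using ffprobe (part of FFmpeg, which you'd need installed; the file name is just a placeholder):

```python
import json
import subprocess

def video_info(path):
    """Return codec, resolution, and bitrate of the first video stream via ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,width,height,bit_rate",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"][0]

info = video_info("demo_4k.mp4")  # placeholder file name
print(info.get("codec_name"), f'{info.get("width")}x{info.get("height")}',
      info.get("bit_rate", "n/a"), "bits/s")
```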

I probably should have added what kind of information I'm looking for.
I have two PCs, both with an i5: one with the onboard GPU and one with a GeForce 660 Ti.

Is it as simple as connecting the HDMI port, and 4K at a maximum of 30 Hz just works?
Or is dual-link DVI required, or DisplayPort to HDMI?

Interesting read, Mark; thanks for posting it. I seem to see, though, that most BDs that I plop into the player say '2.40' on the back of the container, but I suppose this is within the 'error bars' of 2.35.

Have you or someone else tried that?

-Roland

The 660 Ti will be able to output a 4K image at up to 30 Hz. That's just displaying an image, though; you still need to worry about decoding the video. For stuff uploaded to YouTube your CPU should handle it perfectly fine, but with other, higher-bitrate material I can't say it will work 100% of the time. Which i5 CPU do you have? For downloaded material I would look into LAV's video decoder and use the hardware-based decoding option; that way you won't have an issue.
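
If you're not sure exactly which i5s those are, something like this will print the CPU model (a convenience sketch; the wmic call is Windows-only, and you can get the same info from System properties):

```python
import platform
import subprocess
import sys

# Print the CPU model so you know which i5 you're dealing with.
if sys.platform == "win32":
    # wmic ships with Windows and reports the marketing name of the CPU.
    print(subprocess.run(["wmic", "cpu", "get", "name"],
                         capture_output=True, text=True).stdout.strip())
else:
    print(platform.processor())  # less descriptive on some platforms
```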

According to Wikipedia, the most common aspect ratios for films shown in commercial theaters are 1.85 and 2.39.

And most of the time '2.35' is actually 2.40 = 24/10.

Since you want to argue over the last decimal point, you may want to consider that HD and UHD displays are specified to have square pixels. Then:
1920/1080 = 1.77777 (so 16 x 9 is correct mathematically), but the Sony projector's native resolution supports the digital cinema 4K standard, which is 4096/2160 = 1.896, and that is not exactly 17/9.
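
For anyone following along at home, Python's fractions module makes the 'exact' part of this easy to check:

```python
from fractions import Fraction

print(Fraction(1920, 1080))                     # 16/9 -- HD/UHD reduces exactly
print(Fraction(1920, 1080) == Fraction(16, 9))  # True
print(Fraction(4096, 2160))                     # 256/135 -- DCI 4K does not reduce to 17/9
print(Fraction(4096, 2160) == Fraction(17, 9))  # False
print(float(Fraction(4096, 2160)))              # ~1.8963, vs 17/9 ~1.8889
```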

As for 'scope movies, older titles are typically 2.35:1 while newer titles are typically 2.39:1, but on Blu-ray discs most scope movies are framed at either 2.35:1 or 2.40:1, per the small print on the back of the jackets I have looked at.

Doing a little more research, the SMPTE standard is actually 2.39 to 1. Blu-ray printed jacket specs round this to 2.40 to one, but the actual widescreen standard is 2.39. On a 14 ft (168 in) wide screen, that difference works out to only about 0.7 inches of image width, or roughly 0.35 inches on either side. A tiny bit of top and bottom overscan would result in no area unlit on a true 2.40 screen.
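
Here is that arithmetic spelled out, so anyone can plug in their own screen size (plain Python, nothing fancy):

```python
# How much width does a 2.39:1 image give up on a 2.40:1 screen?
SCREEN_W_IN = 14 * 12              # 14 ft wide screen = 168 inches
screen_h = SCREEN_W_IN / 2.40      # = 70 inches tall

image_w = screen_h * 2.39          # 2.39:1 image shown at full screen height
shortfall = SCREEN_W_IN - image_w
print(f"{shortfall:.2f} in total, {shortfall / 2:.2f} in per side")
# -> 0.70 in total, 0.35 in per side; a touch of zoom (top/bottom overscan)
#    lights the whole 2.40 screen.
```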

Mark is 100% correct. The jackets are wrong all the time. If you were to load a Blu-ray onto your PC and crop a still image from many of the films you think are exactly 2.35 or 2.40 to 1 (because it says so on the jacket), you'll often find that they are neither aspect ratio.

The problem with being 100% right most of the time is the load it creates and the impossibility of carrying it. People expect you to be 100% right all the time, and that is nigh impossible. When one steps into the batter's box or onto the golf tee box, one knows that the probability of getting a hit is far less than 50%, and the chance of hitting a hole in one on a par 3 or a short par 4 is extremely small. Remember, being 100% right is one large step for Mark, except in a very limited number of areas.

So, I buy a VW600ES and less than a week later three used ones show up in the classifieds section. What gives? Did I not get the abandon ship memo? Did something better suddenly come along at that price point?

Cool. Will it work with any RF-style glasses, or just the specific Sony ones? That's what I'm trying to figure out. I prefer the ones with arms that are 'thick', to block out some side light.

My research showed that only the Sony TDG-BT500A active 3D glasses work with the VW600ES. Neither B&H Photo nor Amazon actually lists the VW600ES as compatible, but referring to Sony's own website:

So is "being 100% right most of the time" better or worse than being 90% right all of the time, or perhaps frequently 10% wrong?

While three used ones hit the market, how many out there did not? And how many people took delivery of new ones?

It's just a chattel, a thing to buy that some think will make them happy. An expensive this or that, and for some it turns out that it doesn't. So they sell it. Need the money. Whatever. Some buy things just so they can be the first and grab fame here on AVS. Then they sell early to minimize the loss, or to maximize the gain, before their post-CEDIA purchases. They have an older fallback projector, and that will hold them over until then. Then again, for me it's the summer, and my projector doesn't get used until the first real NFL football game. I am not selling, but I could, and I wouldn't even know it was gone until September.