Topic: Do you have a 4K display?

I just want to get an idea of how widespread 4K displays are at the moment.

Do you think 4K video formats will become more widespread with the upcoming generation of DSLR bodies, or do you think manufacturers will only include it in top-of-the-line models (EOS 1Dc) or more dedicated video cameras (e.g. the EOS C~ line)?

Lastly, do you think the Canon EOS C~ line will formally expand into a product like, for example, an EOS 7Dc?

I actually think that cameras have partly driven 4K adoption; we've been seeing new 4K recording devices coming out of the woodwork this year, and even the Galaxy Note 3 can record 4K. If I had a graphics card with DisplayPort I would have already ordered the new Dell UP2414Q, but since I'm due for a system overhaul next winter, I'll wait until then.

If I were to get married today I would insist on my ceremony being recorded in 4K resolution.

I am currently using a 3K (2560x1440) display to type this post, and I look forward to picking up a 4K display when they drop below $1,000. Ideally it would be 31.5 inches or larger; the Sharp PN-K321 sells for $3,299 today.

As for 4K UHDTVs, I see myself picking one up when downloadable content is available in 4K or when there is a data storage format that supersedes the 2K Blu-ray Disc.

Another possible condition for getting a 4K UHDTV is when Sony and Microsoft release their "slim" models of the PS4 and Xbox One in, say, 4-6 years.

I am one of the few guys who aren't really interested in getting a video console this soon. During the last console war I waited until the first price cut to get one, the reason being that in the first year of a console's life the games tend to be half-baked.

I remember someone asked me a few years ago why I was recording all my videos in 1080, and I replied that technology only goes in one direction: back then broadband speeds were still quite slow and 720 was more popular on the net, yet just a short time later we are talking about 4K.

I agree with Dolina that if it were an important event like a wedding, I would, if possible, like it to be recorded in 4K to future-proof it as much as possible, but I think mainstream 4K will still take a while to catch up.

A lot depends on the TV companies; some countries are faster than others to deliver full HD broadcasts. The Full HD channels I have look great on my big TV, but the rest of the channels look bad.

I could even see myself skipping 4K and going for whatever is after it.

In terms of the practicality of working with such high resolutions... 4K RAW is quite data-intensive in terms of both storage and read/write speed. The H.265 codec will hopefully be released soon and promises a useful improvement over H.264, which will help reduce storage requirements for compressed UHDV, but compressed video is not as edit-friendly. Are there any new storage-media developments that will make working with uncompressed UHDV more tolerable? At the moment it seems very much like a pain in both the neck and the pocket.
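
To put rough numbers on it, here is a quick Python sketch. The figures are illustrative assumptions (12-bit RAW at 4096x2160 and uncompressed 10-bit 4:2:2 UHD, both at 24 fps), not any specific camera's spec sheet:

def data_rate_gbps(width, height, bits_per_pixel, fps):
    # Raw video bit rate in gigabits per second
    return width * height * bits_per_pixel * fps / 1e9

def hours_per_terabyte(rate_gbps):
    # Recording hours that fit on 1 TB (8,000 gigabits) at a given rate
    return 8000 / rate_gbps / 3600

raw = data_rate_gbps(4096, 2160, 12, 24)    # 12-bit Bayer RAW, assumed
ycbcr = data_rate_gbps(3840, 2160, 20, 24)  # 10-bit 4:2:2 = 20 bits/pixel

print(f"4K RAW: {raw:.2f} Gbit/s, ~{hours_per_terabyte(raw):.2f} h per TB")
print(f"UHD 10-bit 4:2:2: {ycbcr:.2f} Gbit/s, ~{hours_per_terabyte(ycbcr):.2f} h per TB")

At roughly 2.5 and 4 Gbit/s respectively, a terabyte holds well under an hour of footage, which is exactly the storage and read/write pain described above.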

Cable TV in the Philippines has offered 720p since 2009, so I do not see a purpose in upgrading to 4K at the moment. Don't worry, I also have much newer 40-inch and 46-inch 1080p HDTVs, so we aren't totally backwards here.

2014's sub-$1,000 4K displays will most probably use lower-end panels, and I did mention that I want a display larger than 31.5 inches, correct? At the pixel density of the current 27-inch iMac, a 4K display would need to be roughly 40 inches diagonal.
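
The arithmetic behind that estimate, as a quick Python check (assuming the 27-inch iMac's 2560x1440 panel and a 3840x2160 UHD target):

import math

imac_ppi = math.hypot(2560, 1440) / 27  # ~108.8 ppi on the 27-inch iMac
uhd_diagonal = math.hypot(3840, 2160) / imac_ppi
print(f"{imac_ppi:.1f} ppi -> {uhd_diagonal:.1f}-inch diagonal for UHD")  # ~40.5 inches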

I'm after quality. If I weren't, Amazon is selling a 50-inch Seiki 4K UHDTV for less than $770.

Hopefully by the time the "slim" Xbox One and PS4 come out, quality 4K UHDTVs will sell for under $2,000. Maybe by then these "slim" updates will come with an optical disc drive that accepts 4K content.

Other than resolution, the other motivations for me to upgrade would be weight and power consumption. Power consumption is pretty much self-explanatory, since electricity rates only ever go up, but weight? It has been my dream to mount a display on the ceiling above my head. If the display were almost as light as an acoustic board, it would be possible to do.

My dentist wanted to do that with the HDTV in his office so his patients could watch TV while he mucks around in their mouths, but the contractor forbade it.

I don't do video, so I'm not concerned about capturing or displaying it in any particular resolution. And even though we have a couple of Blu-ray players in the house (for their web apps and DLNA capabilities), we don't own or rent any Blu-ray discs; the up-scaling of DVD content is more than good enough for us.

I may look into 4K displays for my digital darkroom work if they've become more mainstream and affordable by the time I need to replace my current displays, if ever (they're all less than two years old). In the meantime, my Dell UltraSharp 1920x1200 displays are very much up to the task for the foreseeable future.

I work for a very large US motion picture rental company with global operations, so let's talk 4K.

Canon, Sony, Red (they also have 5K), and Blackmagic make 4K cameras, and they will be joined by Phantom in the new year. The most popular camera in Hollywood is the Arri Alexa, a 3.5K camera that outputs 2K, and most movies in theaters are 2K, NOT 4K. The 4K cameras don't actually output 4K, and most don't employ lossless compression, but that's another story.

The real issue is that with a 4K TV, to see 4K you need to sit considerably closer to the screen than with 2K (1080p/i) at the same screen size, and the majority of broadcast content is NOT 4K but 2K, even if it was shot on a 4K camera. 8K cameras are in the pipeline, and theoretically you would need to sit even closer to the screen to get the benefit. So is this a case of technology over common sense? Yes and no. Downsampling from an 8K or 4K file would give cleaner images after compression, allowing for the concatenation of codecs through the broadcast pipeline down to 2K, but a pure 4K file will still require a closer viewing distance for a given screen size to get the benefit. The same applies to a movie theatre.
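
To make the viewing-distance point concrete, here is a small Python sketch using the conventional benchmark that 20/20 vision resolves about one arcminute (the 65-inch screen size and the one-arcminute figure are illustrative assumptions, not industry specs):

import math

def max_benefit_distance_ft(diagonal_in, h_pixels, aspect=16/9):
    # Farthest distance (feet) at which one pixel still subtends one
    # arcminute; sit farther back and the extra resolution is wasted.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_in = width_in / h_pixels
    return pixel_in / math.tan(math.radians(1 / 60)) / 12

for label, px in [("2K/1080p", 1920), ("4K UHD", 3840), ("8K UHD", 7680)]:
    print(f"65-inch {label}: benefit out to ~{max_benefit_distance_ft(65, px):.1f} ft")

On a 65-inch screen, the full benefit of 1080p holds out to roughly 8.5 feet, 4K only to about 4.2 feet, and 8K to barely 2 feet: the "sit considerably closer" problem in numbers.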

20/20 vision is by definition average. I'm a little better than that, but when I say that the difference between 1080p and 4K is blatantly obvious, I have confidence that it will be just as obvious for the average person reading this.

In 5-10 years, when we start talking about 8K (hopefully sooner rather than later), it will be the same discussion all over again, and I will tell you that the difference between 4K and 8K, at the same distance you use your 1080p TV right now, will be blatantly obvious (as long as the content you're looking at is actually good quality).

Beyond 8K things get a little fuzzy, but by my own measurements I could potentially use a 30-inch desktop monitor (3-foot viewing distance) with as much as 16,000x8,000 resolution before individual pixels start blending into the inherent signal noise in my eyes. I think 8K would be a good resolution for the industry to settle at, with maybe the odd 64- or 128-megapixel screen made for special people like me.
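
For comparison, the angle each pixel subtends on that hypothetical monitor is easy to compute in Python (assuming a 16:9 30-inch panel at 36 inches; the one-arcminute line is the standard 20/20 benchmark, which I'm claiming to beat):

import math

def arcmin_per_pixel(diagonal_in, h_pixels, distance_in, aspect=16/9):
    # Angle subtended by a single pixel, in arcminutes
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    return math.degrees(math.atan(width_in / h_pixels / distance_in)) * 60

for px in (2560, 7680, 16000):
    print(f"{px} px wide: {arcmin_per_pixel(30, px, 36):.2f} arcmin per pixel")

A 2560-wide panel lands almost exactly on the one-arcminute benchmark, while 16,000 pixels across works out to about 0.16 arcminutes each, far beyond ordinary acuity.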

Another clarification should be made about the type of content being viewed. There are ideal situations for taking advantage of extra detail and not-so-ideal ones. You will see extra detail best in high-contrast still images. Conversely, low-contrast video with lots of motion is sometimes so bad the whole thing almost looks washed out. That, it seems to me, is where you get people saying they can't see the benefit of higher resolutions: if you're looking at an image already devoid of detail, then of course it's not going to look any better.

It may be that much of the content the average person looks at doesn't contain a whole lot of extra visual information, but that certainly doesn't mean people have to get bigger TVs or sit closer to their screens to take advantage of extra detail when it is present.

I have also encountered people who can't see the difference out of sheer ignorance. One time I tried to point out all the jaggies on screen to a friend of mine. His response was that he didn't know what they were, so it didn't bother him.