Intel quietly revealed last week that its next-generation Ivy Bridge processors will support the 4K display resolution, with up to 4096 x 4096 pixels per monitor, potentially paving the way for Apple to introduce high-resolution "Retina Display" Macs.

The world's largest chipmaker announced the news during a technical session at its Intel Developer Forum in San Francisco, as noted by VR-Zone. Ivy Bridge chips will rival competing discrete GPUs by including support for the 4K resolution when they arrive next year.

The company also highlighted a Multi Format Codec (MFX) engine that is capable of playing multiple 4K videos at once. The codec can also handle video processing for 4K Quad Full HD (QFHD) video, a standard that YouTube began supporting last year.

A set of performance enhancements, with special attention to graphics, should give Ivy Bridge as much as a 60 percent performance boost over the current generation of Sandy Bridge chips, according to Intel.

Intel also revealed last week that Ivy Bridge chips will include support for OpenCL, the parallel-computing standard originally developed by Apple, which should give a performance boost to next-generation MacBook Air and 13-inch MacBook Pro models when they arrive in 2012.

If Apple were to introduce a 4K resolution display with the 16:9 ratio currently used in its Thunderbolt Display, iMac and MacBook Air products, the resulting resolution would be 4096 x 2304. A 27-inch display with 4K resolution would sport a pixel density of 174 pixels per inch. Assuming a working distance of 24 inches and 20/20 vision for the calculations, a 4K 27-inch iMac or Thunderbolt display would count as a "Retina Display."
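The numbers above are easy to verify. Here is a minimal sketch using the common rule of thumb that 20/20 vision resolves about one arcminute of detail, so the "Retina" threshold density scales as 1 / (distance × tan(1 arcminute)); the function names are illustrative, and the 24-inch working distance is the article's own assumption:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density of a display from its resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

def retina_threshold_ppi(distance_in):
    """Minimum density at which a 20/20 eye (one arcminute of acuity)
    can no longer resolve individual pixels at this viewing distance."""
    one_arcminute = math.radians(1 / 60)
    return 1 / (distance_in * math.tan(one_arcminute))

density = ppi(4096, 2304, 27)         # ~174 ppi, matching the article
threshold = retina_threshold_ppi(24)  # ~143 ppi at a 24-inch desk distance
print(density > threshold)            # True: a 4K 27" panel would qualify
```

At 12 inches the same formula gives a threshold of about 286 ppi, which is consistent with the iPhone 4's 326ppi display being marketed as Retina.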

Apple first began using the "Retina Display" marketing term with the iPhone 4 last year. Then-CEO Steve Jobs touted the 326ppi display as being beyond the capabilities of the human retina when viewed from a distance of 12 or more inches.

In September 2010, the company released a Retina Display iPod touch. Rumors have also swirled that Apple will follow suit with a high-resolution version of the third-generation iPad, doubling the resolution of the tablet to 2048 x 1536.

Of course, Macs that take full advantage of the 4K resolution capabilities built into future generations of Intel's chips would take some time to arrive, as Apple will need to resolve price and production constraints before releasing a Retina Display desktop or notebook. But, 3200 x 2000 desktop wallpapers were discovered in a Developer Preview of Mac OS X Lion earlier this year and appear to telegraph a future resolution bump for Apple's line of Mac computers.

Also of note, Apple added 4K support to its Final Cut Pro video editing program when it released version X in June. However, Final Cut Pro X has caused a controversy, as some users have complained that the application is no longer "pro" software.

I am looking forward to the day when my 27" Cinema ("Thunderbolt") or whatever display is as incredibly sharp as the iPhone 4. For a 3.5" screen to have 326ppi, along with an LED-backlit IPS panel, was something to behold. The iPad 3 will arguably be next to have an extremely high ppi rating at the almost-10" mark, but imagine "Retina"-like numbers on screens over 20" or 25"...

Resolution is just one factor, but I'd really like to see this. Of course GPU technology has a bit of a way to go, and I wish Apple would consider full DisplayPort connectors rather than this Mini DisplayPort crap (better bandwidth, tighter connection). Thunderbolt is fully compatible with DisplayPort protocols anyway.

I wonder if this is realistic or just childish embellishing! Computer specs seem to have plateaued in recent years, but there seems to be lots of progress in low-voltage processors, SSDs, and high-resolution displays...

They are leagues behind AMD and Nvidia in GPGPU performance for accelerated OpenGL environments and OpenCL scalability.

The first GPUs to pump out 4k won't come from Intel.

Intel is to GPUs what Nvidia is to ARM chips: a whole lot of hot air, big promises long before the product ships, but consistently under-delivering compared to the competition.

Most likely Intel has their Ivy Bridge GPUs in the lab and they "rival discrete GPUs" from the current generation, i.e., about as fast as AMD Fusion. I predict that by the time they're released to market they'll still be a generation behind in performance...

Intel makes terrific CPUs, but they should stop trying to do GPUs.

I think 256GB or at most 512GB.
4K screen: you wish.
4GB or at most 8GB RAM.
Isn't Intel talking about 24-hour battery life with the Haswell architecture, which is due in 2013?

Once monitors reach retina resolution, is there any point in going higher? Perhaps someone will invent an improvement to our eyes and we'll be back at square one.

Not at all. People are always impressed by specs. Look at the TVs claiming 120 Hz or even higher refresh rates - irrelevant for viewers.

Quote:

Originally Posted by mdriftmeyer

I'm so sick of Intel's BS.

They are leagues behind AMD and Nvidia in GPGPU performance for accelerated OpenGL environments and OpenCL scalability.

The first GPUs to pump out 4k won't come from Intel.

Intel never claimed that they'd have the first 4K GPUs. Nor did they claim to be ahead of anyone. What they claimed is that the integrated GPU on future chips would have greater performance than the current generation - which is true. You apparently don't understand the different applications for an integrated GPU vs a dedicated one.

"I'm way over my head when it comes to technical issues like this" - Gatorguy, 5/31/13

I can't speak to the hardware issues. But I have tried out HiDPI mode, which is available through Xcode 4.1. It seems Apple is pretty far along with the development of this feature for OS X. It worked well for me, although people like John Siracusa say it still has flaws.

I am guessing HiDPI will be available in the OS X release after Lion.

Quote:

Originally Posted by jragosta

Not at all. People are always impressed by specs. Look at the TVs claiming 120 Hz or even higher refresh rates - irrelevant for viewers.

Intel never claimed that they'd have the first 4K GPUs. Nor did they claim to be ahead of anyone. What they claimed is that the integrated GPU on future chips would have greater performance than the current generation - which is true. You apparently don't understand the different applications for an integrated GPU vs a dedicated one.

Did I touch a nerve? Wanna whip out our degrees and discuss the difference between integrated GPUs by AMD and the junk by Intel?

Sorry, but AMD's APUs graphically now and in the future run circles around anything Intel will ever produce.

That would be awesome, and Apple does have a history of putting well-above average cost displays in their products, but even with a high end GPU for anything the IGP couldn't handle, you'd be running lots of games in interlaced mode, and down-sampling full screen 1080p, etc. Plus, OSX still doesn't have full resolution independence. No, just because the IGP supports it, doesn't mean Apple will follow suit with a display. AMD's Eyefinity can support huge resolutions too, after all.

Quote:

Originally Posted by mdriftmeyer

They are leagues behind AMD and Nvidia in GPGPU performance for accelerated OpenGL environments and OpenCL scalability.

The first GPUs to pump out 4k won't come from Intel.

Aren't there already GPUs that can handle 4K? 4K has been available for computers for several years now; it was just a pricey proposition when the monitors cost $6,000 and required two dual-link DVI ports.

I mean, why would anyone want to tax the entire process with such outrageously over-the-top specs?

I just can't imagine, for example, that Hollywood would be at all pleased with having to deliver 4K versions of their movies. Meanwhile Apple, making a big push to promote downloading video over physical media, would have a hard time delivering these massive files to consumers. Even if they could put in place the equipment to send the files out, service providers are already giving consumers grief over how much bandwidth they're using, at least here in Canada.

So movies are simply not a use for such a display. Books? Even with ordinary resolution on the current iPad the transition from the printed page has kicked into high gear. Besides, going retina-like on the iPad would not require 4K resolutions considering a 10-inch display is likely the biggest a tablet should go. Any bigger and all the advantages to the form factor evaporate.

Quite simply, just because you could have more resolution does not mean you should.

By the way, while I don't doubt that the iPad 3 could have a display with higher resolution, a retina display is at best a long shot. There is simply no need for it to go there and hence, why bother. Fact is, as far as the tablet market is concerned, the average consumer's response would be, "You had me at iPad 1."

I wonder what the "wow factor" of the next iPad is going to be? I don't think the resolution + speed bump from the iPad 2 will be enough for me to pick one up, but I'm sure Apple has something that will simply make buying the next iPad irresistible. I really wonder what it will be?

The original quote was that most movies can't afford it. I assume he meant the large studios that use cameras priced in the $100,000s. RED makes professional equipment, but keep your eye on their next camera, called Scarlet. The large companies are following along with the 4K standard; Canon will announce something at the beginning of November.

4k is the new 1080p!

Quote:

Originally Posted by mstone

Affordable for anyone with an extra $100K. The camera body starts at 25K but you can't shoot a single frame until you buy 50+ pricey accessories.

Given the way the technology is implemented, Apple will double either the 1280x800 or 1440x900 resolution currently used on the 13" products, so we can expect a 13" Retina Display to be either 2560x1600 or 2880x1800. My guess is the former because yields will be better and even 2560x1600 will be stunning.

Once you go 4k you can't go back. Trust me, your eyes will see a difference. Movies in the theatre will be 4k projected quicker than you think. There is a lot going on. I believe RED was the pioneer to make it the standard. No matter though because it will be the standard and it will happen soon.

Did I touch a nerve? Wanna whip out our degrees and discuss the difference between integrated GPUs by AMD and the junk by Intel?

Sorry, but AMD's APUs graphically now and in the future run circles around anything Intel will ever produce.

No one ever said that they didn't.

Intel's iSeries chips are the overwhelming market choice, and a lot of those are going into inexpensive systems that do not have a dedicated GPU. Intel is simply improving the iGPU on their chips. Why are you jumping all over them for improving their product?

I can see it now... tons of iMacs with cracked screens being brought into the Apple Store by people with bloody foreheads, because the resolution was so good they tried to climb into their computers. :-)

If Apple were to introduce a 4K resolution display with the 16:9 ratio currently used in its Thunderbolt Display, iMac and MacBook Air products, the resulting resolution would be 4096 x 2304. A 27-inch display with 4K resolution would sport a pixel density of 174 pixels per inch. Assuming a working distance of 24 inches and 20/20 vision for the calculations, a 4K 27-inch iMac or Thunderbolt display would count as a "Retina Display."

By that logic, an old SD TV could "count as a 'Retina Display'" if you sat far enough away.
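The quip holds up arithmetically. A quick sketch using the same one-arcminute acuity rule (the 27-inch 640x480 set is an illustrative assumption, not a figure from the thread):

```python
import math

def retina_distance_in(ppi):
    """Distance (in inches) beyond which a 20/20 eye (one arcminute
    of acuity) can no longer resolve pixels of the given density."""
    return 1 / (ppi * math.tan(math.radians(1 / 60)))

# A hypothetical 27-inch 640x480 SD set as an illustration:
sd_ppi = math.hypot(640, 480) / 27        # ~30 ppi
print(round(retina_distance_in(sd_ppi)))  # ~116 inches, nearly ten feet
```

Sit roughly ten feet back, which is a perfectly ordinary living-room distance, and even standard definition clears the same bar.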

Do you really expect to see better performance from these chips driving 4K displays? An Intel chip trying to do that will fall flat on its face.

Quote:

Originally Posted by jragosta

Not at all. People are always impressed by specs. Look at the TVs claiming 120 Hz or even higher refresh rates - irrelevant for viewers.

Intel never claimed that they'd have the first 4K GPUs. Nor did they claim to be ahead of anyone. What they claimed is that the integrated GPU on future chips would have greater performance than the current generation - which is true. You apparently don't understand the different applications for an integrated GPU vs a dedicated one.

Even their performance claims are misleading. The 60% figure comes from one benchmark; most of the rest of the testing is around 30%. So where does this leave us on 4K displays?

I'm not dismissing that Intel GPUs are good enough for some; some people can get by with four-cylinder compacts. The only difference here is that computers have a wider array of uses than economical transportation, so coming up short on GPU performance is a wider issue.

I watched a few of the presentations from the last Developers Conference, using my free membership. It appears Apple is now emphasizing a 2X image increase over the older resolution-independence methods that allowed you to set a large variety of ratios. So it would work much like the iPhone: new programs could take advantage of the new tech, and old programs would just pixel-double everything. I think Apple was having trouble getting some of the major players to adapt their programs for RI. The old RI system works pretty well on most of Apple's programs but not on most third-party apps.

Whether this is a point release to Lion once the monitors are ready, or something that will require a full system upgrade, I do not know. I think the portables will go 2X long before the 27-inch screens do.
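The pixel-doubling approach described above can be sketched in a few lines. This is a simplified model, not Apple's actual API; the function name and the 1280x800 panel figure are illustrative assumptions:

```python
def layout_to_device(points, scale):
    """Map layout units ("points") to device pixels.

    Older resolution independence allowed arbitrary scale factors
    (1.25, 1.5, ...), which broke apps that assumed 1 point == 1 pixel.
    The HiDPI approach fixes the scale at an integer (2), so legacy
    content can simply be pixel-doubled while updated apps supply
    double-resolution artwork and draw at full detail.
    """
    return round(points * scale)

# A hypothetical 13" panel doubled from 1280x800:
print(layout_to_device(1280, 2), "x", layout_to_device(800, 2))  # 2560 x 1600
```

The appeal of locking the scale to exactly 2 is that every legacy coordinate lands on a whole pixel boundary, which is why old apps look blurry-but-correct rather than broken.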

Do you really expect to see better performance from these chips driving 4K displays? An Intel chip trying to do that will fall flat on its face.

Yes, I'm sure you know more about the performance of Intel's next generation of chips than they do.

Quote:

Originally Posted by wizard69

Even their performance claims are misleading. The 60% figure comes from one benchmark; most of the rest of the testing is around 30%. So where does this leave us on 4K displays?

I'm not dismissing that Intel GPUs are good enough for some; some people can get by with four-cylinder compacts. The only difference here is that computers have a wider array of uses than economical transportation, so coming up short on GPU performance is a wider issue.

That might be a valid argument - if Intel only sold one type of chip and if Intel chips were never used in systems with dedicated GPUs.

Intel, OTOH, has differentiated the market and offers some chips with integrated graphics for low end systems and high end chips for more demanding needs. It's just really hard to see how improving their low end chip is a negative - just because it hasn't become a high end chip.

Quote:

Originally Posted by jragosta

Not at all. People are always impressed by specs. Look at the TVs claiming 120 Hz or even higher refresh rates - irrelevant for viewers.

True, many specs are used for marketing purposes, and are treated as more = better.

However, specifications are ultimately an engineering issue, where optimum = better.

Televisions with 120 Hz have a very specific purpose, which is beneficial to the viewer. Video is commonly available in formats that support 30, 60, and 24 frames per second. A common 60 Hz display has to play games (like 3:2 pulldown) in order to play 24fps video. While this is a very effective technique, it isn't quite correct. Using 120 Hz (24*5) allows 24fps video to be played as it was intended.

Most people are very accustomed to watching movies with 3:2 pulldown (used since forever ago to transfer movies to broadcast TV, VHS, and DVD), and either don't notice or don't care. It's a feature that mostly videophiles care about. But when watching a 24fps Blu-ray movie, it will make it appear slightly more "film-like" for a more authentic theater experience at home.
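The cadence difference described above is easy to see in a short sketch. This is a simplified model of how a display schedules source frames, not how any particular TV implements it:

```python
def pulldown_schedule(fps_source, hz_display, n_frames):
    """How many refreshes each source frame occupies on a given display.

    24fps on a 60 Hz panel needs 2.5 refreshes per frame, which forces
    the uneven alternating cadence (3:2 pulldown); on 120 Hz every frame
    gets exactly 5 refreshes, so frame timing stays perfectly uniform.
    """
    per_frame = hz_display / fps_source  # refreshes per frame, may be fractional
    schedule, carried = [], 0.0
    for _ in range(n_frames):
        carried += per_frame
        shown = int(carried)             # whole refreshes given to this frame
        schedule.append(shown)
        carried -= shown                 # carry the fractional remainder forward
    return schedule

print(pulldown_schedule(24, 60, 4))   # [2, 3, 2, 3] -- the uneven pulldown cadence
print(pulldown_schedule(24, 120, 4))  # [5, 5, 5, 5] -- uniform, judder-free playback
```

The alternating 2/3 pattern is exactly the slight temporal unevenness (judder) that videophiles notice; the 120 Hz schedule eliminates it by construction.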

There are 240Hz televisions. While I am not aware of any downsides (other than perhaps cost), I am also not aware of any benefits.

It's good to be skeptical about the specs in marketing, but they aren't always bogus, either. Research and evaluate for yourself.

On-Topic: As someone who spends 8 hours staring at PC screens, Retina-class displays can't come soon enough... Sadly, I know I won't see one at work for many years...