Why USB3 when you've got Thunderbolt?

How exactly do you intend to drive 2 external displays from a single VGA socket?

Is this a common occurrence with Ultrabooks?

Given that Ivy Bridge-based systems will be the first laptops capable of driving 3 displays: not very often, currently. I'd quite like to be able to drive both my external displays and my laptop's panel, though. Since Ultrabooks are targeted at the high-end, not-much-of-a-compromise laptop warrior, it doesn't seem unreasonable to expect the proportion of Ultrabook users wanting to do this to be significantly higher than the general market.

Also, given that you were replying to

RAOF wrote:

Echohead2 wrote:

RAOF wrote:

DCop wrote:

Paul Hill wrote:

Any idea why they did both? Mini-HDMI has a VGA pinout and converters cost <£40.

Potentially to drive multiple external monitors? i.e. HDMI on one, VGA on the other, and then the laptop display is the 3rd.

So add 2 Mini-HDMI outputs (or, better, a mini HDMI and mini-DP), so I can drive two useful external monitors?

Well, VGA would work fine for most monitors, so it isn't that bad, but it's kind of lame. Instead of 2 mini-jacks, just put in one regular-sized port--a regular HDMI, VGA, DVI, DP, good lord--you name it. But then no adapter is needed. Probably the problem is, if you have to pick one--which one? VGA will allow you the widest selection of monitors, DP probably the least (though HDMI might be close, and is becoming more popular).

How exactly do you intend to drive 2 external displays from a single VGA socket?

You should probably have some way of providing the suggested feature.

How will the soon to be released Ivy Bridge systems have optical when the shift won't happen until the end of 2012?

Quote:

This article is written as a hope, more than anything else. It has nothing to back up the fanciful claims.

I'm getting vertigo from the goalposts jumping around like that. You originally claimed that the article said the "soon to be released" systems would have optical TB. The article said no such thing. You were clearly wrong. Man up.

Was the piece based on hope? Maybe. Did it have backup? Hard to tell not knowing their sources - that's journalism. Either way that was not your initial claim.

I am sorry for the confusion. I didn't mean to imply that the article said that. My point was that the Ivy Bridge chips/MB/etc. were coming out in a couple of weeks--but the optical TB wasn't coming out until late 2012. The article said "Lenovo, Asustek and a number of motherboard makers are set to launch products based on Intel's upcoming Ivy Bridge platform, which will come with Thunderbolt ports that utilize optical cables, the sources indicated."

The point is that Ivy Bridge is coming out RSN--hence their use of "upcoming Ivy Bridge"--but that really just doesn't jibe so well with "late 2012 or 2013". It's just worded poorly, imo.

And really--read it again. "they are set to launch products based on Intel's upcoming Ivy Bridge platform, which will come with Thunderbolt ports that utilize optical cables, the sources indicated" So they are set to launch. They are "ready to go" (they are "set to launch")--on the upcoming Ivy Bridge. We know that part is true--Ivy Bridge items are almost here (or are already shipping). All of that implies a readiness that the rest of the article doesn't support. The rest says late 2012. You really think they are "set to launch" now, for a late 2012 product? I don't think so.

So tell me, how are those upcoming Ivy Bridge parts set to launch with optical when it won't actually make it to market until late 2012?

The simple fact is that it is a horribly written piece that is wildly optimistic and grossly overstates the facts. It is highly implausible that they are really "set to launch". I have no idea why you are so worked up over this.

Thanks for a clearly worded retraction. It's refreshing.

Not worked up at all. Agree it's poorly written. But given the amount of time that it takes to prepare products for launch I don't think it's unreasonable to assume that systems integrators are preparing their fall launches now.

Either way, the take home is that the number of systems that ship with TB is set to expand in the fall and optical seems to be the nectar that is attracting the bees. Frankly I'm surprised that they'll be releasing optical this soon. When they went with copper for version 1.0 I took that to mean optical was slightly further out. Didn't think they'd follow it up this soon (18 months). Of course that's one of the advantages of putting the control circuitry in the cabling - the transition from copper to optical is seamless from the POV of the system. This makes me more bullish on TB than I was before.

I didn't retract anything, because I just said the same thing again! But no reason to say it's refreshing; I do it all the time when I am wrong.

The point is and was that their article made little to no sense.

Quote:

Either way, the take home is that the number of systems that ship with TB is set to expand in the fall and optical seems to be the nectar that is attracting the bees. Frankly I'm surprised that they'll be releasing optical this soon.

I highly doubt that it will be anywhere near as widespread as that blurb tried to imply. The reason you are surprised is that you should be. TB on copper hasn't taken off much at all, so why would optical take off any faster?

It is a hype blurb. IMO it is posted to either try and build mindshare or test the waters and see what people think.

Unfortunately in this case, optical cables are the same speed, and will be more expensive. They make much longer cables possible, but cable length is not a bottleneck for consumer TB adoption. This will be useful for stuff like conference rooms where a long cable run is needed.

But even then, how big a market is that? And if the trade-off is more expensive cables, that is probably a bad trade-off. They are already expensive (from what we have seen). It is just a move to make the product even more niche.

Given that ivybridge-based systems will be the first laptops capable of driving 3 displays: not very often, currently. I'd quite like to be able to drive both my external displays and my laptop's panel, though. Since Ultrabooks are targeted at the high-end, not-much-of-a-compromise laptop warrior, it doesn't seem unreasonable to expect the proportion of ultrabook users wanting to do this to be significantly higher than the general market.

Laptops with discrete graphics are capable of driving three displays, as are the upcoming AMD Trinity laptops with Eyefinity.

Yes, current 15 & 17" MacBook Pro (like mine) can do this via Thunderbolt. Otherwise two ports and sufficient graphics muscle will do it on various makes. Add a third-party box and daisy-chain your broadcast monitor too. The current 13" MBP and both Airs with integrated graphics can only do one external, obviously.

None of the MacBook cables are photoshopped. The Magsafe power comes with the display, and the monitors daisy-chain. The monitor power cables are hidden of course. If I were Hat I'd be more worried about the anti-gravity feature — after all, they're floating on thin air. Aren't they?

The TB cable can also pass Ethernet, which is built into the monitor - though in that screenshot, the laptop's Wi-Fi is on. It's fairly trivial to dress the cabling of power and Ethernet going into the monitors so that all you see is the single cable coming out for the laptop.

Really--the laptop is powered through the screen? Or the screen is powered by the laptop? How does that work?

And exactly why do you say that the wired network is in the monitor? Or that it has to be? Or that you have to use wired network?

The laptop is powered through the screen. You can see the power breakout from the cable between the display and the laptop in that image.

TB displays are 'docks', including ethernet, firewire, USB, and TB (for daisy chaining). You don't have to use it, but if you want wired network access, or any other non-mobile connection, the best way is to leave it plugged into the TB display.

I'm thinking that particular image would be better if they had not hidden the power cord.

Sure, it -looks- better aesthetically.

But for demonstrating the "I'm a dock!" aspect, they'd be better off explicitly showing the power and ethernet going into the monitor. With subtle labels.

But really, if this is your use case, is a regular dock a bad thing? It's fewer plugs than the Mac setup (1 instead of 2) - albeit a "bigger" plug...

While a dock is better than manually plugging in a bunch of cables, personally I think I'd prefer the TB setup pictured over docks I've used in the past, all else being equal.

Aesthetically it's definitely better -- functionally, though, you sure are integrating quite a bit into that one monitor (it's now your monitor + dock + power supply) -- which means if you want a larger/higher-dpi monitor there's a bigger overhead (as you have to re-purchase the dock, etc.) -- and even worse, if you need to upgrade part of the dock (say, to get USB 3.0 functionality instead of 2.0) you need to re-purchase the monitor (or buy a separate adapter - devaluing the dock a bit).

But I will admit, if I had to make the decision, the aesthetics still might win me over.

So, I just had a brainfart.

Where is the standard for driving higher-DPI monitors of this size? My understanding was that a single DisplayPort connection couldn't push a heck of a lot more pixels than are currently in the TB Display. So high DPI waits on a new connector and deals with the chicken-and-egg situation as both the computers (MPro, MBPro, Air, mini) and the monitor get this new connection. Or... they use the PCI side of TB and stick a GPU in the monitor.

Yes, it would require some software updates, but it would be a huge win if Apple released an updated TB monitor with a high-DPI display that was usable from day 1 by all of their existing TB-capable computers. Getting away from the chicken-and-egg situation that's going to plague the PC upgrade to high DPI would give Apple a very long window of superiority.
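The bandwidth ceiling alluded to above can be sanity-checked with back-of-the-envelope arithmetic. A sketch in Python, assuming round figures of roughly 8.64 Gbit/s of video payload for a 4-lane DP 1.1 (HBR) link and 17.28 Gbit/s for DP 1.2 (HBR2), plus a ~20% blanking overhead; these are illustrative assumptions, not spec quotes:

```python
# Rough check of whether a single DisplayPort link can carry a given mode.
# Assumed link budgets (after 8b/10b coding, 4 lanes):
#   DP 1.1 (HBR):  ~8.64 Gbit/s    DP 1.2 (HBR2): ~17.28 Gbit/s

def mode_bandwidth_gbps(width, height, refresh_hz=60,
                        bits_per_pixel=24, blanking_overhead=1.2):
    """Approximate uncompressed video bandwidth, with ~20% blanking overhead."""
    return width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

# 2560x1440 (the Thunderbolt Display) fits in the assumed DP 1.1 budget...
assert mode_bandwidth_gbps(2560, 1440) < 8.64
# ...but a pixel-doubled 5120x2880 panel would blow well past even DP 1.2.
assert mode_bandwidth_gbps(5120, 2880) > 17.28
```

Under those assumptions, a quadrupled 27" panel really would need either a new link or a GPU on the monitor side, as the post suggests.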

I will assume this is one of the 27in 2560x1440 monitors. I don't think a 5120x2880 monitor is very likely in the next 3 years (I'd love to be wrong). I have a PC with dual 1GB GTX 460s hooked up to my Dell 27in, and going from 1080p to max res is a huge performance hit (obviously). So if all of a sudden we go to 4x more pixels... I think to do anything worthwhile on it you'd need one of the new-generation Nvidia cards as a baseline.

Where HiDPI is probably coming is 15in laptop screens. 1440x900 * 2 = 2880x1800, which is probably attainable at this point, but holy shit, playing a game or doing fullscreen 3D on these screens is probably going to be a drag.

So, curiosity struck, and I turned on HiDPI mode in 10.7.3 on the 27in Dell monitor. Surprisingly usable; the "resolution" is 720p (1280x720), and of course there's no more pixel density, but a lot of stuff looks really good.

One odd thing: Safari first rendered text in a blurry mess, while Chrome was a bit nicer about it. After the screenshot, Safari went to rendering it correctly (second screenshot).

I agree that big displays are the last things that will go hi-res (they already exist in expensive niches, of course) and that laptops are the obvious next step. With existing 1680x and 1920x screens performing reasonably, a step up to 2880 seems feasible. For Apple, the usual careful attention to energy consumption and forthcoming CPUs will help. Efficiency hasn't been the headline feature in GPUs, but I'd guess it's coming.

Ohhh yeahhhhh.... I seem to remember a bit of drool on that thing as well.

Funny, that: 3840x2400/2 = 1920x1200 (or would be 1080p in 16:9)....

Honestly, the thing keeping these from selling was probably that it makes whatever you're looking at, for normal desktop use, way too small to be useful (oh, and yeah, I /guess/ needing 2 dual-link DVI ports...), but now... (And this would be the prime resolution to shoot for in a desktop monitor, what with the prevalence of 1080p.)
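Pixel doubling really does halve each dimension, which is why that old 3840x2400 panel maps so neatly onto a 1920x1200 desktop. A trivial sketch:

```python
def doubled_desktop(native_w, native_h, scale=2):
    """Logical desktop size when each point is drawn as scale x scale physical pixels."""
    return native_w // scale, native_h // scale

assert doubled_desktop(3840, 2400) == (1920, 1200)  # the arithmetic above
assert doubled_desktop(2560, 1440) == (1280, 720)   # the HiDPI experiment earlier in the thread
```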

A screen at that resolution is a fucking pain in the ass to drive. You can kiss goodbye any hope of 3D gaming (even 2560x1440 is hard to drive). It's worse in things like the iMac; the GPUs there are already underpowered. Upping the resolution further will only exacerbate that.

At reasonable viewing distances a 27" LCD would need about 125 dpi, which is a much more reasonable 2900x1700 or so for the 27" iMac (which I think is currently about 110 dpi). Problem is, that means you can't do nice simple pixel doubling to make existing 96 dpi apps work at the higher resolution. Pixel doubling is attractive because it essentially solves app compatibility.
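The dpi figures above follow directly from the panel's resolution and diagonal. A quick sketch, using the 27" sizes discussed in the thread:

```python
import math

def dpi(width_px, height_px, diagonal_in):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# The current 27" 2560x1440 panel works out to ~109 dpi...
assert round(dpi(2560, 1440, 27)) == 109
# ...while the suggested 2900x1700 lands right around the 125 dpi target.
assert 124 <= dpi(2900, 1700, 27) <= 125
```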

As far as the user is concerned, nothing really changed, as the price stayed about the same.

An expensive-ish Apple monitor gained new functionality without the price point changing. Really, it became a better value.

So how does the GPU in the iPad do at driving the 2048x1536 display? I mean, that 2560x1440 which is "hard to drive" is only 17% more pixels than the iPad's.
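That 17% figure checks out:

```python
def extra_pixels_pct(a, b):
    """Percentage more pixels in mode a than in mode b."""
    aw, ah = a
    bw, bh = b
    return (aw * ah / (bw * bh) - 1) * 100

# 2560x1440 vs the iPad's 2048x1536:
assert round(extra_pixels_pct((2560, 1440), (2048, 1536))) == 17
```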