Looks like Apple has taken some cooling lessons from SilverStone. They've demonstrated that using large (180mm) intake fans and having them blow air directly across the CPU and GPU heatsinks can get great thermals at reasonable noise levels.

You could build a DIY system of equivalent specs cheaper, but its noise levels and aesthetics probably wouldn't be as good. Keep in mind that this system has an Ivy Bridge-E CPU (and the LGA 2011 processors generally have pretty high TDP) plus *two* powerful AMD FirePro graphics cards, but is cooled by just one large fan. This is only possible because of custom engineering (the motherboard and the 2 graphics cards are in sort of a triangle arrangement with one massive shared heatsink in the middle).

Even more importantly, you can build a DIY system which will be much more powerful than this.

Take one of the Patsburg 2P motherboards (such as the ASUS Z9PE D16), stick in two 12-core Ivy Bridge EP Xeons (successors to the 2687W), getting you full 24-core / 48-thread performance, plus, say, 4 NVIDIA Teslas, and add 256 GB of RAM for good measure (8 x 16 GB ECC DDR3), and you've got a machine with double the performance (possibly even more than double, as I am sure the Mac Pro will use limited-TDP Xeon E5 v2s).

Yes, that machine will cost money - but cost is not really the point in the particular market segment these systems cater to.

The first generation Mac Pro was pretty much a showcase of how to build a good workstation based on the Westmere Xeon platform - if you wanted, you could expand it with the maximum that Intel's then-current 2P platform could offer. This new Mac Pro is crippled from the start.

Thanks for catching this - I meant 16 DIMM slots - Z9PE D16 has 8 per single CPU socket.

256 GB would also be possible with 8 memory slots, but one would have to use 32 GB ECC FBDIMMs, which would cost a small fortune for 1866 MHz right now (way, way above $1K per DIMM). Currently 16 GB FBDIMMs are the sweet spot, and 1866 MHz is still expensive, but I expect the price will fall drastically when Intel launches Ivy Bridge EP, since that will be the first big-volume server platform that needs 1866 MHz.

If you're in a situation where you are concerned about compute density, you get a rack server system. NO ONE seriously concerned with compute density would ever think even for a tiny moment that circles are somehow an "efficient" way of taking up space...

You must be using that quantum math, because the price of the new Mac Pro is unknown. So any "fraction" of that price would only be a probability. Besides, the myth of the Apple tax has been disproven many times.

There is indeed something odd about seeing a circular/cylindrical computer chassis... Kudos for packing that much hardware inside a case that small and being able to manage all that heat, but a prismatic case makes more sense imo.

Does the fact that the new Mac Pro does not have any internal expansion slots mean that Thunderbolt 2 expansion devices will not be limited to storage/monitors? For example, an AMD GPU as a Thunderbolt device.

That said, it's still a decent chunk of bandwidth, and there are lots of things you can do with a GPU that aren't as concerned with bandwidth.

It's worth noting that Thunderbolt isn't limited to storage and monitors today. There are a variety of peripheral adapters available, some multi-connector docks, at least one company is working on a thunderbolt GPU box, etc.

Ok, maybe I am missing something here, but what's the point, precisely, of GPU expansion over Thunderbolt if you can plug 2 FirePro cards in internally? But that is not the point, as Apple has already addressed the graphics bandwidth. Use TB for other expansion needs like storage, capture cards or other devices.

Well, this seems stupid to me; the Mac Pro was the last and only Mac that was somewhat upgradeable / expandable. Now they try to shove it in a smaller case and claim it takes up less space, except if you need any expansion now you've got all sorts of Thunderbolt cables and junk hooked up to it. RIP Mac Pro. Also, the whole unified single heat sink is dumb. I guess the RDF has now conquered the last Apple product.

At long last, Apple isn't robbing people blind by selling the extremely dated Mac Pro. As a system builder, I am not too fond of the new design though. It certainly is very Apple in that once it's built you can't really change anything about it internally.

"With 20Gbps up/down on Thunderbolt 2, you should have enough bandwidth for any PCIe expansion."

This statement is somewhat hyperbolic. That'd be equivalent to PCIe 3.0 x2.5 - a very far cry from what any current high-end dGPU would demand, let alone what the GPUs of the future demand. Even at 40Gbps, it's still a bottleneck on the card.

IMO, this is pretty awful news for pros. It's pretty and small, but will force you to buy a whole new system after a few gens of GPUs.

Let me remind you, until a year ago, all graphics cards were PCIe 2.0, and x16 at that. That means 8GB/s. That's Thunderbolt 1 spec. PCIe 3.0, as I understand it, gives 1GB/s per lane, so double PCIe 2.0. That means 16GB/s, which is within Thunderbolt 2 spec. What gives? And where are you getting the idea that each PCIe 3.0 lane is worth 8GB/s?

This guy knows his stuff. It's amazing to me how uninformed people are on Thunderbolt and the power it brings by basically extending the PCI Express bus outside the chassis of a computer. It's just as fast as if you plug a card into a PCI Express slot inside a computer. Hopefully, the Mac Pro, when it comes out, will eliminate this weird perception that somehow connecting a card via TB is slower than connecting via an express slot.

Yes, all the GPUs were PCIe 2.0 x16, which is PCIe 3.0 x8, almost four times faster than Thunderbolt 2. They showed in an article about the 7970 last year that PCIe 3.0 x2 would bottleneck a 7970: http://goo.gl/r2YW9 . What happens in 3 years when the GPUs require even *more* bandwidth? Tbolt 2.0 will be an even *larger* bottleneck.

Ah, ok. Thunderbolt is Gb, not GB. However, your own link to the article backs me up still.

"In our informal testing ahead of the 7970 launch we didn’t see any differences between PCIe 2 and PCIe 3 worth noting, and our formal testing backs this up. Under gaming there is absolutely no appreciable difference in performance between PCIe 3 x16 (16GB/sec) and PCIe 2 (8GB/sec). Nor was there any difference between PCIe 3 x8 (8GB/sec) and the other aforementioned bandwidth configurations."

In certain cases, there was bottlenecking, but for the most part, it just didn't matter, even on a PCIe 3.0 x2 connection, which is within the limits of Thunderbolt 2.

I'd say this is much ado about nothing, but change does cause people to worry.

You're still confused. Thunderbolt runs off PCIe x4, not x16, so the comparison between PCIe 2 x16 and 3 x8 is invalid. You need to be concerned with the bandwidth available to a PCIe 2 x4 slot. We are talking multiples of difference between what Thunderbolt can supply and what your video card requires for full performance.

You're off by a multiple of 8. 16 lanes of PCIe 2.0 is 8 giga-BYTES per second. Thunderbolt 2 is 20 giga-BITS per second, or 2.5 giga-BYTES per second. See how slow thunderbolt really is now compared to 16 lanes?
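The bits-versus-bytes arithmetic here is easy to sanity-check with a quick script (a rough sketch assuming the commonly quoted effective rates of ~0.5 GB/s per PCIe 2.0 lane and ~1 GB/s per PCIe 3.0 lane; real links lose a little more to protocol overhead):

```python
# Rough bandwidth comparison: PCIe links vs. Thunderbolt.
# Per-lane figures are the commonly quoted effective rates, not exact.

PCIE_GB_PER_LANE = {2: 0.5, 3: 1.0}  # GB/s per lane after encoding overhead

def pcie_gb_per_s(gen, lanes):
    """Approximate usable bandwidth of a PCIe link in GB/s."""
    return PCIE_GB_PER_LANE[gen] * lanes

def tb_gb_per_s(gigabits):
    """Convert a Thunderbolt link rate in Gbit/s to GB/s (8 bits per byte)."""
    return gigabits / 8

print(pcie_gb_per_s(2, 16))  # PCIe 2.0 x16 -> 8.0 GB/s
print(tb_gb_per_s(20))       # Thunderbolt 2 -> 2.5 GB/s
print(pcie_gb_per_s(2, 16) / tb_gb_per_s(20))  # ratio -> 3.2
```

So a PCIe 2.0 x16 slot carries roughly 3.2x what a single Thunderbolt 2 link does, which is the gap being argued about here.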

There are SIX Thunderbolt 2 ports. Each has 2.5 x 2 = 5 GB per second bandwidth. Thus FIVE can be combined and given to an expansion box for 25 GB per second bandwidth - more than PCIe's 16 GB/s.

"Besides, you're supposed to upgrade both the GPU and CPU together. Pros upgrade their machines every 5 years anyways. They never upgrade the GPU separately."

...What? Some pros never upgrade the internals; some do. Say one needs to do lots of GPGPU work, and two years down the line cards are so much more powerful that it would be worth upgrading the video card, but not throwing out the whole computer.

The only pros that ever upgrade are the ones that didn't bother to get the top-end graphics card in their system in the first place.

The current Mac Pro has had pretty much the same graphics card lineup for the last 3 years. Anyone who upgrades would be upgrading to something that was available 3 years ago.

Apple went ahead and made sure the default GPU was high-end enough so that you would never even THINK about upgrading, because you weren't given the option of buying the low-end card in the first place.

Did you miss the part about "Up to" in the graphics options? Apple always offers low-end GPUs, so your assessment of the GPU being "high-end" is false right from the start.

And to even say that the highest GPU they offer is "high-end enough" after saying that those who upgrade later never bothered to get the best to begin with is rather silly, because you're saying that what they offer is "enough" not that it's the best. They will not offer the best, and that's a fact. Especially seeing as, in the compute world, applications are so sensitive to what hardware they are running on that there is no "best".

And of course, the idea that a GPU that is 3 years old is somehow not worth upgrading is also quite laughable. GPUs can change a lot in that time frame, and many people find that the massive gains they would get from the latest generation are well worth the investment.

4 GPUs, anyone? Pretty sure that's a better upgrade. Put them in a rackmount server and you have REAL compute density.

And no, you don't upgrade the CPUs and GPUs at the same time. There are MANY compute-heavy tasks that rely on GPUs way more than CPUs. It's usually very cost inefficient to upgrade CPUs on a GPU-bound task, and therefore many people just upgrade GPUs (this is true for both professional compute and gaming). As for upgrade frequency, 5 years is really the max; most upgrade way more frequently than that.

The bandwidth constraint is less of an issue for GPU use. PCIe 3.0 x2 isn't a problem for most games; only 2 of the 8 games AT tested showed significant penalties at x2, and only one was hurt badly enough that I'd consider it a showstopper. Where it's a potential killer is GPGPU workloads that need significant back-and-forth communication with the CPU. Some (e.g. Einstein@Home) will take a significant slowdown even from only having an x8 slot on higher end GPUs.

I'm wondering though if multiple TB ports can be aggregated so that an enclosure with 2 TB connections could offer an effective x4 link.

What planet are you on? PCIe3 just came out last year and has a MAX bandwidth of 16Gbps (1Gbps x16 lanes). On top of that, only the highest of high end single GPU cards on the market use up more than 8 lanes (8Gbps) of throughput, and even double GPU cards do not come close to maxing out the full 16Gbps, so there is plenty of room to grow. This means that you can easily add a dual GPU (single or double cards) setup via TB2 without issue, while still daisy chaining a massive HDD RAID enclosure... and that is only on one of the three provided TB2 ports! I am not exactly sure what one does with the other ports. I suppose you could just hook up 6 high end GPUs to it? I mean, if you have the money to blow on apple products then why not?

I agree with you in principle that no computer should be called a "pro" without expansion capabilities, but honestly, how many Mac Pros ever saw an upgrade/expansion outside of RAM/hard drive?

For better or worse, that is the trend in modern computers. I know at my job we almost never do upgrades to machines. They spend 3-4 years in the labs that need the most power, get wiped and repurposed elsewhere in the building, rinse and repeat.

It's a similar debate to the removable battery for phones. Yes, it is a nice feature, but how many people have ever bought a second battery for their phone? I have used cell phones since ~2000 and I don't think I have ever bought a secondary/replacement battery.

I don't agree, but for the sake of the argument, even if it were true that very few people really used the Mac Pro to its full expansion potential, that is no argument for removing this potential.

There already is a line of Apple computers with limited upgradeability - and they're called imacs. Users who are happy with whatever comes out of the box, should be on imacs. Users who need more flexibility, should be on mac pro.

With this new design, users who need more flexibility out of their computer can never get it anymore. There's nothing good about that. It's a needless limitation and a downgrade of what used to be a top of the class line of machines.

If they continue in this direction I fear Apple may soon suffer a downfall like it did in the 90s, when advanced users moved to PCs by other vendors, because the self-absorbed and incompatible-with-anything-else toy computers made by Apple could no longer satisfy their needs.

Really. THAT's your analysis? Perhaps you'd like to further enlighten us as to how no one would call a computer "pro" if it doesn't have a punch card reader attached to it, or can't support at least four floppy drives?

The world moves on. Criticize the RAM expansion if you have a valid case. Criticize the TB expansion. But don't waste our time with some pathetic cri de coeur about how computers no longer look like they did when you were a youngster and everything was skittles and roses.

2 words: cable hell. I know you youngin's don't have an appreciation for it right now, but having your disk drive, your modem, and your tape drive all external was a lot less fun than it sounds. The specific peripherals have changed over time of course, but the concept is the same. Core components that are rarely (if ever) replaced should be inside, where they are protected and move with the computer.

But as a Pro user, aren't you always upgrading your internal components to the latest and greatest hardware, whether that be a new graphics card or a faster SSD or heck, even a faster optical drive bay (if there is one)? This whole "cable hell" thing is crap in my opinion. I'm old enough to remember desktops / workstations that had a lot of the capabilities of the computer outside, i.e. the modem, optical drive, and that was NEVER a problem for me. Yes, it might not look as great or be as clean looking as, say, an All-In-One desktop like an iMac, but let's be real here: pro users have never really cared about how things look on a desktop. All they care about is performance, and if the new Mac Pro and TB2 are up to snuff, then who cares about a few extra cables.

I don't know a "pro" who doesn't care about peripheral cable hell. Everyone I know much prefers everything to be contained, because peripherals are things that get changed so frequently that having to go through many cables to access the ports on your computer is extremely annoying. From what I've seen, people want as much as possible to be INSIDE the computer, which is why we get things like memory card readers and PCIe wireless cards that you can buy in form factors that integrate into the chassis. Having to connect storage through outside cables is particularly annoying. Pros have enough clutter as it is; it's ridiculous to start taking parts out of the computer and connecting them through ports on the outside...

I have to agree with Joel here. Also, being a Pro user, I like the ability to easily rig my system based on the amount of graphics power my system needs, and if TB2 can make it even easier, due to its modular design, to swap out an older graphics card for a better, more robust one, then what is the problem here? To me it gives the convenience of Plug-N-Play that you get with USB, with the power and throughput (which is now at 20Gbps both ways on TB2) that you get with Thunderbolt. To me... it's a winner.

How much would that limit more "rendering" based applications, though? It still might be more than enough bandwidth for GPUs used in that configuration (in fact, people have done that with Mac Minis - used a Thunderbolt to PCIe expansion chassis for After Effects GPU-accelerated rendering).

The oddity is that only three 4K displays are supported even though there are six Thunderbolt ports. A single W9000 can drive six 4K displays via DP 1.2. With Falcon Ridge supporting DP 1.2 pass-through, all six of the TB ports should be able to provide 4K resolutions in addition to the HDMI 1.4 port.

I think the DUAL GPUs (standard equipment in the Mac Pro tube) obviate the need to use external graphics. Six Thunderbolt ports are a bit overkill. They must expect that you will use two or more of them as display ports, 4K video capture, a RAID box... etc.

Sure are a good number of uninformed people commenting here. People will believe anything from Apple is awesome, I suppose. Thunderbolt is not a good alternative for pro level graphics, especially when multiple GPUs are involved. It can barely handle one, and not even the highest end cards at full speed. Make that two or more and you run into extreme limitations.

Another comment stated that nothing is better than 2 graphics cards (already included)... well actually cheaper PCs running any other OS can take 4 cards, though they may not look like a cylinder. I am aware of diminishing returns on more than 2 cards, but it's still true, especially where compute is concerned rather than graphics.

And seriously GB vs Gb... it's 2013. People should understand this by now.

And as to pro level people only upgrading every 5 years... um... no. While CPUs really haven't gotten enormously faster in that time, GPUs sure have. Anyone who's a "pro" running a 5 year old graphics card is going to be hurting to keep up unless they're only doing CPU intensive things.

Has it actually been determined that the internal GPUs are not upgradable? The chassis seems easy to open although it's unclear how easy the boards are to get out. Even if Apple doesn't bother providing first-party GPU upgrade options, nVidia and ATI have been directly selling their own GPUs to Mac Pro users for years and they would presumably be interested in continuing to do so in the future if the GPUs in the new design are accessible. It's a bit early to tell if future GPU updates are forced onto Thunderbolt. Maybe Anand can clarify this while he's at WWDC.

The G4 cube failed because it was so much more expensive than the regular Power Mac G4, for no extra performance or features, just the design. We'll see what the situation with the new Pro is. But in this case, they won't even update the old chassis Pro along with it, so there's nothing to cannibalize its sales.

It's like the next-gen version of the Cube to me. It's like Ive said, "I want to do something Jobs couldn't do."

So he set out to take his idea of simple basic shape and use modern technology to do what he couldn't do.

To which, I say, "Eh." I don't call this innovation because I don't see the value in the shape being different. It's just different. That doesn't make it better and there are a lot of reasons it's worse, so...

Yes I do. It is interesting, and I hope it is effective, but it is not scalable. As a workstation, which is what the previous Mac Pros were, it fails. By limiting it to one processor and only 4 memory slots, they have ceded the high-end workstation market to Dell and HP.

Man. The thermal profile just scares me. That's at least 100W of CPU and likely 400W of GPU heat that has to be removed. It looks like fresh air is sucked in the top and exhausted out the bottom - I suppose the whole center of the chassis is heatsink. At any rate, I like the design but it will be either loud or toasty hot.

Apple's site has some good pictures of the "thermal core". Basically the whole center of the tube is a triangular heat sink with a GPU or CPU on each face. The fan on top draws air from the bottom, through the heat sink and out the top. I'm not much of an Apple fan but I have to admit that the cooling design is really elegant. We'll see how it works in practice though.

Probably the closest thing in the DIY PC market is the SilverStone FT02, with its stack-effect cooling design enabled by rotating the motherboard 90 degrees. Tests show it does extremely well on thermals, at least if you select appropriate heatsinks and orient them properly (so the intake air goes through the fins rather than being blocked by them). I suspect you could get away with just using the intake fans on a reasonably good CPU and GPU if you had coolers with wide-spaced fins like the HR-02 Macho and Accelero S1 Plus. In fact, I'm going to try something like this for my next build.

In the top and out the bottom would be horrible. The G4 Cube was convection cooled, that is to say it used the natural movement of hot air upwards to suck cool air in the bottom and create an effective fanless cooler.

This is another take on that concept. Cool air is sucked in the bottom, through the fins in the thermal core. They have a puller turbine which blows the hot air out the top. The natural convection combined with turbine assist tends to make this sort of design very efficient and quiet.

With the machine idle you can probably stop the fan entirely. The fan speed required to expel 500-600W of heat wouldn't be higher than 1500rpm or so (educated guess), and probably even lower. It is a neat design, but as we can all see, a tremendous number of custom parts are required to make it work.
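That educated guess can be roughed out with basic physics: the heat an air stream carries away is Q = m_dot x cp x dT. Everything below (a 600 W load, a 15 K allowed air temperature rise) is an illustrative assumption, not a measured figure for this machine:

```python
# Back-of-the-envelope airflow estimate: Q = m_dot * cp * dT,
# where m_dot = rho * V_dot (mass flow from volumetric flow).

RHO_AIR = 1.2    # kg/m^3, air density at roughly room conditions
CP_AIR = 1005.0  # J/(kg*K), specific heat capacity of air

def airflow_m3_per_s(watts, delta_t_k):
    """Volumetric airflow (m^3/s) needed to remove `watts` of heat
    while letting the air warm up by `delta_t_k` kelvin."""
    return watts / (RHO_AIR * CP_AIR * delta_t_k)

def to_cfm(m3_per_s):
    """Convert m^3/s to cubic feet per minute (1 m^3/s ~ 2118.88 CFM)."""
    return m3_per_s * 2118.88

flow = airflow_m3_per_s(600, 15)   # 600 W load, 15 K air temperature rise
print(round(to_cfm(flow)))         # ~70 CFM
```

Around 70 CFM is well within what a single large, slow-spinning fan can move, which is consistent with the low-rpm guess.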

It is possible, but it means REPLACING a complete board of this machine. You can see that there are really two large boards and a smaller power-supply board in the triangular formation. Smaller bits like RAM run off the main board. The CPU seems "bolted" to the main board, so it will be very expensive - i.e. buying 1/3 of a new machine rather than a new CPU. The new IVB Xeons they are using have 12 cores with hyperthreading, so Apple reckons this is enough for this "workstation" - more like a desktop really, considering the design is meant to be placed on a table rather than on the floor.

If they use one of the lower TDP ones, then that's even worse; they will get left well behind by companies fitting their workstations with dual E5 v2 Xeons for 24 cores running well over 3 GHz. Meanwhile, R2-D2 here will hobble along with 12 cores, probably at 2.5-3 GHz.

What they've basically done is built a very fancy Mac Mini. That's great, but it's not a Mac Pro. They should have also made a larger version with much greater hardware possibilities: multiple CPUs, multiple hard drives, optical drives, etc.

It looks very cool. Apple always does have great visual designs in their products, but I just don't like proprietary designs. As someone who has built every computer I've owned since my first 486 DX system, I'll pass on owning one of these. It will be interesting to see how many people buy it without really needing that kind of computing power.

Mine was a 286 at 8 MHz! Actually, I started with the original 8088 clone machines that were large desktop cases with twin floppy drives. And those form factors have not changed much in over 2.5 decades! Many will buy the machine just because they "think" they are up for an upgrade and it seems the only option for them. There are few people who would properly research the options available to them.

What offends me the most, as a years-long Mac Pro user, is the fact they've obviously ditched the philosophy of the Mac Pro line.

Part of the beauty of owning a Mac Pro - apart from the fact that it's a mighty, beautifully designed machine which runs a fancy system like OS X - has always been that, unlike the toys we call imacs, the mac pro is highly configurable and extendable. You can modify its hardware to your heart's liking, and to me that has always been a sign of respect towards the user - there's nothing I hate more than being treated as a moron incapable of doing anything other than pushing the ON button.

Reading the description of this new design - I can just say one thing: this is not a Mac Pro. I don't know what it is, but a Mac Pro it isn't. It's more like an imac with a testosterone injection, except uglier and taking up more space.

All indications prior to this were that the Mac Pro line was already dead anyway. This may be a disrespectful consolation to a lot of people, but it's better than nothing.

Modern Apple's philosophy has always been elegance and function over user expandability. If anything, the Mac Pro was a striking exception to that rule. I don't see how anyone could be surprised that Apple would go this route.

The design is, as typical, striking. But now it shares the same flaw as every other Apple product - it's horrible or impossible to service/upgrade.

Apple still needs normal desktop systems at normal prices, and they still need a high end M17x-like notebook. Yes, they sell the Macbook Pro at high end prices (it's more expensive than what I paid for my GTX 680 equipped M17x), but it's mid range hardware.

An M17x-like notebook will always be a niche product; there's absolutely zero point in making a gaming-oriented notebook with Mac OS onboard. I don't care about the regular MBP, but the MBPR 15" is superior in almost every aspect but raw GPU power (with CPU performance being more or less equal). It has a vastly superior screen, a better trackpad, is also much thinner and lighter, and has better battery life. All this with a state-of-the-art quad core, 16GB RAM and a fast mid-range 650M. In fact, the GPU is the only real mid-range thing, and certainly not because of a price issue.

Welcome to the nightmare machine if you need another PCIe card for some apps like ProTools, Avid... What's the solution? Buy a Thunderbolt PCIe enclosure and spend $$$ for the design whim. That's the Apple fan way of life...