And if the iMac reaches the point of acceptable RAM, GPU and CPU (as I feel the current top end has, i.e. as you say, you're no longer in physical pain due to the limitations of said machine), surely you plug into RAIDs/GPU stacks/modular render stacks? And/or the 're-factored' Pro could.

IMO, that's the ideal setup. Get a specialised render box like the Boxx Pro rendering machine but with GPUs and get the front-end to be as nicely designed and lean as possible.

Quote:

Originally Posted by Lemon Bon Bon.

Great link by the way, Marvin. Enjoyed reading about NVidia and Weta working together to take Avatar and move FX to the next level. Yes. A fascinating read. Made me think of the implications, if any, for Apple's software and hardware.

One of the phrases that stood out for me was "The complexity of Avatar motivated us to think about rendering differently". They didn't take a linear approach of adding more CPUs like everyone else, the limitations forced them to think outside the box - something Apple has promoted for as long as it has existed.

Apple could give a serious boost to the high-end rendering scene if they just connect a few dots. NeXT used to ship RenderMan in its OS to do 3D graphics. It is pretty much the benchmark for 3D post-production. What better way to supplement it than to offer the same capabilities Weta describes?

They have Thunderbolt to connect external boxes and they have OpenCL to run on them, along with their Core software architectures. All they have to do is get Pixar to develop a piece of core raytracing tech, e.g. Core Illumination, and allow scanline engines to access the pre-computed data.

Motion, Final Cut, Maya, Lightwave, you name it, they just get the shadow, reflection, refraction, whatever data from the pre-compute step and do the final render on the CPU, which takes a fraction of the time.
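The split being described can be sketched as a toy two-stage pipeline: an expensive pre-compute pass produces shared lighting data once, and the cheap final pass just reads it back per pixel. This is purely illustrative; "Core Illumination" as named above is hypothetical and not a real Apple API, and the math below is a stand-in, not actual raytracing.

```python
# Toy model of the split described above: an expensive pre-compute pass
# produces shared lighting data once; the final pass just reads it back.
# "Core Illumination" is hypothetical; none of this is a real Apple API.

def precompute_lighting(points):
    """Expensive global pass (the part the posts suggest offloading to GPUs)."""
    return {p: 0.5 + 0.5 * (p % 2) for p in points}   # stand-in for raytracing

def final_render(points, cache):
    """Cheap scanline-style pass: per-pixel lookups into pre-computed data."""
    return [cache[p] for p in points]

points = list(range(8))
cache = precompute_lighting(points)   # run once, possibly on an external box
image = final_render(points, cache)   # fast CPU pass for the final render
print(len(image))                     # 8
```

The design point is simply that the slow pass runs once and its results are shared by every consumer of the data, which is why the final render step takes a fraction of the time.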

Quote:

Originally Posted by Lemon Bon Bon.

The numbers you point to show quite an immense time saving going GPU over traditional CPU render farms, by a factor of 6-7 to 1! That's pretty immense. Where will computers/Apple computers be in ten years' time?

GPU performance per watt is going up at a staggering rate. NVidia has a roadmap suggesting this year's Kepler GPUs will do 5 DP GFLOPs per watt and Maxwell (2013-2014) will be 15 DP GFLOPs per watt. This means that a 10W mobile Maxwell GPU in 2013-2014 would outperform the current high-end 6-core Xeon X5870. This year's mobile Kepler GPUs will be half of a Xeon chip.

Current Xeons can do about 100 GFLOPs in 100 watts, i.e. 1 DP GFLOP per watt. NVidia has a white paper noting that the Tesla C1060 is 8x faster than the Xeon W5590 (p15) with around the same power draw.
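As a back-of-the-envelope check, the figures quoted above work out as follows (the roadmap numbers are as claimed in these posts, not independently verified):

```python
# Rough perf-per-watt arithmetic using only the figures quoted above.
# The roadmap targets are as claimed in the posts, not a verified source.

xeon_gflops = 100.0                    # DP GFLOPs quoted for a current Xeon
xeon_watts = 100.0
xeon_per_watt = xeon_gflops / xeon_watts          # 1.0 DP GFLOP per watt

kepler_per_watt = 5.0                  # claimed 2012 Kepler target, DP GFLOPs/W
maxwell_per_watt = 15.0                # claimed Maxwell target (2013-2014)

# A hypothetical 10 W mobile Maxwell part at the claimed efficiency:
mobile_maxwell_gflops = 10 * maxwell_per_watt     # 150 DP GFLOPs

print(xeon_per_watt)                              # 1.0
print(mobile_maxwell_gflops)                      # 150.0
print(mobile_maxwell_gflops > xeon_gflops)        # True
```

So at the claimed efficiencies, a 10W mobile part would indeed beat a ~100W Xeon on raw double-precision throughput, which is the whole point being made.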

Rather, whatever Apple does in the future has to appeal to the wider desktop audience. The Mac Pro already has a significant sales disadvantage due to its limited appeal. Specializing a box even more will not help Apple's desktop sales.

Quote:

Originally Posted by Marvin

IMO, that's the ideal setup. Get a specialised render box like the Boxx Pro rendering machine but with GPUs and get the front-end to be as nicely designed and lean as possible.

You are assuming that everybody with a desire for a desktop has a workload that fits into that sort of arrangement. I suspect that is a small number of the pros using Apple hardware right now.

Quote:

One of the phrases that stood out for me was "The complexity of Avatar motivated us to think about rendering differently". They didn't take a linear approach of adding more CPUs like everyone else, the limitations forced them to think outside the box - something Apple has promoted for as long as it has existed.

That is mostly marketing hype and self-promotion. Many industries have been looking at and incorporating GPU processing into their workloads.

Quote:

Apple could give a serious boost to the high-end rendering scene if they just connect a few dots together. NexT used to ship Renderman in their OS to do 3D graphics. It is pretty much the benchmark for 3D post-production. What better way to supplement it than offer the same capabilities Weta describes.

The volume isn't there. This is why a generalized approach of building desktops that support OpenCL is a priority at Apple. By doing so they can ship hardware to a wider audience, which allows for reasonable pricing. If Apple were to start building specialized GPU processing hardware, the cost of that hardware would make the Mac Pros look cheap.

Thankfully Apple's approach is paying off, and they had the influence to get Intel to build OpenCL compatibility into their new hardware. So Apple will soon have coverage of OpenCL from the extreme low end to the high end, notably at reasonable prices.

Quote:

They have Thunderbolt to connect external boxes and they have OpenCL to run on them along with their Core software architectures. All they have to do is get Pixar to develop a piece of core raytracing tech e.g Core Illumination and allow scanline engines to access the pre-computed data.

Thunderbolt isn't exactly the best interface for this sort of thing, at least not at large scale. What TB might do for Apple is allow for small-scale clusters that could potentially be very useful in large computational projects. Remember, if the Mac Pro is replaced at this point, the new machine will likely be close to twice as fast CPU-wise and very interesting GPU-wise.

Quote:

Motion, Final Cut, Maya, Lightwave, you name it, they just get the shadow, reflection, refraction, whatever data from the pre-compute step and do the final render on the CPU, which takes a fraction of the time.

Again, this is all interesting, but Apple can't be in the business of special-purpose boxes. If they really want to offer up something like this, it has to fall out of a more general-purpose architecture that can ship in volume.

Quote:

GPUs are going up by a staggering amount. NVidia has a roadmap suggesting this year's Kepler GPUs will do 5 DP GFLOPs per watt and Maxwell (2013-2014) will be 15 DP GFLOPs per watt. This means that a 10W mobile Maxwell GPU in 2013-2014 will outperform the current high-end 6-core Xeon X5870. This year's mobile Kepler GPUs will be half of a Xeon chip.

This isn't lost on Apple; in fact I'd say they have had an eye on this reality for some time. Think about it: OpenCL and some of the other Apple initiatives have been with us for a while now. The payoff in good GPU performance is recognized, and if some of the rumors about AMD's Llano in Apple hardware were true, Apple has been struggling to get the optimal hardware in place. Good things don't happen overnight.

Quote:

Current Xeons can do about 100GFLOPs in 100Watts i.e 1 DP GFLOP per watt. NVidia has a white paper noting that the Tesla C1060 is 8x faster than the Xeon W5590 (p15) with around the same power draw:

Yeah, more specs. You have to remember the devil is in the details. GPUs currently are very good at certain types of floating-point processing, but they aren't the solution to every problem. So while amazing GFLOP numbers can be thrown around by NVidia and others, just realize that GPUs need the right sorts of data to work on to hit those numbers.

The S1070 used on Avatar is a 1U rackmount server with 4x high-end GPUs connecting to a host via PCIe. This link could be replaced by Thunderbolt as it's just a compute engine (not real-time).

With the right hardware, a Mac Pro, you could just plug in a compute card and bypass the slow TB link. Even better would be a future where every machine comes with a GPU with the resources to do compute.

Quote:

We can see this dramatic leap of power year after year with devices like the iPad. You never hear that the CPU jumps up to 9x performance in a single year.

Yes, but you have to understand the whys here. The things that a GPU does in a system are highly parallel. That means one gets almost 100% out of every processor added. You can't get that sort of advantage from most CPU applications no matter what you do. The fact that GPU hardware can be applied to a narrow range of other problems is sort of icing on the cake.
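The scaling argument here is essentially Amdahl's law: a workload that is almost entirely parallel gains nearly linearly from each added processor, while a typical application with a significant serial fraction plateaus quickly no matter how many processors you add. A minimal illustration:

```python
def amdahl_speedup(parallel_fraction, n_processors):
    """Amdahl's law: overall speedup when only part of a task parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_processors)

# A graphics-style workload (~99% parallel) keeps gaining from each processor:
print(round(amdahl_speedup(0.99, 16), 1))    # 13.9
# A typical CPU application (~50% serial) hits a hard ceiling near 2x:
print(round(amdahl_speedup(0.50, 16), 2))    # 1.88
print(round(amdahl_speedup(0.50, 1000), 2))  # 2.0
```

This is why GPU workloads get "almost 100% out of every processor added" while most CPU applications can't, regardless of core count.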

Apple can't really build or define hardware in terms of what the movie industry needs. Rather, whatever Apple does in the future has to appeal to the wider desktop audience. The Mac Pro already has a significant sales disadvantage due to its limited appeal. Specializing a box even more will not help Apple's desktop sales.

It looks like you may have missed the point. Apple do not need to produce specialized hardware. All Apple need to do beyond the current Thunderbolt and OpenCL support is provide some additional software, as Marvin has specified above. Third-party hardware manufacturers would then produce the Thunderbolt accessory hardware desired by the market. Once the solution is in place for rendering, third-party software developers would exploit it for other applications. The result is that an ordinary, unspecialized Mac Mini could be used (with the addition of third-party Thunderbolt hardware) to do computations which currently require a Mac Pro.

It looks like you may have missed the point. Apple do not need to produce specialized hardware. All Apple need to do beyond the current Thunderbolt and OpenCL support is provide some additional software, as Marvin has specified above. Third-party hardware manufacturers would then produce the Thunderbolt accessory hardware desired by the market.

My point is that neither Apple nor a third party can sustain such a product! Or, to phrase it as a question: how can such a product compete against a PC outfitted with a couple of compute cards? Mind you, a company building the specialized hardware would need to be able to finance ongoing operations from said hardware. At best a company might have a chance if the hardware is general-purpose enough to attract a wider audience.

Quote:

Once the solution is in place for rendering, third-party software developers would exploit it for other applications. The result is that an ordinary, unspecialized Mac Mini could be used (with the addition of third-party Thunderbolt hardware) to do computations which currently require a Mac Pro.

I can't ever see this being a success. You already have companies making compute-optimized servers, some with GPU acceleration, in rack-type clusters. An alternative as described would need to be cheaper and higher performance, something I don't see happening. Even for a single-box implementation I don't see a big advantage over a Pro with GPU acceleration cards. Maybe I'm missing something, but I just don't see the economics.

Apple could give a serious boost to the high-end rendering scene if they just connect a few dots. NeXT used to ship RenderMan in its OS to do 3D graphics. It is pretty much the benchmark for 3D post-production. What better way to supplement it than to offer the same capabilities Weta describes?

They have Thunderbolt to connect external boxes and they have OpenCL to run on them, along with their Core software architectures. All they have to do is get Pixar to develop a piece of core raytracing tech, e.g. Core Illumination, and allow scanline engines to access the pre-computed data.

Motion, Final Cut, Maya, Lightwave, you name it, they just get the shadow, reflection, refraction, whatever data from the pre-compute step and do the final render on the CPU, which takes a fraction of the time.

GPU performance per watt is going up at a staggering rate. NVidia has a roadmap suggesting this year's Kepler GPUs will do 5 DP GFLOPs per watt and Maxwell (2013-2014) will be 15 DP GFLOPs per watt. This means that a 10W mobile Maxwell GPU in 2013-2014 would outperform the current high-end 6-core Xeon X5870. This year's mobile Kepler GPUs will be half of a Xeon chip.

Current Xeons can do about 100 GFLOPs in 100 watts, i.e. 1 DP GFLOP per watt. NVidia has a white paper noting that the Tesla C1060 is 8x faster than the Xeon W5590 (p15) with around the same power draw.

The S1070 used on Avatar is a 1U rackmount server with 4x high-end GPUs connecting to a host via PCIe. This link could be replaced by Thunderbolt as it's just a compute engine (not real-time).

Edit: You aren't suggesting that they rendered Avatar on a single 1U server are you?

Avatar had a ton of CPUs there too. Some of what you're mentioning requires clarification, and editing down of the NVidia marketing Kool-Aid that they licensed from Apple (okay, I couldn't resist suggesting that; the concept makes me laugh).

I don't know where GPU-based rendering and number crunching will be in a couple of years. It's still in its infancy. I can't recall if RenderMan supports it at all. Mental Images incorporated some GPU functions into iray with select GPUs (although I haven't used iray). VRay RT incorporates it as well for draft renders. That's used more in visualization and advertising (as is Mental Ray), with film seemingly split between Mental Ray and RenderMan. Not every studio uses RenderMan or uses it exclusively. Maxwell Render also implemented a real-time GPU rendering preview in some of their stuff, and I think Modo grabbed a couple of GPU features. Anyway, a lot of this stuff started to pop up in 2010 or so. It's still in its infancy today, and there are a number of limitations on what the GPU can run. I haven't found enough information on it to understand the progression entirely, but regarding Apple, they've never supported that market very well.

My real issue with Apple isn't that the only solution must be the Mac Pro. It's that some of the heavier hardware options aren't fully supported under other solutions, and the quality control on the iMacs is atrocious (especially the displays). Apple seems to go cheap on things that the general consumer doesn't understand. They figure out what people will tolerate.

I don't think they'll see many gains here. If anything, Linux is a better point of migration for that industry. As of right now, half of the plugins and scripts don't run on the OSX versions compared to the Windows side, and many studio pipelines aren't built to support Macs.

Quote:

Originally Posted by mcarling

It looks like you may have missed the point. Apple do not need to produce specialized hardware. All Apple need to do beyond the current Thunderbolt and OpenCL support is provide some additional software, as Marvin has specified above. Third-party hardware manufacturers would then produce the Thunderbolt accessory hardware desired by the market. Once the solution is in place for rendering, third-party software developers would exploit it for other applications. The result is that an ordinary, unspecialized Mac Mini could be used (with the addition of third-party Thunderbolt hardware) to do computations which currently require a Mac Pro.

Ugh... you really need to look at your details. GPUs can receive data much faster than they would over Thunderbolt currently, and I'm not sure what protocols Thunderbolt supports anyway. Apple likes to tell us that it supports everything, but there are already a bunch of fringe examples where it simply doesn't work. They're lying or leaving out information. As to Marvin's suggestion, not even a proof of concept exists at this point, much less a functioning rig. You're basically turning the Mini into a thin client, which is pointless. I don't know how to hammer this in any better. The point of a workstation is for work that can't be centralized onto a server. You simply don't have that bandwidth with Thunderbolt, and Thunderbolt itself draws its bandwidth from the available PCI lanes, meaning that you'll never exceed that limit on a given machine. Further, no Thunderbolt standard that matches the transfer rate an internal GPU can currently receive is coming in the near future, and when one does, there's no guarantee that such bandwidth would run off the board type used by the Mini. It's more likely that it would be throttled to a slim chip like the MacBook Air's.
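To put rough numbers on the bandwidth gap being argued about here (these are the commonly cited theoretical maxima for first-generation Thunderbolt and PCIe 2.0; real-world throughput is lower):

```python
# Theoretical link bandwidths, in GB/s, for the interfaces discussed above.
# Figures are the commonly cited maxima; real-world throughput is lower.

tb1 = 10 / 8                   # first-gen Thunderbolt: 10 Gbit/s -> 1.25 GB/s
pcie2_lane = 0.5               # PCIe 2.0: ~500 MB/s per lane
pcie2_x16 = pcie2_lane * 16    # internal desktop GPU slot: 8.0 GB/s
pcie2_x4 = pcie2_lane * 4      # the PCIe back-end Thunderbolt hangs off: 2.0 GB/s

print(pcie2_x16 / tb1)         # 6.4: an internal GPU link is over 6x wider
```

Which is the substance of the objection: an external GPU on first-generation Thunderbolt gets a fraction of the host bandwidth an internal x16 slot provides.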

Quote:

Originally Posted by wizard69

You are assuming that everybody with a desire for a desktop has a workload that fits into that sort of arrangement. I suspect that is a small number of the pros using Apple hardware right now.

That is mostly marketing hype and self-promotion. Many industries have been looking at and incorporating GPU processing into their workloads.

The volume isn't there. This is why a generalized approach of building desktops that support OpenCL is a priority at Apple. By doing so they can ship hardware to a wider audience, which allows for reasonable pricing. If Apple were to start building specialized GPU processing hardware, the cost of that hardware would make the Mac Pros look cheap.

Thankfully Apple's approach is paying off, and they had the influence to get Intel to build OpenCL compatibility into their new hardware. So Apple will soon have coverage of OpenCL from the extreme low end to the high end, notably at reasonable prices.

Thunderbolt isn't exactly the best interface for this sort of thing, at least not at large scale. What TB might do for Apple is allow for small-scale clusters that could potentially be very useful in large computational projects. Remember, if the Mac Pro is replaced at this point, the new machine will likely be close to twice as fast CPU-wise and very interesting GPU-wise.

Bleh. The point being missed by him was largely that the server potential was already there for non-localized functions. Thunderbolt doesn't really change this, and the bandwidth isn't high enough to approximate that of a local connection anyway. I don't see where the misunderstanding lies between workstations and servers here, but it's definitely there. People only use workstations for long, server-like number crunching when it benefits them. This means that if they own one box already, it can either use spare CPU cycles to crunch numbers in the background or be allocated to this purpose when not in real use. This is a completely separate thing from functions which must be run in a localized manner due to I/O bandwidth and protocol constraints. If it wouldn't have been feasible in the past to offload something via Fibre Channel, what makes Thunderbolt a game changer there? I'm not sure where the misunderstanding lies, but putting together a clunky solution isn't going to attract more power users to the platform and promote sales growth of anything. It doesn't even make remote sense. You can do the same thing with a typical i7 PC desktop and simply have a moderately powerful desktop GPU there. Why downgrade to a Mini for the same money? And if you're powering a stronger GPU elsewhere, your power savings on desktops won't be much considering the heavier external GPU.

Also, in his reference to Boxx: they sell a lot of overclocked single-socket workstations. They don't run with Minis or laptop-like devices. I think they have a thin client or two. Anyway, the point was that externalized hardware only works for part of the process, and Apple isn't even a huge contender there. In desktops the Mac Pro has remained in use simply because it lacks some of the issues of the others, especially in integrating with existing hardware. Apple goes really cheap on ports, and Thunderbolt is still lacking much development a year later. I just don't see the case to chop everything down to a Mini when there are better solutions even to the problems suggested by Marvin, which aren't commonly run on Macs anyway.

My point is that neither Apple nor a third party can sustain such a product! Or, to phrase it as a question: how can such a product compete against a PC outfitted with a couple of compute cards? Mind you, a company building the specialized hardware would need to be able to finance ongoing operations from said hardware. At best a company might have a chance if the hardware is general-purpose enough to attract a wider audience.

I can't ever see this being a success. You already have companies making compute-optimized servers, some with GPU acceleration, in rack-type clusters. An alternative as described would need to be cheaper and higher performance, something I don't see happening. Even for a single-box implementation I don't see a big advantage over a Pro with GPU acceleration cards. Maybe I'm missing something, but I just don't see the economics.

What you seem to be missing is software. People don't buy a Mac Pro rather than a PeeCee just because they like Apple hardware. People buy Macs because they want OSX and the software solutions which it provides. If the software is sufficiently better, the hardware doesn't need to be the cheapest.

Quote:

Originally Posted by hmm

Ugh... you really need to look at your details. GPUs can receive data much faster than they would over Thunderbolt currently, and I'm not sure what protocols Thunderbolt supports anyway. Apple likes to tell us that it supports everything, but there are already a bunch of fringe examples where it simply doesn't work. They're lying or leaving out information. As to Marvin's suggestion, not even a proof of concept exists at this point, much less a functioning rig. You're basically turning the Mini into a thin client, which is pointless. I don't know how to hammer this in any better. The point of a workstation is for work that can't be centralized onto a server. You simply don't have that bandwidth with Thunderbolt, and Thunderbolt itself draws its bandwidth from the available PCI lanes, meaning that you'll never exceed that limit on a given machine. Further, no Thunderbolt standard that matches the transfer rate an internal GPU can currently receive is coming in the near future, and when one does, there's no guarantee that such bandwidth would run off the board type used by the Mini. It's more likely that it would be throttled to a slim chip like the MacBook Air's.

Bleh. The point being missed by him was largely that the server potential was already there for non-localized functions. Thunderbolt doesn't really change this, and the bandwidth isn't high enough to approximate that of a local connection anyway. I don't see where the misunderstanding lies between workstations and servers here, but it's definitely there. People only use workstations for long, server-like number crunching when it benefits them. This means that if they own one box already, it can either use spare CPU cycles to crunch numbers in the background or be allocated to this purpose when not in real use. This is a completely separate thing from functions which must be run in a localized manner due to I/O bandwidth and protocol constraints. If it wouldn't have been feasible in the past to offload something via Fibre Channel, what makes Thunderbolt a game changer there? I'm not sure where the misunderstanding lies, but putting together a clunky solution isn't going to attract more power users to the platform and promote sales growth of anything. It doesn't even make remote sense. You can do the same thing with a typical i7 PC desktop and simply have a moderately powerful desktop GPU there. Why downgrade to a Mini for the same money? And if you're powering a stronger GPU elsewhere, your power savings on desktops won't be much considering the heavier external GPU.

There are some computational applications for which Thunderbolt has ample bandwidth and the tighter integration of a Thunderbolt connection may offer advantages over an Ethernet connection. It doesn't have to be the best solution for all customers in order to be the best solution for some customers.

Again, I'm not suggesting that Apple make any specialized hardware. I believe it's enough for Apple to provide some software. If there is sufficient market demand, someone will build Thunderbolt GPU compute engines. If demand is insufficient, then they won't be built.

While I'd like a bigger 'mini'...(I'd buy a Cube tomorrow...) I don't see Apple doing that. (I'm not sure what it can give me that an iMac can't.) Otherwise it wouldn't be a mini. The iMac isn't going anywhere...it firmly occupies the 'mid-tower' ground from £995 to £2k. So a mini-Pro? That would be overlap. Hey, they USED to sell the entry tower at £1495 (something like that...) but now they don't. Again. I don't think that's an accident.

Are they really going to develop a new desktop when they just canned their cheaper laptop entry model, the 'Macbook?' (A good servant for the laptop sales...but they canned it all the same.) Looks like the iPad will take care of the 'netbook'/cheap laptop area.

If they redesign the Pro...to a mini-Pro...where's it going to sit, price wise?

I'd gladly pay up for a cube that DOESN'T max out at 16GB RAM (4-64GB please), single processor (dual processor option please), average graphics ('nough said) and difficult-to-replace HDD/SSD (3/4 bays with hardware RAID please). Price is less relevant. Please give us some more headroom, Apple! A Mini-Pro (in cube form) would be awesome.

What you seem to be missing is software. People don't buy a Mac Pro rather than a PeeCee just because they like Apple hardware. People buy Macs because they want OSX and the software solutions which it provides. If the software is sufficiently better, the hardware doesn't need to be the cheapest.

My point is that systems like this, that is computational clusters, always have a cost consideration. That is why so many are built out of generic hardware and Linux.

Now could Apple create a captive market around niche hardware and software? It is certainly possible but unlikely to be profitable unless they can successfully charge a great deal for said hardware. I don't see it happening.

Now, others have suggested leading with software attached to the hardware to, I guess, wedge themselves into the high-performance computing market. That is, start with video and make sure the tools are there to support more general needs. This isn't impossible, but this market is already well served by others and it is still a niche market. Apple can play in this market much more easily by simply equipping their machines with OpenCL hardware and by having a slot to support another GPU card. I just don't see the feasibility or attraction in a headless GPU computation box.

Quote:

There are some computational applications for which Thunderbolt has ample bandwidth and the tighter integration of a Thunderbolt connection may offer advantages over an Ethernet connection. It doesn't have to be the best solution for all customers in order to be the best solution for some customers.

There may well be TB advantages in the future, and for small systems there may be now. The problem is that Ethernet and other standards have a tremendous amount of ready-to-go infrastructure that can scale. Depending upon the tech, you might have hundreds of nodes on an Ethernet-based system. It isn't clear to me that TB can support such systems. In fact, I'm pretty sure it can't.

Now, TB might effectively interface you to a couple of boxes or a small cluster if the "hub" Mac had several TB ports. That has significant potential; however, you wouldn't be doing Avatar on it.

Quote:

Again, I'm not suggesting that Apple make any specialized hardware. I believe it's enough for Apple to provide some software. If there is sufficient market demand, someone will build Thunderbolt GPU compute engines. If demand, is insufficient, then they won't be built.

Apple already provides much of the software required. The problem is nobody builds clusters or compute engines around Apple hardware and software, at least not recently. Even if somebody were to go after this market aggressively, there would be a huge cost advantage to simply supporting a GPU compute card on the PCI Express bus.

Your post boils down what is needed in a desktop to a few essentials. This is really key to a viable desktop from Apple.

Quote:

Originally Posted by McCrab

I'd gladly pay up for a cube that DOESN'T max out at 16GB RAM (4-64GB please),

I don't currently have need of RAM past 16GB but I'm not at all pleased with artificial restrictions here. So support for more slots would be very nice.

Quote:

single processor (dual processor option please),

This I debate with myself a bit. I see dual-socket support as a bad idea. It adds size and expense that we are trying to get away from.

Quote:

average graphics ('nough said)

Average means different things to different people. Ideally the machine would support a discrete GPU working in conjunction with an integrated GPU, at least on a low-end model that uses an Ivy Bridge-type chip.

Why? Well, Intel GPUs really aren't that great and probably never will be. However, the combination of an integrated and a middle-of-the-road discrete GPU could lead to very nice performance and be easily leveraged for compute usage. I just see a modest amount of hardware doing very well for the price, especially if the GPU uses AMD's or NVidia's latest architectures, which they are now debuting in lower-performance versions.

Difficult to replace? No way; sadly, Apple does better with their laptops these days. Apple needs to pull its head out of you-know-where and actually make a serviceable desktop machine.

As to disk drives, hardware RAID really isn't needed anymore. With all the cores in today's processors, software RAID is fine. While drive bays are still needed, I want to see a move to a high-speed PCI Express port for solid-state storage cards. Today that would be a boot and app drive, but the payoff is huge.

Quote:

Price is less relevant. Please give us some more headroom Apple! A Mini-Pro (in cube form) would be awesome.

Yeah, something like that. It would likely be a bit bigger, and it would have to support a slot in at least one form. The important thing is that it must depart from legacy devices and be as minimalist as possible. Price is important, as we don't want a failure on our hands like the Mac Pro.

At least people are recognizing the Pro's very old design these days. Take a ten-year-old laptop and compare it to today's; things have changed a lot. Not so much with the Mac Pro, where each revision just freshens the architecture without innovating.

Good question, though one needs to have confidence in guesses about current Mac Pro sales. I could see sales of less than 10k a month. Obviously internal information would be required to say for sure, and frankly the upper-tier models are the only ones some would consider, so sales could actually be biased towards the high end.

Please get a grip here: the MacBook was replaced by something that better serves the same market. Every time I see comments like this I shake my head; it was pretty clear where the MacBook was going the first time Apple debuted the refactored AIRs at a more reasonable price.

This constant falling back to performance is a waste of time, because for every case one can find of an iMac smacking a Pro around, a like number can be found where the Pro smacks the iMac, notably a lot harder than the iMac can manage to smack the Pro.

More importantly I don't really see a lot of Pro sales going to people simply focusing on performance, at least not CPU performance. They are instead buying for the other capacities that the machine offers.

It all depends upon how you look at the numbers. Apple's Mac sales are cooling off, and the desktop market for Apple in the US is absolutely flat. Also, look at the stores and what you see most prominently: the low-cost products. It isn't so much that devices are selling well because of portability but rather because of cost.

Yes, but look back through all of those years and realize that Apple did have reasonable desktop offerings. Even the Mini was a more reasonable offering. Not that the Mini is extremely bad, just that one could go to Apple and get a complete system at a reasonable cost. Today the Mini is more expensive and the monitor solution is terrible. It is no surprise, then, that the Mini is the sales darling of the online industry, where it is packaged with reasonable screen solutions.

PPC became a joke performance-wise, so they got beaten up simply due to real-world performance. It is the pricing of the low end that really bothers me. Apple simply doesn't have anything remotely competitive as a midrange desktop machine.

How in the hell can you make any of those statements? There is nothing about an XMac that would be unprofitable in the least. The Pro is likely the least profitable machine in Apple's lineup. Further, a properly built XMac would be more of a successor to the Apple II.

Exact specs? How about using your imagination? There are many ways for Apple to realize an XMac. What you need to see, though, is that this would in part be a total overhaul of the Mac lineup. Gone would be the current Pro at the high end and the Mini at the low end. Instead we would have a scalable platform that could cover the entire performance spectrum.

Beyond that, Macs don't have any more momentum portability-wise. The surge, or better yet the ramp, will be biased towards iOS devices more and more every quarter. If the momentum was there, Apple wouldn't be sending me ads for AIRs and other Mac hardware.

A model range that replaces both the Pro and Mini and takes a few iMac sales along with it could easily sell 2 million or more a quarter.

Why would you bring up the Cube, which back then was a terrible value relative to anything else, just like most of today's Mac Pros? If anything, Apple learned nothing from the Cube, as they repeated that insanity with the original AIRs. At least with the AIRs they pulled their collective heads out of a dark place and produced something the market could eventually accept.

Tides go in and out; currently the tide is receding.
Let me count the ways.

A much larger screen that fits my needs.

The ability to actually add disk storage to the base configuration.

Serviceability.

Speed: a desktop can easily exceed the performance of any portable.

Expandability or configurability. And no I'm not talking GPUs here.

A desktop actually reduces clutter. Throw a laptop on a desk and you have to connect a bunch of wires to it.

I need someplace to store everything and to backup all of those portable devices.

This makes no sense at all. Average consumers shop around and thus are more sensitive to value than your so-called die-hards. They can perceive a bargain and a screwing.

Do you really understand business at all? I can't tell you how many times I've been pulled off primary tasks to solve the problem of the moment. We aren't even talking consumer electronics here.

Apple's situation is no different from John Deere's: they sell big and little tractors and everything in between. They do not give up on one segment just because another is hot.

Just because you can constantly repeat something does not make it true. The iMac can't possibly smack the Pro unless you are very narrowly focused on optimal benchmarks. Your constant obsession here tells me you have no idea of, or appreciation for, the Pro's advantages.

I'm with you on pricing, though. The whole discussion about the XMac revolves around the idea of a far more cost-effective desktop lineup.

It would be very bad voodoo for Apple to abandon the desktop.

If this was true I'd almost be happy. The fact is the iMac has soaked up nothing. Apple is in fact losing sales. People that want a desktop are simply going elsewhere.

The iMac is not a replacement for either the Mini or the Pro; it is as simple as that.

Considering that the iMac is in a category of its own and is the only machine with some success there is no reason to can it. The Mac Pro and Mini on the other hand could die tomorrow and no one would bat an eye.

Again you have a very narrow view of reality. There is still a strong demand for performance that can't be had in a portable. That is the driver for desktop machines. Apple doesn't need a Tower in the sense of the Mac Pro to deliver such performance any more, that we can agree upon. However neither the iMac nor the Mini can deliver such performance. The obvious response is a midrange capable machine.

Interestingly, you are halfway to an XMac. Just throw in modern technology and you would be all set. In fact you seem to have pretty much confirmed in my mind why an XMac is needed. You dismiss the XMac one moment but then turn around and concede that some of the points offered up are important, suggesting what could be considered an XMac.

The more thought that people put into this the more obvious it should become that Apple needs to do something about the Mini and the Pro. XMac is simply a concept that they can build a new family around.

2 million desktop sales? Fantasy. Turn the tide from 4:1 to 2:1? Sure. And Apple won't charge a premium on said line like they currently do with everything you want in it?!? You think the extra processor will add cost? Ya think?

Unlikely. *Passes a 'reality' sandwich for Wizard to get a 'grip' on.

Build their desktop around your 'X-Mac.' A complete 180 in design direction? Very unlikely.

*Apple don't want you to service your machine.

*Apple don't want to sell dirt cheap desktop machines.

*Apple don't want them too expandable.

*The market is buying laptops. (They sit onto desks much easier! And you can even carry them!
They have most of the desktop's power these days! Want a bigger screen? Buy one. Apple will sell you one too!)

*That's the 'desktop' they're selling most of these days. It's not your desktop. But Apple doesn't care because they're selling 5 million Macs a quarter, i.e. record numbers, i.e. without this fabled X-Mac.

This is how they make their money. Buy one, and in 3 years buy another.

So keep dreaming for those features. That's where you'll find them.

www.apple.com Check out the store. That's what they're selling. There is no X-Mac.

You'll *maybe* get a choice of GPU and HD at point of sale. That's your internal expansion.

If you want to add some more? You can just about do a ram module yourself. That's about as much as they want a user to do.

I guess you can add an external hard drive. There's Firewire, USB and Thunderbolt.

Ironic that you're clinging onto a MacBook Pro. Keep clinging to your 'desktop'. Because you'll be waiting for the X-Mac far longer than I did...

No harm in wishing for this 'X-Mac.' You'll be waiting in hope.

The 'desktop' is changing. Just not in the direction you want.

'It doesn't matter how many times you say it...it doesn't make it...' happen.

As for performance? Why do you want this X-Mac then? Don't cite performance of a desktop over a laptop and then say it isn't important. Apple will sell you decent performance at a premium price. They don't offer bleeding edge performance machines. They offer compact design. Elegance. Zen. But you'll pay for it. 'It doesn't matter how many times...etc.' you rail against it Wizard. That's what they're offering.

Try their feedback page and see how long it takes for the X-Mac to hit the stores.

Get the Ivy Bridge Mac Mini when it lands and be happy. What kind of work are you doing anyhow?

Lemon Bon Bon.

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

Beyond that, Macs don't have any more momentum portability-wise. The surge, or better yet the ramp, will be biased towards iOS devices more and more every quarter. If the momentum was there, Apple wouldn't be sending me ads for AIRs and other Mac hardware.

Sure. That's why they're selling more Mac laptops than ever before in RECORD NUMBERS. *(Dragged by the halo of iOS and Apple Store.)

Lemon Bon Bon.


The fact is the iMac has soaked up nothing. Apple is in fact losing sales. People that want a desktop are simply going elsewhere.

Back that up with some facts. That's pure conjecture because you don't like the iMac. An absurd statement. Apple is losing sales? From who? They're selling Macs in record numbers. (Oh. I guess, they're losing a sale...uhm...to you..?)

Yes. The 'people' who used to want a desktop are going elsewhere. Laptops.

Uhm...if Apple were to drop the prices on their overpriced desktops they'd sell more? Well, yeah...

Apple used to sell about 1 million Macs a quarter once upon a time. It used to be in favour of desktops, but that changed over time to laptops. That's because laptops offer enough power for 9/10 uses for most people.

Once the affordable iMac came on the scene it impacted PowerMac sales. As the iMac grew in power and occupied the mid-tower range and the Mac Pro moved on up...iMac sales impacted it more. There's a reason why Apple don't do sales breakouts anymore. Because certain lines would look weak. In this case, I'm guessing the Pro would.

The world economy is in the crapper. Newsflash. People want value. As far as desktops are concerned? That's the iMac. For laptops, it's the Air.

Lemon Bon Bon.


Apple just can't offer a screenless desktop at a reasonable price. Look at the inflated price of the Mini: no keyboard, mouse or screen, and £529 to get started. Add in the 27-inch Apple monitor and you get that squeezy feeling in your wallet.

It's *almost* always been this way. *(Remembers the candy iMac getting down to £545 and the PPC Mac Mini at sub-£400, notable examples of cheap desktops Apple offered a long time ago.) Their all-in-ones, whether desktop or laptop, offer 'better' value relative to the Pro or Mini, if you're happy with Apple's premium pricing to start with.

Lemon Bon Bon.


Again you have a very narrow view of reality. There is still a strong demand for performance that can't be had in a portable. That is the driver for desktop machines. Apple doesn't need a Tower in the sense of the Mac Pro to deliver such performance any more, that we can agree upon. However neither the iMac nor the Mini can deliver such performance. The obvious response is a midrange capable machine.

www.apple.com. That's Apple's reality. And their 5 million in sales of Macs in record numbers make your 'narrow' view of reality precisely that. A minority as far as Apple customers are concerned.

Apple are a portable computing company. All their promo shots show laptops, even with pro(sumer) software like Final Cut. 4 million+ laptops.

Apple will release the iPad tomorrow. Zillions of iPads sold already.

That's Apple's reality. Not so narrow.

Lemon Bon Bon.


Ridiculous. They've outpaced the PC market's growth for how many quarters now?

Lemon Bon Bon.

That's the one area where I think you need a little perspective. Their numbers have been good there, but the Apple fan sites like to spin them whichever way sounds best. I doubt they are picking up much iMac growth, and studies seem to include the iPad with laptops whenever it suits them (again, it would be more appropriate if it was a functional standalone machine).


So you're saying Apple is lying about selling more Macs every quarter than previous?


Nooo, I'm saying the guys that publish these articles spin the numbers to look bigger or smaller. When they talk about desktop growth, Apple is a relatively small player compared to some of the others, and the growth in that market segment even within Apple is winding down. Also note that when it's based on total Macs, the laptops significantly outpace everything else. Apple could spread the love around their line a bit, considering it's a small product line regardless for a company of that size. They seem to leave some of the older designs to the momentum of the brand itself.

If Apple were to start building specialized GPU processing hardware, the cost of this hardware would make the Mac Pros look cheap.

Apple wouldn't have to build that part themselves and it wouldn't be that expensive. Not much more than a Pegasus RAID system.

Quote:

Originally Posted by wizard69

Remember at this point, if the Mac Pro is replaced, the new machine will likely be close to twice as fast CPU-wise and very interesting GPU-wise.

Doubtful. We are only just getting Sandy Bridge now, so not even close to double after nearly two years.

Quote:

Originally Posted by wizard69

If they really want to offer up something like this it has to fall out of a more general purpose architecture that can ship in volume.

Fast GPU options will ship in far higher volumes than 12 CPUs. The GPUs can at least be used for visual tasks.

Quote:

Originally Posted by wizard69

So while amazing GFLOP numbers can be thrown around by NVidia and others just realize that GPU's need the right sorts of data to work on to get those numbers.

Like in Apple's Core software architecture. They can do it for encoding too - Core Compressor. This can be an API used by all apps to do batch image and movie compression. FCPX exports can be done in a fraction of the time.
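As a toy illustration of what such a batch API might look like ("Core Compressor" is purely hypothetical, and all names here are made up), here's a Python sketch that fans compression jobs out across workers, the way a framework could fan them out to the GPU:

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

def batch_compress(frames, level=6, workers=4):
    # Toy stand-in for a hypothetical batch-compression API: the caller
    # hands over many buffers and the framework schedules them in parallel.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda buf: zlib.compress(buf, level), frames))

frames = [bytes(1024) for _ in range(8)]   # eight dummy "frames"
compressed = batch_compress(frames)
```

The point is only the shape of the API: apps submit a batch and the system decides where and how the work runs.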

Quote:

Originally Posted by wizard69

With the right hardware, a Mac Pro, you could just plug in a compute card and bypass the slow TB link.

You can't fit 4 or more high-end GPUs in a Mac Pro, so you have to install a card that effectively connects to an external box anyway. You may as well use Thunderbolt. When it comes to computation, the link bandwidth only matters if you have a huge dataset that you are accessing on the host. The compute box will have its own storage, so the link speed doesn't matter.
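The bandwidth point is easy to sanity-check with arithmetic. As a rough sketch, assuming first-generation Thunderbolt's 10 Gbit/s channel and ignoring protocol overhead, a one-off scene upload costs only seconds:

```python
def transfer_seconds(gigabytes, gbit_per_s=10.0):
    # Time to push a payload over the link, ignoring protocol overhead.
    return gigabytes * 8 / gbit_per_s

upload = transfer_seconds(4)   # a 4 GB scene
print(round(upload, 1))        # a few seconds, then the box computes locally
```

So the link only becomes the bottleneck if you stream a huge host-side dataset continuously rather than uploading once and computing on the box.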

Quote:

Originally Posted by wizard69

Even better would be a future where every machine comes with a GPU with the resources to do compute.

Yeah but you'll always be able to stick 4 or more on the outside of the machine.

Quote:

Originally Posted by wizard69

The things that a GPU does in a system are highly parallel. That means one gets almost 100% out of every processor added. You can't get that sort of advantage from most CPU applications no matter what you do. The fact that GPU hardware can be applied to a narrow range of other problems is sort of icing on the cake.

There are only a small set of very highly computationally intensive tasks though. That's highlighted in the Weta example. The final render is done in Renderman on CPUs. The heavy raw computation is done on the GPU.
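The scaling claim in the quote is essentially Amdahl's law: a near-perfectly parallel GPU workload keeps paying off as processors are added, while a typical CPU application with a serial portion flattens out quickly. A quick sketch with illustrative fractions:

```python
def amdahl_speedup(parallel_fraction, n_processors):
    # Amdahl's law: overall speedup when only part of the job parallelizes.
    serial = 1 - parallel_fraction
    return 1 / (serial + parallel_fraction / n_processors)

# A GPU-friendly workload (99% parallel) keeps scaling with 100 processors...
gpu_like = amdahl_speedup(0.99, 100)
# ...while a typical CPU app with a 30% serial portion flattens out early.
cpu_like = amdahl_speedup(0.70, 100)
```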

Quote:

Originally Posted by wizard69

Or to phrase it as a question how can such a product compete against a PC outfitted with a couple of compute cards?

How many USB tuners would you expect Elgato to sell compared to PCI-based Win-TV tuners? When you have a product you can buy off the shelf and plug into a port, compared to opening up a machine and installing a card in a PCI slot, your audience is vastly bigger, not least because you can appeal to laptop owners.

Quote:

Originally Posted by hmm

You aren't suggesting that they rendered Avatar on a single 1U server are you?

Nope.

Quote:

Originally Posted by hmm

I'm not sure what protocols Thunderbolt supports anyway. Apple likes to tell us that it supports everything, but there are already a bunch of fringe examples where it simply doesn't work.

It's the same as PCI. What examples are there where it can't be used?

Quote:

Originally Posted by hmm

As to Marvin's suggestion, not even a proof of concept exists at this point, much less a functioning rig. You're basically turning the Mini into a thin client, which is pointless.

Render farms with thin client controllers have a similar setup. The server manager isn't going to go round with a USB pen and copy files to each server separately. They are all hooked up to a central location. The setup I suggest is simply a way for a single workstation user to have a powerful main machine and a simple compute cluster for intensive tasks.

Quote:

Originally Posted by hmm

The point of a workstation is for work that can't be centralized onto a server.

It's not really on a remote server though. It's more like a co-processor. Think of the following scenario:

You have a 6-core Xeon Mac Pro and are working in Maya. When it comes to rendering, you plug in your Thunderbolt S1070 compute box and start the heavy computation step. When it is finished, you do your test renders on the CPU; they take very little time because the heavy computation was done 25x faster than the CPU would have managed.

Instead of paying $1500 for a second 6-core Xeon, pay $1500 for a GPU cluster. The results are shown in CPU vs GPU configurations:

You can see the single CPU 6-core Mac Pro in the CPU list with a score of 360. This CPU costs $1200 from Apple.
4 x Radeon HD7970 scores 3205. Those cost $480 each.

Long story short, if you get a 12-core Mac Pro for rendering, it's not the best use of money by a long way. Same for scientific computation. What's the point in shipping a 12-core Mac Pro if heavy computation is far better suited to the GPU?
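Taking the scores and prices quoted above at face value, the performance-per-dollar gap is easy to put a number on:

```python
# Scores and prices as quoted in the post above.
cpu_score, cpu_cost = 360, 1200        # single 6-core Xeon option
gpu_score, gpu_cost = 3205, 4 * 480    # 4 x Radeon HD 7970

cpu_ppd = cpu_score / cpu_cost         # benchmark points per dollar
gpu_ppd = gpu_score / gpu_cost
print(round(gpu_ppd / cpu_ppd, 1))     # the GPUs win by roughly 5-6x per dollar
```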

Quote:

Originally Posted by hmm

If it wouldn't have been feasible in the past to offload something via fibre channel, what makes thunderbolt a game changer there?

Thunderbolt isn't just an IO protocol. You can't run a GPU over a Fibre Channel link.

Quote:

Originally Posted by MacRonin

Inserting Renderman into OS X as a Core component would be pretty sweet

And the simple fact of it being a bundle-in with the older NeXT OS, which became Mac OS X, which became OS X, could be a sort of leverage

Petition to the Mouse?!? ;^p

I would say that Renderman shouldn't be the core component though as it needs to be improved constantly, rather just a very specialised ray casting engine that pre-computes lighting. This is by far the slowest process. It can then bake the data and allow any number of final render engines to use it. It would be like Core Image in a way. It's not a case of having Core Photoshop but Core filters that process specific effects that can be used by apps like Quartz Composer or Pixelmator without starting from scratch.
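The bake-once, consume-many idea can be sketched in a few lines. Everything below is illustrative only (no such Apple framework exists; the function names are made up); the point is that the expensive precompute happens once and any number of final-render engines read the result:

```python
# Illustrative sketch: slow lighting "bake" runs once, cheap consumers read it.

def bake_lighting(scene_points):
    # Stand-in for the slow ray-cast precompute step.
    return {p: 1.0 / (1 + p) for p in scene_points}   # fake irradiance values

def scanline_shade(baked, point):
    # A final-render engine just looks the precomputed data up: cheap.
    return baked[point] * 255

baked = bake_lighting(range(4))                        # slow step, done once
pixels = [scanline_shade(baked, p) for p in range(4)]  # fast final pass
```

Any scanline engine could play the role of `scanline_shade`, which is the "Core filters, not Core Photoshop" idea in miniature.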

Yeah, $2000 per seat in the volumes they ship. Color used to cost $25,000. Software is only priced in such a way that it makes a profit after paying the people who developed it. Apple sells 20 million Macs a year, and before Pixar was bought by Disney, their earnings were:

"In addition to film revenue, software licensing contributed $14.4 million to full year 2005 revenue."

They could sell Mountain Lion for $31 with Renderman bundled and make more profit than that. Even though some companies have servers with thousands of machines, the license isn't a yearly cost. It would be nice if companies could do that without being anti-competitive. If Adobe convinced Microsoft and Apple to ship the Adobe CS Suite with the OS, they could charge less than $5 on top of the cost of the OS. It's not that much more anti-competitive than Apple bundling iMovie.
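The per-Mac arithmetic behind that claim, using the figures quoted:

```python
licensing_revenue = 14_400_000   # Pixar's 2005 software licensing revenue, as quoted
macs_per_year = 20_000_000       # approximate annual Mac sales, as quoted

per_mac = licensing_revenue / macs_per_year
print(round(per_mac, 2))         # well under a dollar per Mac sold
```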

Still, I wouldn't advocate bundling software like that as it's too complex, I was talking about core computation engines that complex apps can link up to like the Core Image or Quicktime frameworks. After Effects doesn't need to roll its own whole media API for rendering out to a movie like it would on Linux. The same could apply for rendering lighting, although the more complex the computation, the less reusable it is generally.

Apple wouldn't have to build that part themselves and it wouldn't be that expensive. Not much more than a Pegasus RAID system.

It depends if we're talking generic GPUs or something comparable to Teslas, which remain quite expensive. Someone did a comparison on this. Also, in some of the applications that can harness GPU functions, one of the limitations seems to be VRAM, which could become less of an issue if this gains popularity.

Quote:

Originally Posted by Marvin

Doubtful. We are only just getting Sandy Bridge now, so not even close to double after nearly two years.

Well you can still buy real growth, but at comparable price points, it's remained fairly static.

Quote:

Originally Posted by Marvin

Fast GPU options will ship in far higher volumes than 12 CPUs. The GPUs can at least be used for visual tasks.

Like in Apple's Core software architecture. They can do it for encoding too - Core Compressor. This can be an API used by all apps to do batch image and movie compression. FCPX exports can be done in a fraction of the time.

That would be pretty cool.

Quote:

Originally Posted by Marvin

You can't fit 4 or more high-end GPUs in a Mac Pro, so you have to install a card that effectively connects to an external box anyway. You may as well use Thunderbolt. When it comes to computation, the link bandwidth only matters if you have a huge dataset that you are accessing on the host. The compute box will have its own storage, so the link speed doesn't matter.

You can't in a Mac Pro. There are apparently a couple of workstations that accommodate such a thing, but you will pay a lot. Just the power needed for the GPUs alone is pretty intense.

Quote:

Originally Posted by Marvin

It's the same as PCI. What examples are there where it can't be used?

It supposedly supports displayport 1.2 yet certain displays just don't work. Anandtech tested the overall performance as being lower with the connection maxed on displays. 10 bit displayport was only supported on one or two mac pro cards in Leopard. It does not seem to be currently possible over thunderbolt. This is one of my massive irritations as I would use it. Apple doesn't care because their display is unaffected as it lacks this feature.

Quote:

Originally Posted by Marvin

Render farms with thin client controllers have a similar setup. The server manager isn't going to go round with a USB pen and copy files to each server separately. They are all hooked up to a central location. The setup I suggest is simply a way for a single workstation user to have a powerful main machine and a simple compute cluster for intensive tasks.

Again I never suggested that a mac pro was primarily a render box. I said you could either set it as a low background priority to fill unused cpu cycles or run such a thing when you're not using it via a job management application. It wouldn't be the primary use. It would just be something to maximize its use.

Quote:

Originally Posted by Marvin

It's not really on a remote server though. It's more like a co-processor. Think of the following scenario:

You have a 6-core Xeon Mac Pro and are working in Maya. When it comes to rendering, you plug in your Thunderbolt S1070 compute box and start the heavy computation step. When it is finished, you do your test renders on the CPU; they take very little time because the heavy computation was done 25x faster than the CPU would have managed.

Instead of paying $1500 for a second 6-core Xeon, pay $1500 for a GPU cluster. The results are shown in CPU vs GPU configurations:

You can see the single CPU 6-core Mac Pro in the CPU list with a score of 360. This CPU costs $1200 from Apple.
4 x Radeon HD7970 scores 3205. Those cost $480 each.

Long story short, if you get a 12-core Mac Pro for rendering, it's not the best use of money by a long way. Same for scientific computation. What's the point in shipping a 12-core Mac Pro if heavy computation is far better suited to the GPU?

I'd never suggest a 12-core Mac Pro as a rendering box. That's a waste of money. Really, any Mac hardware dedicated to such a function is a complete waste of money. I need to find more details on GPU rendering, but it would have seen better growth if it didn't have limitations somewhere or require complex code revision to make it work. You've probably seen Maxwell Fire (not a big fan of Next Limit). They use the GPU for real-time preview work there, yet it has no real effect on final rendering times. There has to be a reason for that, and obviously that doesn't mean it will remain this way. Apple is still coasting on their current product lines anyway. I don't think they even handle Mac Pro revisions internally. I doubt there is any Mac Pro team. It's most likely just handled at Foxconn or in a sense spun off, but none of the current lines make real sense as successors.

Quote:

Originally Posted by Marvin

Thunderbolt isn't just an IO protocol. You can't run a GPU over a Fibre Channel link.

Quote:

Originally Posted by Marvin

I would say that Renderman shouldn't be the core component though as it needs to be improved constantly, rather just a very specialised ray casting engine that pre-computes lighting. This is by far the slowest process. It can then bake the data and allow any number of final render engines to use it. It would be like Core Image in a way. It's not a case of having Core Photoshop but Core filters that process specific effects that can be used by apps like Quartz Composer or Pixelmator without starting from scratch.

That makes more sense although I don't understand the methods of calculation there well enough to know how that would work. I noticed comments on some of the earlier gpu rendering implementations of it taking a certain amount of time to compile the scene for the gpu. Certain shaders and things were not supported as well. I can't find completely consistent information in terms of limitations there.

Quote:

Originally Posted by Marvin

Yeah, $2000 per seat in the volumes they ship. Color used to cost $25,000. Software is only priced in such a way that it makes a profit after paying the people who developed it. Apple sells 20 million Macs a year, and before Pixar was bought by Disney, their earnings were:

"In addition to film revenue, software licensing contributed $14.4 million to full year 2005 revenue."

They could sell Mountain Lion for $31 with Renderman bundled and make more profit than that. Even though some companies have servers with thousands of machines, the license isn't a yearly cost. It would be nice if companies could do that without being anti-competitive. If Adobe convinced Microsoft and Apple to ship the Adobe CS Suite with the OS, they could charge less than $5 on top of the cost of the OS. It's not that much more anti-competitive than Apple bundling iMovie.

Still, I wouldn't advocate bundling software like that as it's too complex, I was talking about core computation engines that complex apps can link up to like the Core Image or Quicktime frameworks. After Effects doesn't need to roll its own whole media API for rendering out to a movie like it would on Linux. The same could apply for rendering lighting, although the more complex the computation, the less reusable it is generally.

I'm aware of this stuff. PowerAnimator used to cost $30k or whatever. Smoke was close to $100k with the dedicated hardware. I got that stuff. It is common practice elsewhere too: Mental Ray ships by default with several packages, although it has a lot of quirks. If standalone Mental Ray was the default, it would most likely cost more and see lower adoption rates. Personally I thought the lawsuit over Windows being bundled was dumb when they also offer Linux options.

Actually that kind of development could be a good thing, but it's a departure from your previous statement that a Mini was ideal. I really don't care what the box looks like. In the case of Thunderbolt, if the available IO is suitable for such a rig, it could be good. To do any kind of box with 4 GPUs you'd need a really beefy power supply and a lot of cooling (although according to my email, Boxx just created one that's probably over $10k fully configured). Then of course in terms of IO, you still need enough lanes to pump out that much Thunderbolt bandwidth, as they subtract from the available PCI lanes.

I wasn't speaking so much of rendering stages anyway, which are commonly done on a server or in a distributed manner (workstations could be put to use at night for such a task). I was saying that it's typical for functions that run as close to real time as possible to remain local rather than being split off, which generally works better for longer number-crunching sequences. The only times I've really seen people use a workstation as a primary rendering box is when they get a new one (they'll sometimes put the old one to that task).

If you mainly deal with stills for print compositing (something I plan to move away from), you can do that part on almost anything; 6-8k stills beat up the RAM more than anything else. It's just that working in a high-poly scene without hiding a bunch of stuff, or working with simulations, can suck either way (even for stills you can frame grab and take snapshots).

In general, I don't like that they put display output and PCI over the same cable. I would have preferred USB 3 and Thunderbolt to be combined. No going back now though.

Quote:

Originally Posted by hmm

I'd never suggest a 12 core mac pro as a rendering box. That's a waste of money. Really any mac hardware dedicated to such a function is a complete waste of money.

I agree and it's why I think the dual processor models are unnecessary now. The real-time tasks are better served by fast storage and more RAM.

Quote:

Originally Posted by hmm

You've probably seen Maxwell fire (not a big fan of Next Limit). They use the gpu for real time preview work there, yet it has no real effect on final rendering times. There has to be a reason for that

"While other interactive render solutions providing GPU based interactive previews force you to buy expensive graphics cards to achieve the desired results, Maxwell Fire is CPU based, and no special hardware is needed.

While GPU hardware has become more capable of handling some of the calculations a complex render engine requires, they are still not ready to efficiently accommodate all Maxwell Render features."

Being assured that a big enough user-base has the right GPUs and drivers is a problem at the moment but it shouldn't affect modern Macs and over time, all computers will adopt the standards like they have with OpenGL.

In general, I don't like that they put display output and PCI over the same cable. I would have preferred USB 3 and Thunderbolt to be combined. No going back now though.

Bleh, it's typical Apple. Consolidate things and claim it has zero disadvantages simply because the majority of their users will never encounter them. Routing it over USB would have made too much sense. I still think the job title "director of Thunderbolt technology" is hilarious (from the Intel videos).

Quote:

Originally Posted by Marvin

I agree and it's why I think the dual processor models are unnecessary now. The real-time tasks are better served by fast storage and more RAM.

They definitely have a smaller market now. They'll either remain or they won't; partially, I would guess, it'll depend on upcoming software. RAM is commonly misunderstood: 32-bit applications meant that you couldn't take advantage of as much of it. Now silly people can't see a point in anything past 4-8GB of RAM, yet they feel an SSD makes everything so much faster. I wonder why. Fast storage is great too, especially if you can get away without using file compression, which in some applications remains a single-threaded process. Anyway, still waiting on a Mac Pro.

The things in terms of quality control and features that would be required to make me even remotely interested in an imac would probably push the price up considerably. If they still ignore 10 bit displayport, I may actually switch. I thought I was safe buying this given that it showed signs of development at the time, but rather than continuing, that died with Snow Leopard and hasn't shown up in Lion. Things like that irritate me considerably on expensive hardware.

"While other interactive render solutions providing GPU based interactive previews force you to buy expensive graphics cards to achieve the desired results, Maxwell Fire is CPU based, and no special hardware is needed.

While GPU hardware has become more capable of handling some of the calculations a complex render engine requires, they are still not ready to efficiently accommodate all Maxwell Render features."

Being assured that a big enough user-base has the right GPUs and drivers is a problem at the moment but it shouldn't affect modern Macs and over time, all computers will adopt the standards like they have with OpenGL.

Doh! You're right... I swore their marketing kool-aid said otherwise. Maxwell Studio always looked cool, but the idea of letting stills bake for 24 hours to clear the noise put me off. It's not like it's that difficult to tune any of them these days anyway. When Maxwell came out initially it was extremely slow with limited features, but it could do some amazing stuff out of the box, at least from what I saw produced by others.

It was one of the first commercial engines to do unbiased rendering, which uses physical lighting algorithms. It's very slow but accurate. Luxrender has an experimental OpenCL version using the same technique:

As they note on their benchmark site, it's not really a case of CPU vs GPU but rather CPU+GPU, which only OpenCL allows. OpenCL lets you use power you already have to almost double the compute power of your machine or more (depending on what GPU you have).
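The "almost double the compute power or more" figure is just throughput addition: if a kernel can be split across the CPU and GPU in proportion to each device's speed, the combined rate is the sum of the two. A back-of-envelope sketch; the rates below are invented for illustration, not measurements of any real Mac:

```python
# Model of splitting one workload across two devices in parallel, each
# taking a share of the work proportional to its throughput. With a
# proportional split both devices finish at the same time, so the
# combined rate is simply cpu_rate + gpu_rate.

def combined_speedup(cpu_rate, gpu_rate):
    """Speedup over CPU-only when CPU and GPU crunch in parallel."""
    return (cpu_rate + gpu_rate) / cpu_rate

print(combined_speedup(100, 100))  # 2.0 - GPU as fast as the CPU: double
print(combined_speedup(100, 300))  # 4.0 - a faster GPU: "or more"
```

This is the ideal case with no transfer or scheduling overhead, which real OpenCL workloads only approach.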

Unbiased isn't really needed though - if it takes 24 hours now to get clean output, it will take 10 years to get it under an hour. Feature film effects just use scanline or raytracing:

Composition helps with the realism and I'd say it's better to allow an artist to tweak an inaccurate picture quickly than allow a computer to generate an accurate image slowly.
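The slow noise clearing mentioned above is inherent to unbiased Monte Carlo rendering, not an implementation flaw: the estimate is correct on average, but its noise only falls as 1/sqrt(samples), so halving the grain costs four times the render time. A toy sketch of the principle (purely illustrative, not any engine's actual code):

```python
import math
import random

# Toy unbiased estimator: integrate f(x) = x^2 over [0, 1] (true value
# 1/3) by averaging random samples - the same principle an unbiased
# renderer applies per pixel.

def estimate(n, rng):
    """Monte Carlo estimate of the integral from n random samples."""
    return sum(rng.random() ** 2 for _ in range(n)) / n

def rms_error(n, trials=200, seed=1):
    """RMS noise of the estimator at sample count n, over many trials."""
    rng = random.Random(seed)
    true = 1.0 / 3.0
    return math.sqrt(sum((estimate(n, rng) - true) ** 2
                         for _ in range(trials)) / trials)

# 100x the samples cuts the noise by roughly sqrt(100) = 10x.
print(rms_error(50), rms_error(5000))
```

That square-root law is why a still can need to bake for a day to look clean, and why biased tricks (irradiance caching, denoising, compositing) are so attractive in production.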

It all comes down to results. That's what I think will play the biggest role in deciding the future of the Mac Pro. Can an iMac allow Apple's customers to achieve the results they need? If not, the Mac Pro lives on.

I noted that Maxwell made a lot of band-aid fixes to their workflow so that it can spit out multiple versions from a single render, meaning that trying to get a bunch of lighting angles for a product shot or something like that (which seems to be a popular area for it) doesn't have to take days or a massive render farm. The workflow still seems weird and some of the cool features are extremely CPU intensive. They even suggest against the use of things like the physical sky rather than a skydome because of the amount of time required. The skydome is probably fine though. It can't be any less correct than lighting via spherical HDR imagery.

The iMac and other machines really aren't designed to cater to such a market, and it's not just at a level of raw power. Lack of serviceability on a couple of basic components, quality control, and durability aren't really there. It doesn't even save you on price if it means you have to buy all new supporting hardware. In the case of the Z1, HP was aiming at different customers. I'm not comparing pricing here. It's quite expensive. My point is that the iMac was never designed with such a market in mind. Apple really ignores it quite a bit on some things. It's just been little stuff for quite a few years, like hard drives that make your machine unable to sleep (or cause problems when waking), or GPUs that don't support certain features that would be incredibly useful to some people. They take the theory of dangling a shiny object in front of the general population. You don't notice that you can't adjust the iMac height until later. Display issues tend to pop up after the one year mark.

I get the issue that the workload isn't growing exponentially for the majority of their customer base, but it's unlikely that they'll fix some of these issues with the other machines just for that portion of their users.

My initial thought of RenderMan in OS X had to do with the history of it being included in the NeXT OS, from which OS X is derived.

From a geek perspective, it would be neat if RenderMan was somehow a Core component, and an expensive App Store purchase could buy a full seat (your choice: RFM, Studio or Pro Server).

It would definitely be cool to have some Pixar software integrated into the Mac system given the history of the companies. Renderman came to OS X in 2003 and they even had Ed Catmull from Pixar in one of their keynote videos:

The above keynote highlights a dramatic shift in Apple's focus. The full hour keynote was dedicated to the G5, with companies like Luxology (Modo) and Wolfram taking part. They benchmarked their computer on stage against Xeon PCs and went into all the technical details of the CPU (some people these days would claim that isn't Apple's way, but it very clearly was).

There is quite a sad part where the guy from Wolfram talks about working with Steve for 15 years and wonders if in another 15 years, they'd be shuffling around introducing a 'nanotube' Mac.

Jon Rubinstein was there too talking about the G5 system. It's amazing to see how exciting they were able to make the technology and enclosure of the high-end tower but they can't really do this any more. The excitement now comes from getting something so powerful into an ever smaller package.

Even with a small package, it's hard to impress people. You can see at the iPad launch where they showed off the flying game. It looked nowhere near as good as Ace Combat for the PS3 and people could tell. The reaction was just 'meh'. All the computing power of that awesome G5 tower in the palm of their hands in just 9 years and the reaction goes from loud applause to 'meh' because now it's just incremental.

Quote:

They take the theory of dangling a shiny object in front of the general population. You don't notice that you can't adjust the imac height until later. Display issues tend to pop up after the one year mark.

I get the issue that the workload isn't growing exponentially for the majority of their customer base, but it's unlikely that they'll fix some of these issues with the other machines just for that portion of their users.

I agree to an extent. I think the iMac displays should have a 3 year warranty like Dell's and they should improve the serviceability of the memory components, but when you look at the iPad, how often do you hear about display failures or storage failures? No beachballing, rarely dead pixels, no failure to boot, no overheating GPU; you name all the issues on the desktop and they don't exist on the iPad. That's where computers are going.

Serviceability introduces risk. Steve Jobs said it once about code - 'the least expensive, most bug-free line of code is the one you didn't have to write'. We simply won't always have to open computers up just like we don't open up microwaves, ovens, fridges, toasters. Their functions are basic but the parts have evolved to fulfil their purpose without service or improvement and this will be the same for computing.

Computing hasn't reached that pinnacle yet but the idea is to cull the lines that have no growth at the right time because they're going to die anyway.

Think of the fact that the same power has gone from the G5 tower in 2003 to a phone in 2012. By 2020, the same thing will have happened, possibly even more dramatic as manufacturers explore different materials. Imagine a Mac Mini in 2020 - 16x faster CPU, 32x faster GPU, 1TB SSD, 50-100Gbit Thunderbolt, up to 128GB RAM and cheap as dirt so serviceability doesn't matter. Sure, a tower could house 4x the performance and allow you to remove storage but no one would bother buying it when you can just toss a Mini out when it breaks just outside its warranty like you would a broken microwave.
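The multipliers above fall out of simple doubling arithmetic. The doubling periods here are assumptions for the sake of illustration (CPU performance doubling roughly every 2 years, GPU every ~1.6), not measured data:

```python
# Rough Moore's-law style arithmetic behind the 2020 Mac Mini guesses.

def perf_multiple(years, doubling_period):
    """How many times faster after `years`, if performance doubles
    every `doubling_period` years."""
    return 2 ** (years / doubling_period)

print(perf_multiple(8, 2.0))   # 16.0 -> "16x faster CPU" by 2020
print(perf_multiple(8, 1.6))   # ~32  -> "32x faster GPU"
```

Of course this assumes the historical doubling cadence holds, which is exactly the open question as process shrinks get harder.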

Quote:

It would definitely be cool to have some Pixar software integrated into the Mac system given the history of the companies. Renderman came to OS X in 2003 and they even had Ed Catmull from Pixar in one of their keynote videos:

The above keynote highlights a dramatic shift in Apple's focus. The full hour keynote was dedicated to the G5 with companies like Luxology (Modo) and Wolfram taking part. Benchmarking their computer on stage against the Xeon PCs and going into all the technical details of the CPU (some people these days would claim that isn't Apple's way but it very clearly was).

There is quite a sad part where the guy from Wolfram says about working with Steve for 15 years and wondering if in another 15 years, they'd be shuffling around introducing a 'nanotube' Mac.

Jon Rubinstein was there too talking about the G5 system. It's amazing to see how exciting they were able to make the technology and enclosure of the high-end tower but they can't really do this any more. The excitement now comes from getting something so powerful into an ever smaller package.

Even with a small package, it's hard to impress people. You can see at the iPad launch where they showed off the flying game. It looked nowhere near as good as Ace Combat for the PS3 and people could tell. The reaction was just 'meh'. All the computing power of that awesome G5 tower in the palm of their hands in just 9 years and the reaction goes from loud applause to 'meh' because now it's just incremental.

Heh... I actually predicted the iPad and iPod would be big. I figured the same with the iPhone, but I still underestimated how big. Looking at iPods, the non-removable battery did suck. I had several die. Gaming has never been a strong benchmark for Apple. They never put a lot of effort into working with the developers who release gaming engines and stuff like that. It was more a side thought of something that would be nice. I'm surprised they used one in a keynote for reference.

Regarding what they do with the towers, you should consider that we currently have three aging desktop designs. With the Mini they definitely pushed it to the low end. It's below the spec of available laptops and desktops. It's definitely intended as their budget model there. My issue with the iMac and serviceability is that it's not that reliable, and they do still make them today. With the Mac Pro the argument is that it's dated.

If Apple wanted to refactor it in any way, especially given their tendency to make everything as compact as possible, Haswell seems like a more logical time, if Intel lives up to their hype. This would put it somewhere in late 2013 to 2014. If they're releasing an updated unit this year, that probably takes virtually no development as Foxconn can handle it; that gives you a machine to address any pent-up demand, and Ivy Bridge can be used on the same board design like they did with Westmere. You'll see more grumbling, but if they intend to ride the same design for a decade again, that is most likely the logical choice given that none of their other machines are ready to pick up the slack yet.

Quote:

Originally Posted by Marvin

I agree to an extent. I think the iMac displays should have a 3 year warranty like Dell's and they should improve the serviceability of the memory components but when you look at the iPad, how often do you hear about display failures or storage failures? No beachballing, rarely dead pixels, no failure to boot, no overheating GPU, you name all the issue on the desktop and they don't exist on the iPad. That's where computers are going.

Serviceability introduces risk. Steve Jobs said it once about code - 'the least expensive, most bug-free line of code is the one you didn't have to write'. We simply won't always have to open computers up just like we don't open up microwaves, ovens, fridges, toasters. Their functions are basic but the parts have evolved to fulfil their purpose without service or improvement and this will be the same for computing.

Computing hasn't reached that pinnacle yet but the idea is to cull the lines that have no growth at the right time because they're going to die anyway.

Think of the fact that the same power has gone from the G5 tower in 2003 to a phone in 2012. By 2020, the same thing will have happened, possibly even more dramatic as manufacturers explore different materials. Imagine a Mac Mini in 2020 - 16x faster CPU, 32x faster GPU, 1TB SSD, 50-100Gbit Thunderbolt, up to 128GB RAM and cheap as dirt so serviceability doesn't matter. Sure, a tower could house 4x the performance and allow you to remove storage but no one would bother buying it when you can just toss a Mini out when it breaks just outside its warranty like you would a broken microwave.

I'm going to add that server components go down too. Whenever you run something that hard, it puts more stress on the machine. They just don't really build anything designed for that outside of the Mac Pro right now (and again, if I'm running anything that goes for a really long time, I run it at the end of the day and let the machine go to sleep when it finishes).

Three years or more is standard on displays and typical on every other workstation class machine. It made more sense to limit warranty periods when Apple was a small company, given that they probably couldn't afford it back then. The problems you mention are generally much bigger complaints with the laptops and mini than on the enormous towers. Also I don't fully agree on it being like a microwave. Apple likes to maintain higher price points wherever possible, and the top ipads are still quite expensive. The phones appear less expensive as it's worked into your mobile contract.

My issue with minis, laptops, etc. isn't entirely one of performance. Much of it is just the ability to run the machine hard without failure or hiccups being likely. Machines don't necessarily just die. You start to experience problems or they run excessively hot and shut down randomly. It's just a case of wishing to deal with a line that is much less prone to this. When they are designing such machines, they get to make choices. My whole issue with their choices is that aesthetics and simplicity of software seem to trump things like functionality, ergonomics, and reliability. I dislike their typical priorities.

Anyway, the 'X line will be killed' thing comes up all the time. It came up with the Xserve, which was true. It came up with the mini. It came up with the Apple TV. Regarding reliability and the iPads, I still hear of backlight bleed issues. They could accept a slightly brighter black level and just use panel blocking to fix this, but that would also add to manufacturing costs due to the additional levels of testing needed in manufacturing and a method of calculation to adjust this for drift and backlight degradation. I'm also not sure how hard the NAND is getting hammered in the iPad. Laptop SSDs experience completely different usage patterns. I don't think the iPad even caches to disk, which would explain the lack of the spinning wheel. Memory paging was a really common cause there. iOS seems to be more like earlier OS versions where if you didn't have enough memory, you had to close something (people could learn from that today).

Anyway I'm not sure what direction they'll go. Consumer demand influences quite a lot of stuff. Games drove improvements on gpu technology starting from the 90s. If everyone owns a lighter laptop as their sole or primary computer, that will influence both the requirements of future games and the relative scale of game elements due to smaller average display sizes. That was just an example.

The Mac Pro is becoming an embarrassment - 500+ days without any refresh/redesign. When does Apple throw in the towel and pull it from the line up? And more importantly, how does Apple upsell its customers from the high-end iMac - or does it even bother? This is quite an important strategic question - I wouldn't give up on the high end. I like having a couple of good looking screens on my desk, a good slug of RAM, and some grunt in the CPU/GPU area. And I'd rather give my money to Apple because I like their stuff.

*sigh* this argument comes up all the time. The lack of updated cpus and stuff isn't really the issue. We had one weak gpu generation that only made it to about half of the PC side at the workstation level. Those same machines are still using 5500/5600 series cpus too. Go compare. Right now the main sticky issue I could see is that the newest gpus aren't really shipping in volume.

Quote:

The MacPro is becoming an embarrassment - 500+ days without any refresh/redesign. When does Apple throw in the towel and pull it from the line up? And more importantly, how does Apple upsell its customers from the high end iMac - or does it even bother? This is quite an important strategic question - I wouldn't give up on the high end. I like having a couple of good looking screens on my desk, a good slug of RAM, and some grunt in the CPU/GPU area. And I'd rather give my money to Apple because I like their stuff.

Aye.

I guess the next Mac Pro update will tell us a great deal. I think a Pro redesign is obviously overdue, but if a company with $100 billion cash in the bank can't prioritise a redesign, it won't augur well for the Pro's future.

We're still waiting on ANY update to the Mac.

That, 3 months in, we got an iOS product first in the iPad tells us a great deal about where Apple's priorities are focused.

Lemon Bon Bon.

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

Quote:

*sigh* this argument comes up all the time. The lack of updated cpus and stuff isn't really the issue. We had one weak gpu generation that only made it to about half of the PC side at the workstation level. Those same machines are still using 5500/5600 series cpus too. Go compare. Right now the main sticky issue I could see is that the newest gpus aren't really shipping in volume.

Yes. But high prices and the fact that none of the other components get updated in the meantime over nearly 2 years can't help sales on a product that, politely put, is vastly marked up.

While competitive PC towers have prices that constantly churn downwards.

Who wants to pay outrageous prices for out of date kit? Who?

Marv's comparison of the Apple display vs the Dell 27 inch showed the markup: £300+ in difference for the same thing?

Lemon Bon Bon.

Well they aren't identical. They use the same panel. The way they do their measurements and set the levels, backlight design, secondary uniformity corrections, dithering method, internal processing, and overall supportive electronics make a significant difference. We're at a point of basically generic hardware with displays, so the difference between high end and low end isn't quite what it was, but it's still there.
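One of those processing differences, dithering, is easy to sketch: error diffusion quantizes each pixel to the panel's bit depth but carries the rounding error into the next pixel, so the average tone is preserved and banding breaks up into fine noise. A 1-D toy, not any vendor's actual pipeline:

```python
# Quantize a smooth gradient to a crude 2-bit (4-level) panel, with and
# without error diffusion, then blur both the way the eye blurs
# neighbouring pixels. The dithered version tracks the original ramp;
# the naive version collapses into four flat bands.

def quantize(v, levels):
    """Round v (0..1) to the nearest of `levels` evenly spaced values."""
    step = 1.0 / (levels - 1)
    return round(v / step) * step

def error_diffuse(row, levels):
    """1-D error diffusion: push each pixel's rounding error onto the
    next pixel so the local average tone is preserved."""
    out, err = [], 0.0
    for v in row:
        adjusted = v + err
        q = quantize(min(1.0, max(0.0, adjusted)), levels)
        err = adjusted - q
        out.append(q)
    return out

def smooth(xs, half=4):
    """Moving average, a stand-in for the eye's spatial blur."""
    return [sum(xs[i - half:i + half + 1]) / (2 * half + 1)
            for i in range(half, len(xs) - half)]

def mae(a, b):
    """Mean absolute error between two equal-length sequences."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

ramp = [i / 255 for i in range(256)]    # smooth 8-bit gradient
hard = [quantize(v, 4) for v in ramp]   # naive 2-bit: visible bands
soft = error_diffuse(ramp, 4)           # dithered 2-bit: noise, no bands
print(mae(smooth(hard), ramp[4:252]), mae(smooth(soft), ramp[4:252]))
```

Real panels use 2-D error diffusion or temporal (frame-rate-control) dithering, but the trade is the same: tonal accuracy bought with high-frequency noise.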

Quote:

I guess the next Mac Pro update will tell us a great deal. I think a Pro redesign is obviously over due but if a 100 billion cash in bank company can't prioritise a re-design it won't augur well for the Pro's future.

The whole desktop line up at Apple is an embarrassment. In football terms they need to punt.

All that cash does imply an issue with priorities, as they should have no trouble hiring entire staffs just to design each machine. Of course most of that money isn't in the US, so maybe it isn't too easy to get to. Still, it wouldn't hurt them to spend $100 million a year on Mac engineering. Mind you, $100 million ought to get you a state of the art Mac with a whole bunch of custom electronics.

Quote:

We're still waiting on ANY update to the Mac.

True but don't blame Apple.

Quote:

That, 3 months in, we got an iOS product first in the iPad tells us a great deal about where Apple's priorities are focused.

Lemon Bon Bon.

Actually the iPad debut tells us nothing. If nothing else it looks like they fitted it into an empty gap in the calendar. Really, if Intel is humping the pooch instead of delivering new chips, what can Apple do? Even the latest news on IB is a little screwed up; it isn't even clear if we will have chips suitable for the Air before July.

I get annoyed too at times, but people can be weird. There are little updates I would like to see, and I'd like to see them do a lot better with display drivers. It really makes no sense to launch an update without both cpus and gpus in place though especially considering how soon we should see new AMD cards and/or Kepler become available. Also regarding Ivy Bridge keep in mind that Intel isn't the only one who was having trouble keeping on schedule with 22nm fabrication. You actually mentioned that yourself last year on this same topic.

The majority of them are just 40% higher with the extra two cores. That would be an OK update over 1 year but not after nearly 2 years. I think they at least need to get the entry model up to 6 cores.

Ivy Bridge chips for the iMac and 15" MBP are due late April/May.
Ivy Bridge chips for the MBA, 13" MBP and Mini are due in June.

There's a benchmark of Ivy Bridge CPU + kepler GPU that could make its way to the 15"MBP:

Quote:

I get annoyed too at times, but people can be weird. There are little updates I would like to see, and I'd like to see them do a lot better with display drivers.

Apple really doesn't have any excuses anymore. I mean this totally; there is no reason why they can't hire the people they need to get drivers, OpenGL and other features up to snuff. When Linux has better drivers and support you know something is wrong.

Quote:

It really makes no sense to launch an update without both cpus and gpus in place though especially considering how soon we should see new AMD cards and/or Kepler become available.

The AMD GPUs are ready to go, so I don't think an update is being held up there. Well, other than the possibility that Apple will integrate the GPU on the motherboard.

Quote:

Also regarding Ivy Bridge keep in mind that Intel isn't the only one who was having trouble keeping on schedule with 22nm fabrication. You actually mentioned that yourself last year on this same topic.

Yes, I know, and that is why I object to the blame-Apple mentality. Like it or not, Apple can't ship new stuff if the processor isn't there to ship in the first place.

In any event, Apple's problem with the Pro is targeting too small a market, considering it is Apple's only viable and configurable desktop. I see this as the primary driver for a refactored Pro.

Quote:

but the majority of them are just 40% higher with the extra 2 cores. That would be an ok update over 1 year but not after nearly 2 years. I think they at least need to get the entry model up to 6-cores.

I would want to see production systems from Apple before getting too excited one way or the other. I've seen numbers all over the place, some indicating a 2x improvement in performance.

Quote:

Ivy Bridge chips for the iMac and 15" MBP are due late April/May.

I can't wait and frankly I'm not even in the market. Today my intention is to hold off another year but hey you never know.

Quote:

Ivy Bridge chips for the MBA, 13" MBP and Mini are due in June.

The interesting thing here is that they will likely be competing directly with Trinity from AMD. If that chip lives up to its billing, it would be a better choice for the Air and Mini. However, I was under the impression that the Mini already used 35 watt processors.

Quote:

There's a benchmark of Ivy Bridge CPU + kepler GPU that could make its way to the 15"MBP:

Frankly I hope they stay away from Nvidia, mainly because AMD has changed for the better with respect to drivers and open source. Note I said better; their drivers have a ways to go but are far better than past efforts.