Originally Posted by hmm
I don't think cloud computing will completely remove the need for optimization to control costs and CPU time requirements. You still pay for the time there. It provides smaller shops with some amount of scalability, which is very cool. It doesn't actually displace the need for workstation hardware wherever things must be addressed in real time, especially in terms of GPU hardware. GPUs get stressed quite a bit. It's still common to see low-res proxies used to set up a scene or animate even with the use of powerful GPUs.

There's also the matter of what computers sit in the cloud. Naturally, they'll try to maximize the space with blade servers and multiple GPUs but 'smaller' providers could buy 100 12-core Cubes to get a 1200-core farm for under $0.5m and charge it out by the hour. One here charges $0.7/core/hour:

That was used for a few films: Ant Bully, Superman, Die Hard 4, Iron Man, Night at the Museum, Spiderman 3.

If they had it running at full load, 24/7, they could make 1200 x $0.7 x 365 x 24 = ~$7m/year minus running costs. If Apple made it easy to configure, even better, e.g. install one software package on a control node, plug as many machines in as you like, enable "compute sharing" on each, and it can use CPU and GPU seamlessly. They could even have an iPad app and cable to configure a node. Take it out of the box, plug it into the network, plug an iPad in via USB, turn on the node and set it up. From that point, the control node would deal with the software.
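The revenue arithmetic above can be sketched out like this (the $0.7/core/hour rate and 1200-core count are the post's assumptions, not real pricing, and real utilization would never be 100%):

```python
# Back-of-envelope farm revenue from the figures above -- the rate and
# core count are assumptions from the post, not real pricing.
cores = 1200
rate_per_core_hour = 0.70          # $/core/hour
hours_per_year = 365 * 24          # 8760, assuming 24/7 full load

gross = cores * rate_per_core_hour * hours_per_year
print(f"gross: ${gross:,.0f}/year before running costs")
```

That comes to about $7.36m gross, which is where the ~$7m/year figure comes from; any idle time scales it down linearly.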

It can be more affordable to have your own workstation but it depends. If you render at even 5 minutes per HD frame and you aim for a 20 second TV commercial, that's 20 seconds x 30 fps x 5 minutes = 50 hours straight - it doesn't leave much room for mistakes or tight deadlines. You'd ideally use both solutions but you can do proxies and the actual work on any machine.
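To make the 50-hour figure concrete, here's the same estimate as a tiny calculation (5 min/frame and 30 fps are the assumptions from the paragraph above):

```python
# Single-machine render time for a 20-second spot at the assumed
# 5 minutes per HD frame and 30 fps.
seconds = 20
fps = 30
minutes_per_frame = 5

frames = seconds * fps                         # 600 frames
total_hours = frames * minutes_per_frame / 60  # 50.0 hours
print(f"{frames} frames at {minutes_per_frame} min each = {total_hours:g} hours")
```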

I don't personally buy into the cube theory there, but you know that. I'm not sure how many commercials would have the render budget to book 1200 nodes for 50 hours. That's only a fraction of their total production cost. By the way, it should be 24 fps, and you wouldn't run such a job without tests. There would be a number of checks prior to the final run, at which point they still have the ability to tweak things at the compositing level.

It's not a theory, just an option for them. It would require a lot more engineering effort but the way I see it is their engineers are probably quite bored these days and could use a challenge. They already have tons of money, why not do something interesting? There's not much of a challenge in dropping an Intel motherboard and some off-the-shelf parts into a big box they've already engineered. Maybe that's how they like it though.

If they just went the drop-in upgrade route, you'd end up with almost the same machine but with dual 8-core Ivy Bridge (they won't use the 10-core or 12-core options at this price point) for $6200, USB 3 support, PCI 3, SATA 6G, Radeon 8970 or whatever Nvidia calls their GTX 680 next year. It's tried and tested I suppose but it's a bit lame.

Quote:

Originally Posted by hmm
I'm not sure how many commercials would have the render budget to book 1200 nodes for 50 hours.

They wouldn't do that most likely. If one 12-core machine can do a frame in 15 minutes, they only need 480-600 frames done depending on what framerate they use. So they'd book the 100 machines (1200 cores) for 1.5 hours, which costs $1260.

What's the alternative? If you buy 20 machines of your own, you have to spend $80,000-120,000 and it'll take 7.5 hours to do each render.
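A sketch of the comparison in the last two paragraphs, using the same assumed numbers (15 min/frame on a 12-core node, 600 frames at 30 fps, $0.7/core/hour - all of these are the post's estimates):

```python
# Cloud booking vs. owned machines for one 600-frame job. Frames render
# independently, so wall-clock time divides by the number of nodes.
frames = 600
minutes_per_frame = 15      # per frame on one 12-core machine
cores_per_machine = 12
rate_per_core_hour = 0.70   # cloud rate assumed in the post

def wall_clock_hours(machines):
    return (frames / machines) * minutes_per_frame / 60

cloud_nodes = 100                                   # 1200 cores total
cloud_hours = wall_clock_hours(cloud_nodes)         # 1.5 h
cloud_cost = cloud_nodes * cores_per_machine * rate_per_core_hour * cloud_hours

owned_nodes = 20
owned_hours = wall_clock_hours(owned_nodes)         # 7.5 h per render

print(f"cloud: {cloud_hours:g} h, ${cloud_cost:.0f} per job")
print(f"owned farm of 20: {owned_hours:g} h per render")
```

The $1260 and 7.5-hour figures in the posts above fall straight out of this; the upfront $80,000-120,000 for the owned farm is the part the per-job number doesn't show.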

Quote:

Originally Posted by hmm
you wouldn't run such a job without tests.

Sure but tests can be done at lower quality or in shorter chunks for which you could use any suitably powerful machine.

Quote:

It's not a theory, just an option for them. It would require a lot more engineering effort but the way I see it is their engineers are probably quite bored these days and could use a challenge. They already have tons of money, why not do something interesting? There's not much of a challenge in dropping an Intel motherboard and some off-the-shelf parts into a big box they've already engineered. Maybe that's how they like it though.

The bolded part made me laugh. It's extremely funny, as there may be some truth to it. Workstation and server products are often very conservative. They need to show up and work with minimal downtime, and that may make the design somewhat boring for their respective teams. I mentioned how Google is building their own servers. I think Facebook was doing the same thing. The OEMs may not add enough value to the equation for them.

Quote:

If they just went the drop-in upgrade route, you'd end up with almost the same machine but with dual 8-core Ivy Bridge (they won't use the 10-core or 12-core options at this price point) for $6200, USB 3 support, PCI 3, SATA 6G, Radeon 8970 or whatever Nvidia calls their GTX 680 next year. It's tried and tested I suppose but it's a bit lame.
They wouldn't do that most likely. If one 12-core machine can do a frame in 15 minutes, they only need 480-600 frames done depending on what framerate they use. So they'd book the 100 machines (1200 cores) for 1.5 hours, which costs $1260.

I was under the impression we were still talking about server hardware, so I went with that. I noticed you precisely quoted a Mac Pro with an NVidia gaming card. In terms of workstations, that is likely. I think I've mentioned this, but the ideal machine for me would be either a quad i7 or, more likely, the Sandy Bridge E5 version of the W3680 with a mid-range Quadro card and maxed-out RAM. I just max the RAM because it's so cheap now that I don't bother splitting hairs trying to figure out the optimal amount. 16GB would be slightly low in a new machine as it's kind of low for me today. Might as well go to 32. This has little to do with Apple's lineup. I'm just saying what I'd pick if I was building one for my own use from the ground up.

I've been following cloud solutions as well. They have the potential to provide a lot of extra leverage to freelancers, and if rendering time is cheaper than trying to find workarounds to brute-force GI methods or increased sampling, it can make sense. This wouldn't be the case if you're working with a fixed amount of render power.

I've worked on a pretty wide range of machines, and I can tell you that there's still a good reason to have as much GPU power as possible. It doesn't mean everyone can justify $2000+ on a GPU, but many of them would likely derive some amount of benefit from it. This is one of those things that is still quite unique to the desktop. I don't expect it to remain that way. I'm just saying it is what it is for now.

Quote:

What's the alternative? If you buy 20 machines of your own, you have to spend $80,000-120,000 and it'll take 7.5 hours to do each render.
Sure but tests can be done at lower quality or in shorter chunks for which you could use any suitably powerful machine.

Yeah, I wasn't suggesting that. Individuals and smaller companies generally can't afford that. They have to know it will pay for itself and have cash or financing lined up. The concepts of cloud computing and slim clients aren't really new at all. It's just that they're being leveraged for things that weren't practical in the past.

I would suggest that 15 minutes per frame on a 12-core machine would be pretty damn fast. I know that's just arguing details, but it's not that atypical to render a little larger than the final resolution to allow for cropping and tweaks in compositing. It's also a common way of dealing with noise/anti-aliasing, as heavy use of something like Mitchell-Netravali filtering can be way too time consuming to resolve without flicker. If it's a print job, it's less of an issue, but that render could be running for many hours.

This is a fun discussion. I don't see these solutions as much in the way of competition for the Mac Pro outside of freelancers and small shops, but I don't know what the sales distribution is like on those machines. It's hard for me to offer even a reasonably good analogy when I have no idea who will order the most Mac Pros going forward. My earlier comments on HP were more about how I think workstation-class machines are a much bigger deal to them, in spite of low volume.

I got a little off track there. If you note one of my prior links, scalable render power would mean that a greater range of shops could handle projects like that involving instanced hero objects and huge amounts of geometry. You need proxies either way, but just having the ability to use geometry rather than mapped textures can help alleviate that CG look that I can often pick out at a glance.

Marvin & hmm, it is interesting that these discussions always center around rendering and other video processing uses of the Mac Pro. Many of us don't care, and frankly that is not what defines a Pro computer for us. As a workstation it is easy to end up too focused on one field and end up with a very low volume machine. Low volume in the context of Apple anyways.

To prop up volume, which I see as a serious Mac Pro issue, Apple needs to consider a wider array of users. Video processing may still be important to the Mac Pro market, but moving into the future I don't see it being well supported on the massive machines we have today. This is why I see a much smaller Mac Pro in our future. If you need a machine to build a render farm on, it makes sense to me to make each node compact, high performance and cost effective. Such a node is also highly salable to those of us not in need of the big box but who do want big processor performance.

Thus I wouldn't be surprised to find the new Mac Pro looking something like Marvin's drawing/rendering. Most likely it would be a bit bigger, but the general idea flies with me. Done right, such a box could end up being sold in groups of two fairly close to the price of a single dual-processor Mac Pro of today's design. I say fairly close because I'm still expecting each node to have a discrete GPU, thus increasing cost a bit. That effectively means each node would have to sell in the $1500 to $1800 range. That might be tough to meet depending upon the hardware selected, but I see a need for that price range if a shrunken and clusterable Mac Pro were ever to be successful. In other words such an arrangement can't end up being significantly more expensive than the traditional big box approach.

The goal of a redesigned Mac Pro has to be solidifying and increasing sales. The current approach is frankly a dead end and does more to shrink sales than to drive them. The current machines are just too expensive to be easily justified.

Quote:

Marvin & hmm, it is interesting that these discussions always center around rendering and other video processing uses of the Mac Pro. Many of us don't care and frankly that is not what defines a Pro computer for us. As a workstation it is easy to end up too focused on one field and end up with a very low volume machine. Low volume in the context of Apple anyways.

They're known Mac markets, and it's easy for me to comment on what I know. It seems like the iMac is what they hope will really stick at the moment. To be a healthy line, they would need to attract more people to OS X or attract more frequent purchases. We're dependent on Intel to a lesser degree there, as we don't know how long it will be before they're back on a predictable cycle. Marvin suggested they should skip Ivy. I don't see that happening at all, especially as that would leave their EX/E7 processors on Westmere. There is no Sandy EX just like there was no Nehalem EX. I don't think they'd skip a tick cycle in server hardware.

Quote:

I would suggest that 15 minutes per frame on a 12 core machine would be pretty damn fast.

It's going to vary a lot depending on what's in the shot of course, but that was just a rough time to show it can be far more cost-effective on the highest-end jobs to move to the cloud. There are examples that show similar times:

http://www.chaosgroup.com/en/2/envyspot.html?i=15
"Although we got a powerful render farm here at Taylor James we aim to keep our render times as low as possible. With the flexibility of V-Ray you can easily tweak your render times. Normally we aim for 30 min per frame for an HD frame on a 16-core machine."

Quote:

I don't see these solutions as much in the way of competition for the Mac Pro outside of freelancers and small shops

http://www.technologyreview.com/news/425698/hollywoods-cloud/
"One such firm is Afterglow Studios, based in Minneapolis. Its owner, Luke Ployhar, is currently finishing Space Junk 3D, a 40-minute stereoscopic film about the 6,000 tons of garbage circling the planet. It’s a big project for a small firm, which has required more than 16,000 hours of computing time to animate, or render, the scenes of orbiting debris. Ployhar estimates that if he’d bought computers to do the job, he would have spent at least $50,000 on equipment. “It wouldn’t have been economical for me to buy all these machines.”

Last year, about 5 percent of DreamWorks Animation’s rendering was done in the cloud, but the company plans to increase that to 50 percent by the end of 2012, Derek Chan, head of digital operations at DreamWorks Animation says, rather than spend many millions of dollars to expand its existing data center."

Massive companies have their own farms or pay for the cloud. Freelancers and smaller shops can use a cloud solution too. There's a space somewhere for the large personal workstation but it's getting smaller.

Quote:

Originally Posted by Wizard69
it is interesting that these discussions always center around rendering and other video processing uses of the Mac Pro. Many of us don't care and frankly that is not what defines a Pro computer for us. As a workstation it is easy to end up too focused on one field and end up with a very low volume machine.

It's just one of those fields where there's a constant demand for more power, so it fits quite well with the argument that powerful local machines need to exist. There are other fields like music production, CAD, and medical/scientific fields that use computation and so on. There are areas where you might run many small compute processes over and over, and a personal workstation would work out more cost-effective or just be more convenient to use. That's where better performance-per-dollar and a smaller machine would be beneficial. They could of course get by with an iMac or MBP but a 6-core+ and a high-end desktop GPU can offer a decent speed boost for a little extra money.

I don't think $2500 is a bad starting price but it should offer a 6-core so there's an immediate reason to buy one over an iMac. With a single CPU, it tops out at $4000-4500 instead of $6200 and if you need more power, you get another box where you get a second CPU and GPU with more RAM and storage. A spare 128Gbps half-length PCI 3 slot would let you do everything else.

Quote:

It's going to vary a lot depending on what's in the shot of course, but that was just a rough time to show it can be far more cost-effective on the highest-end jobs to move to the cloud. There are examples that show similar times:

http://www.chaosgroup.com/en/2/envyspot.html?i=15
"Although we got a powerful render farm here at Taylor James we aim to keep our render times as low as possible. With the flexibility of V-Ray you can easily tweak your render times. Normally we aim for 30 min per frame for an HD frame on a 16-core machine."

http://vimeo.com/37747355
5-35 minutes per frame, done on a single workstation

http://www.technologyreview.com/news/425698/hollywoods-cloud/
"One such firm is Afterglow Studios, based in Minneapolis. Its owner, Luke Ployhar, is currently finishing Space Junk 3D, a 40-minute stereoscopic film about the 6,000 tons of garbage circling the planet. It’s a big project for a small firm, which has required more than 16,000 hours of computing time to animate, or render, the scenes of orbiting debris. Ployhar estimates that if he’d bought computers to do the job, he would have spent at least $50,000 on equipment. “It wouldn’t have been economical for me to buy all these machines.”

Last year, about 5 percent of DreamWorks Animation’s rendering was done in the cloud, but the company plans to increase that to 50 percent by the end of 2012, Derek Chan, head of digital operations at DreamWorks Animation says, rather than spend many millions of dollars to expand its existing data center."

This is why I enjoy these discussions. I have never seen anyone who could come up with such cool links. I wouldn't have found the Ben-Collier Marsh video if you hadn't linked it. That's an amazing piece. The shaders and textures appear to have been kept somewhat simple. This doesn't mean they didn't optimize them. I'm saying it doesn't look like they used a massive shader stack or heavy texture mapping. It appears that they worked out the reflective behavior at a shader level but relied solely on geometry and lighting to produce the right highlights.

It explains the comment about it requiring a mind-blowing amount of geo, as creating that many interlocking parts along with bevels and thickness, each carefully labeled, while ensuring a minimum of tangency across all of those sweeping surfaces, is a lot of work. If you aren't careful, you end up with a lot of topology problems. It's not so much that it's difficult to build some of those individual pieces. Cogs are extremely easy to build, although it probably took some studying to ensure that they'd work together well mechanically. The difficult portions would be setting it up so that everything is in appropriate scale and rigged for animation in a way that will transfer and evaluate correctly, working out some of those self-illuminated shaders and lighting for highly reflective dark objects, the concepting where someone had to design such a machine even if it wouldn't be physically constructed, and the camera movements. I'm not as big on that kind of camera movement. Some of them wouldn't be possible with a physical model without deconstructing portions of it for different shots, unless I missed something. Anyway, I want to know how you find some of this stuff. That is just way too cool.

Back somewhat on topic, 10-15 minutes per frame is likely because, in spite of the huge polygon count, the shader and texture design is extremely minimal.
It may be well optimized, but they didn't use some of the things that really cause render time to skyrocket. I don't see a lot of heavy reflection blur. They seem to have pulled it off through the reflectivity model implemented in their shader. Not having to worry about glossy sampling helps quite a bit. The materials are very smooth. They didn't attempt to break that up with mapped textures in most places that I can see. It looks like they only beveled things where it was absolutely necessary to minimize extra edges. That is smart.

I'm just saying that time likely came from being extremely efficient in setup. If they wanted the textures to look more like real objects through some amount of wear, those times could have multiplied. Personally I like them the way they are. If they were film props that went alongside actors, they'd likely have some amount of applied wear so as not to stand out. In spite of the simple shader claim in terms of calculation time, I can't imagine how much time it must have taken testing and fine-tuning them. There must have been some post work involved to get the reflections that perfect. Speaking of that, there's one place where a renderer can bite you. Some of them are far more approximate than others when sampling indirect reflections, so if you need to output raw reflection + reflection filter passes, it can bite you in the ass trying to maintain a clean look. Okay, I've gone on with nerd talk enough for one day.

Quote:

Massive companies have their own farms or pay for the cloud. Freelancers and smaller shops can use a cloud solution too. There's a space somewhere for the large personal workstation but it's getting smaller.

My impression was that they're not using cloud cycles for things involving setup. I was trying to maintain somewhat of a distinction between what is viable as a cloud service and what would remain on local machine time for now. I've generally been of the opinion that local machines are for situations where real-time or near-real-time feedback is required, and they double as render farms for freelance individuals.

Quote:

I don't think $2500 is a bad starting price but it should offer a 6-core so there's an immediate reason to buy one over an iMac. With a single CPU, it tops out at $4000-4500 instead of $6200 and if you need more power, you get another box where you get a second CPU and GPU with more RAM and storage. A spare 128Gbps half-length PCI 3 slot would let you do everything else.

I've said that for years. If you look at the 1,1 through 3,1, the base option was a significant step up from the iMac compared to what it is today. It's not the only reason to buy such a machine. It offers greater flexibility. I don't think we're about to hit a slim client only thing. There are enough markets left for performance machines. It's likely that you'll see a certain amount of consolidation, but I don't see them disappearing within the next few years. This means very little in terms of Apple's lineup. As I've said, they tend to chase high-growth markets due to their size as a company.

Originally Posted by hmm
10-15 minutes per frame is likely because in spite of the huge polygon count, the shader and texture design is extremely minimal.

The specific time isn't all that important because obviously the longer it is, the less feasible it becomes to do on a single workstation anyway. One of those examples took 35 days to render. Double the time and you get 70 days. Do you want to have your workstation sitting practically unusable for over 2 months, maxed out, just to find out there's an artifact at a higher resolution or a setting that's been done wrong?
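As a rough sanity check on how quickly single-workstation renders become impractical, here's a toy calculation (the frame count is a made-up example, not a figure from any of the linked projects) showing days of wall-clock time scaling linearly with per-frame minutes:

```python
# Days a single workstation is tied up, for a hypothetical 200 seconds
# of footage at 24 fps (4800 frames -- illustrative numbers only).
frames = 200 * 24
for minutes_per_frame in (5, 15, 30):
    days = frames * minutes_per_frame / 60 / 24
    print(f"{minutes_per_frame:>2} min/frame -> {days:.1f} days")
```

Doubling the per-frame time doubles the total, which is why a 35-day render turning into 70 days isn't a pathological case.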

Like I say though, there have to be computers somewhere, so if Apple wants them to be Macs, they need to have a model that works for the usage scenarios. I think the Mac Pro is currently too big and expensive for both remote/parallel use and personal use.

For personal use, it should be good value (good performance per dollar) and convenient - I'm sure a few people have lifted a 40lb workstation and it's not that convenient. For remote/parallel use, it should also be good performance per dollar but also efficient space-wise and power-wise, as well as in terms of software configuration. A smaller dual-processor workstation could be best; it's up to Apple's engineers to find the right compromise.

Quote:

My impression was that they're not using cloud cycles for things involving setup. I was trying to maintain somewhat of a distinction between what is viable as a cloud service and what would remain on local machine time for now. I've generally been of the opinion that local machines are for situations where real-time or near-real-time feedback is required, and they double as render farms for freelance individuals.

That's right but the real-time stuff can be done on almost any machine now. Once you get real-time feedback, you don't need better than that because we live in real-time.

Quote:

I don't think we're about to hit a slim client only thing. There are enough markets left for performance machines.

Again though you're creating an artificially large gap between the lower-end and higher-end models. A MBP and iMac are hardly thin clients.

A 27" iMac with a Fusion drive, 32GB RAM and 2GB GTX 680M is a workstation. A 15" rMBP with 256GB SSD and a 1GB 650M is a workstation.

In your mind, you have an association between the word workstation and the tower form factor but the definition has expanded over the years. There's a resistance to this just now because it's still in the early phases. We only got quad-core laptops and iMacs around 2010/2011.

Quote:

Again though you're creating an artificially large gap between the lower-end and higher-end models. A MBP and iMac are hardly thin clients.

A 27" iMac with a Fusion drive, 32GB RAM and 2GB GTX 680M is a workstation. A 15" rMBP with 256GB SSD and a 1GB 650M is a workstation.

In your mind, you have an association between the word workstation and the tower form factor but the definition has expanded over the years. There's a resistance to this just now because it's still in the early phases. We only got quad-core laptops and iMacs around 2010/2011.

You misinterpret some of my words. By the way, the reason for those bloopers is that you don't animate on the real thing. It just means the rig deformations didn't transfer as expected, but you could bake those out prior to ever rendering footage. It's possible those were test renders at lower than production quality settings.

I'm not hung up on definitions. I've worked on my notebook plenty of times. It does choke on far less than a desktop. The biggest choke point is the GPU. It's application dependent, but on anything remotely heavy, I'd be stuck in wireframe or bounding box views. The slim client comment was based on the idea of pushing everything out to the cloud. I said I don't think we're there yet. I don't think Apple's options beneath the Mac Pro are ideal for this. They're workable, but it's unlikely that this is a primary focus for Apple. It's just that if you're going to use Macs, you pick out of what is available. This doesn't mean I'll never use an iMac. It just means probably not today. I kind of wonder if by the time I'm interested, they'll be the next item on the chopping block.

Originally Posted by hmm
I've worked on my notebook plenty of times. It does choke on far less than a desktop.

The biggest choke point is the gpu.

You'd have to be specific about which laptop and which desktop though. Obviously a laptop with an HD3000 isn't going to do real-time graphics like a desktop with a GTX 680 or Quadro. However, an iMac with the mobile GTX 680M would, and the 650M in the MBP would probably hold up ok too.

Quote:

Originally Posted by hmm
The slim client comment was based on the idea of pushing everything out to the cloud.

You'd only push the most intensive things that are not very feasible on any machine. The remaining tasks should be doable on any decent performance computer.

Quote:

Originally Posted by hmm
I kind of wonder if by the time I'm interested, they'll be the next item on the chopping block.

It depends on how the market goes I guess. Without a Mini, there would be no OS X Server. Without the iMac, the prices would probably go up on the 27" Cinema displays. It could happen eventually but the iMac is still a strong seller.

How does the current iMac compare performance-wise to what you use now?

Quote:

How does the current iMac compare performance-wise to what you use now?

Bad comparison, as I'm quite constrained right now. You know how I mentioned the power available determining the need for workarounds? I am very familiar with many of those. It's too laggy viewing a few million OpenGL polygons in textured mode, so I do rely considerably on low-res versions for positioning, wireframe, etc. I've been debating how to resolve that. There are workloads that go well beyond my own, including some of your links. My point regarding GPUs like the 650M was that sometimes desktop equivalents can be several times the speed within a given application without the cost going into the thousands. For certain use cases memory is also a precious resource. We don't have heterogeneous computing at this point, so it's still a factor. I suspect 1GB of VRAM is part of the reason both of my machines regularly choke in the aforementioned use cases. As I said, it's possible to work around that, but who wants to if a decent GPU is cost-effective?

Quote:

It depends on how the market goes I guess. Without a Mini, there would be no OS X Server. Without the iMac, the prices would probably go up on the 27" Cinema displays. It could happen eventually but the iMac is still a strong seller.

My point was that if the very generic version of something like a display hits the point where I no longer see a difference, that market may no longer interest Apple in terms of growth. I wasn't being completely serious though, and it implied an extremely ambiguous timeline. I've worked on iMacs before. I've used my notebook to accomplish work before. I'm not commenting on anything I haven't tried.

Originally Posted by Marvin
It depends on how the market goes I guess. Without a Mini, there would be no OS X Server. Without the iMac, the prices would probably go up on the 27" Cinema displays. It could happen eventually but the iMac is still a strong seller.

Marvin,

While it is true that the market determines everything in the end, I don't see the iMac (despite the limitations that make a lot of us uncomfortable) going away any time soon.

Apple is right about one thing. There are a lot of people who not only don't ever personally go into a computer, but don't have it upgraded by someone else either. I am occasionally reminded that there are a lot of people who don't even really know just which computer (Mac or PC) they have, hard as that is for us to imagine. Their response is usually something or other to the effect that it's a (fill in the blank with the name brand), if they even remember that much. At least with Macs they will usually answer that it's an Apple or a Mac, and sometimes even the model (iMac or MacBook Pro), but seldom anything detailed about the hardware.

This is what we are up against when trying to convince Apple that they need to pay more attention to us. (Sigh)

Originally Posted by hmm
Bad comparison as I'm quite constrained right now.

Would you call what you are using now a workstation?

Quote:

Originally Posted by hmm
It's too laggy viewing a few million OpenGL polygons in textured mode, so I do rely considerably on low res versions for positioning, wireframe, etc. I've been debating how to resolve that.

"This demo proves that it is possible to explore huge scenes with lot's of textures on a consumer computer with a single GPU thanks to the efficient virtual memory manager.
New York downtown scene represented by 25 million polygons and 8 gigabyte of textures (more than 1000 high-resolution textures) and Boeing 777 scene represented by 360 million polygons (or 250 million polygons in some shots).
Laptop specs include: 200 GB SSD storage, 16 GB RAM, NVIDIA GeForce GTX 485M graphics card with 2 GB of memory (pricing for this laptop was 2000$ in 2011)."

"Those things are good and useful, but what I most want to see is direct surfacing of the memory. It’s all memory there at some point, and the worst thing that kills Rage on the PC is texture updates. Where on the consoles we just say “we are going to update this one pixel here,” we just store it there as a pointer. On the PC it has to go through the massive texture update routine, and it takes tens of thousands of times [longer] if you just want to update one little piece.

You start to amortize that overhead when you start to update larger blocks of textures, and AMD actually went and implemented a multi-texture update specifically for id Tech 5 so you can bash up and eliminate some of the overhead by saying “I need to update these 50 small things here,” but still it’s very inefficient. So I’m hoping that as we look forward, especially with Intel integrated graphics [where] it is the main memory, there is no reason we shouldn't be looking at that. With AMD and NVIDIA there's still issues of different memory banking arrangements and complicated things that they hide in their drivers, but we are moving towards integrated memory on a lot of things."

" Intel’s integrated graphics actually has impressed Carmack quite a bit and the shared memory address space could potentially fix much of this issue. AMD’s Fusion architecture, seen in the Llano APU and upcoming Trinity design, would also fit into the same mold here. He calls it “almost a forgone conclusion” that eventually this type of architecture is going to be the dominant force."

Something like the GTX 680M would still hold up pretty well with 2GB of memory but shared memory is needed for both desktop and mobile cards.
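Carmack's amortization point can be sketched with a toy cost model. All of the timing numbers below are invented for illustration, not measurements:

```python
# Toy cost model: every texture-update call pays a fixed driver overhead,
# so pushing many tiny updates one call at a time is dominated by that
# overhead, while batching them amortizes it. Numbers are made up.

CALL_OVERHEAD_US = 50.0   # hypothetical fixed cost per driver call (microseconds)
PER_TEXEL_US = 0.001      # hypothetical copy cost per texel (microseconds)

def update_cost_us(num_calls, texels_per_call):
    """Total time in microseconds for a series of texture updates."""
    return num_calls * (CALL_OVERHEAD_US + texels_per_call * PER_TEXEL_US)

naive = update_cost_us(num_calls=50, texels_per_call=1)    # 50 one-texel calls
batched = update_cost_us(num_calls=1, texels_per_call=50)  # one 50-texel call

print(f"naive: {naive:.2f}us, batched: {batched:.2f}us")
```

With these made-up numbers the 50 individual calls cost roughly 50 times what the single batched call does, which is the shape of the problem AMD's multi-texture update was addressing.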

Quote:

Originally Posted by RBR
This is what we are up against when trying to convince Apple that they need to pay more attention to us.

I think they are paying attention though. They didn't need to put IPS, high-res displays into the laptop line and eliminate most of the reflections. They didn't need to put SSDs in nor quad-core i7 CPUs. If they were just targeting people classed as 'consumers' they have no reason to do that. They have no reason to put a 2GB GTX 680M and the fastest desktop i7 into the iMac - they could do what most other AIO manufacturers do. They have no reason to bother with an external PCI standard like Thunderbolt. Consumers could happily live without all of these things and have lower prices.

While it is true that the market determines everything in the end, I don't see the iMac (despite the limitations that make a lot of us uncomfortable) going away any time soon.

For the last couple of years the iMac was the only desktop Mac with an upside in sales in the USA. From my perspective that is directly related to the bad values represented by the Pro and to a lesser extent the Mini.

Quote:

Apple is right about one thing. There are a lot of people who not only don't ever personally go into a computer, but don't have it upgraded by someone else either. I am occasionally reminded that there are a lot of people who don't even really know which computer (Mac or PC) they have, hard as that is for us to imagine. Their response is usually something to the effect that it's a (fill in the blank with the name brand) if they even remember that much. At least with Macs they will usually answer it's an Apple or a Mac, and sometimes even the model (iMac or MacBook Pro), but seldom anything detailed about the hardware.

This is all true and frankly I'm happy that Apple has been able to leverage that market.

Quote:

This is what we are up against when trying to convince Apple that they need to pay more attention to us. (Sigh)

Even amongst pro users, the portion that cares deeply about these sorts of things is vanishingly small. Therein lies the problem: Apple does have a good and broad pro user base; it's just that the majority of them don't push hardware as hard as you and the other guys posting in this forum.

Quote:

Cheers

Sadly I really don't know what the solution is here. It would seem extremely simple to build a chassis around a couple of motherboards that could effectively support all of their pro and not-so-pro users on one platform, for example one Haswell-based board for the average pro and a Xeon- or Phi-based board for the exceptional pro user. The idea is to get more of your customers to buy the platform to shore up sales, and to maintain a commonality of parts across the platform to reduce costs and keep production flexible. What should be obvious to everybody is that Apple's desktop lineup is a failure at this point. I don't buy into the idea that it has to do with current market realities; it has more to do with Apple's customer base getting frustrated and basically saying to hell with Apple's desktop machines.

"A 27" iMac with a Fusion drive, 32GB RAM and 2GB GTX 680M is a workstation. A 15" rMBP with 256GB SSD and a 1GB 650M is a workstation.

In your mind, you have an association between the word workstation and the tower form factor but the definition has expanded over the years. There's a resistance to this just now because it's still in the early phases. We only got quad-core laptops and iMacs around 2010/2011."

It's not retina... sure. But the 680MX will throw this 27-inch screen around a lot better than it would a retina one, perhaps.

By the time a GPU can? The retina iMac will have arrived, or the iMac will have been discontinued...

Wizard, shame you can't get past the fact that you really don't need to get inside. (Yeah, I'd liked to have put my own SSD in there, and yes, Apple hosed me with a £100 price hike plus £200 for Fusion as opposed to just giving me the option of an internal 256GB SSD.) I'll buy a couple of 8GB sticks in due course to take my RAM up to 24GB.

Workstation?

This is. Compared to the Power Mac tower I had in 1997, this is more of a workstation than that ever was. Heaps of RAM, loads of CPU speed, a GPU ten times faster?

On a glass table with chrome legs...it's stunning.

Unpacking it was 'foreplay...'

Lemon Bon Bon.

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

As far as getting inside the machine, it wouldn't be an issue if Apple would configure the machines the way I want. There is no doubt the iMac "looks" nice, but that means little if your options are severely constrained when it comes to things like SSDs.

Quote:

Originally Posted by Lemon Bon Bon.

To me, it's a giant iPad in some ways. The iMac. Its simple convergence of Technology and Arts is serenely beyond mere PCs.

That is without question the most beautiful Corvette made in my lifetime. Finally, Chevy made a Corvette that looks as good as it goes.

Still, Chevy needs to EOL the Corvette. It's only a tiny fraction of their total sales and thus is not very profitable. Most of Chevrolet's business is with family cars like the Cruze, so that's where they should focus their engineering efforts. Furthermore, the Camaro can do 90% of what a Corvette can do, and for that last 10% the driver just needs to think ahead and use workarounds. Nobody needs a Corvette to get from point A to point B anymore than they need a Mac Pro to run Lightroom and Photoshop.

In a lot of the tests, it's actually faster than the desktop 680 in the Mac Pro.

"For everyone asking about heat and fan noise, I have literally been benching this non-stop for the past 5 hours and the only time the fan kicked on was while playing Crysis 2, and even then it really wasn't that loud. Some fans can just have the most obnoxious whine, this one isn't as high pitched as some macs I've owned in the past. After an hour of Just Cause 2, the back is slightly warm to the touch, but nothing as hot as last gens iMac."

No heat issues with the new design and great performance. The luxmark score is lower than the old Radeon though. NVidia really needs to work on their OpenCL support. That's a really poor show from a company that popularised GPU computing with CUDA. Even the Heaven benchmark is getting great framerates, though you'd have to run it under Bootcamp for the full test.

"A 27" iMac with a Fusion drive, 32GB RAM and 2GB GTX 680M is a workstation. A 15" rMBP with 256GB SSD and a 1GB 650M is a workstation.

In your mind, you have an association between the word workstation and the tower form factor but the definition has expanded over the years. There's a resistance to this just now because it's still in the early phases. We only got quad-core laptops and iMacs around 2010/2011."

I bought most of the components used on eBay (very carefully!) and the W3690 was a huge score since I got it at what would be a great price for even a W3680. Total cost was LESS than your new iMac, and the performance, expandability, and upgradability CRUSH any iMac. A 6870 CRUSHES the iMac's ridiculous mobile graphics. It's not even close.

Granted, my Mac Pro didn't come with a new display, but I'm not in the habit of throwing out displays. I just plugged in the display I was using with a Mini. A desktop computer grows obsolete faster than a quality display, which is why the iMac design is STUPID for a high end workstation.

It is helpful to think back a bit to differences between the original 2006 Mac Pro and the current version. In 2006 [1,1 and 2,1], there was a single base model which could be customized via build-to-order options for its dual CPUs, graphics cards, and so on. This approach continued with the speed increase in the 2008 [3,1] -- one dual-CPU base model with a limited but intelligent range of BTO options.

In 2009, however, the Mac Pro line was split into two base configurations -- a single-CPU machine and a true dual-CPU Mac Pro. This was an important watershed, away from the original vision for the Mac Pro. The original Mac Pro continued, but a low-end single-CPU server replaced the base configuration. In other words, in 2009 the base configuration for the true dual-CPU Mac Pro jumped from one price point to another -- from $2499 to $3299 [4,1] -- and today it is at $3799 [5,1]. The low-end $2499 base price point was filled by a single-CPU Mac Pro that was basically a server -- a shift that became clear in late 2010 with the final discontinuation of Xserve and the official introduction of the current "Mac Pro Server" (and "Mac mini Server"). Following in the wake of this server-hardware shift was the change in the Mac OS X Server software from a separate product to an add-on in Lion and Mountain Lion.

So today the choice is explicit -- you can buy a low-end single-CPU Mac Pro that is basically a server but has all the internal storage, memory, and expansion capacities of the dual-CPU Mac Pro, or you can buy a true Mac Pro starting at a somewhat higher price point.

In short, the original essence of the Mac Pro is the dual processors. As a result, a Thunderbolt iMac could well replace a Mac Pro for some users, especially those using single-CPU Mac Pro machines now -- but it cannot replace the original meaning of "Pro" in the Mac Pro -- the iMac will never be a multiple-processor machine.

What does this history mean for the future? I think the big questions are:

[1] Does Tim Cook's Apple see a true Mac Pro (which I've defined here as having more than one CPU) as a machine worth producing? I think the answer here is yes. I don't think he would have responded last year (June) with the "we're working on something really great for later next year" statement if he did not intend to continue the line. [I know nobody here is disputing this point, but I seem to remember a few people doing so in some of the other threads.]

[2] Will Apple continue to use the Mac Pro and Mac Mini form factors for its servers? I tend to think the answer here is also yes. It took them three years (2009-2011) to make the transition from Xserve/Mac OS X Server to the current approach. Tim Cook was in charge for much of that transition. I don't see them going back.

[3] Is it possible that the Mac Pro Server will be discontinued, absorbed by increasing speed and capacity in the Mac mini Server? This one is tougher. I think it is a possibility. It's hard for me to see it happening before Haswell, though:

[A] Obviously, Apple pulled the plug on Sandy Bridge EP (E5) processors last year after almost nine months of delays (originally due Q3 2011, they weren't shipped until Q2 2012). Intel is skipping Sandy Bridge EX (E7) altogether and going straight to Ivy Bridge EX (E7). While the latter doesn't affect the dual-CPU Mac Pro, my instinct is that Apple likes to keep its development options open and the Sandy Bridge Xeon bugs that resulted in its highest-end processors being aborted scared Apple away.

[B] I don't think it is at all realistic for Haswell EP (E5) Xeons to appear in time for a 2013 Mac Pro. So we're looking at the Ivy Bridge EP (E5) Xeons, which are next up -- the Ivy Bridge E3s are already out, and the E5s will come before (or together with) the E7s.

[C] In addition, any Ivy Bridge Mac Pro design will transition easily to Haswell, the "tock" to Ivy Bridge's 22 nm "tick" -- so there is no reason to wait for Haswell's engineering advances, like transactional memory:

Those programming benefits will come in a straightforward 2014 refresh of the 2013 design -- think 2014 WWDC.

PREDICTIONS:

[I] We'll see a new dual-CPU Ivy Bridge Mac Pro this year starting at about $3299. The base configuration will have a Fusion drive. It will hold up to four 3.5-inch hard drives. The design will also leverage the difference in size between an SSD and a 3.5-inch HDD. You'll be able to put up to eight SSDs in it -- SSD prices in the future may fall to the point where people would be able to do that. I think any new Mac Pro design has to recognize this changing technology.

[II] There may or may not be a low-end single-CPU Mac Pro Server option, as there is today. I think there is still a place for a 3.5-inch-hard-drive server in Apple's lineup, but I don't know how long that will be true -- the continuing development of SSD storage capacity will eventually kill the Mac Pro Server, IMHO. The day you can fit 4 TB of SSD storage into a Mac mini Server will be the end.

[III] Finally, as you might have gathered, while I'd love to see real innovation, I don't think anything truly different (like some of the things dreamed up in these various Mac Pro threads) is going to happen now. That's further off. What we have now is Apple adjusting the Mac Pro to the present and immediate future, not something more distant.

I bought most of the components used on eBay (very carefully!) and the W3690 was huge score since I got it at what would be a great price for even a W3680. Total cost was LESS than your new iMac

Apple must release a new Mac Pro.

Assuming you bought the Mac Pro new, it was already at least $200 more expensive than the iMac without a display. If you bought everything used, you haven't paid Apple a single penny for the entire setup and yet you claim that Apple should make a new one. Presumably so you can wait until 2016 to buy someone's old 2013 model, not pay Apple any money and claim how superior it is to the 2016 iMac despite not having any warranty.

It would be like someone buying a games console and only ever buying used games. The console manufacturer makes a loss or breaks even on the console sale, and neither the console manufacturer nor the games publisher makes any money on the sale of used games. Then the gamer complains about there not being enough good games to play and game studios closing up all over the place. It's a real puzzler, isn't it?

Quote:

Originally Posted by Junkyard Dawg
A desktop computer grows obsolete faster than a quality display, which is why the iMac design is STUPID for a high end workstation.

So by now your 2009 Mac Pro and 2011 processor must be getting pretty obsolete now that it's 2013. What quality display are you using btw?

Quote:

Originally Posted by TenThousandThings
it cannot replace the original meaning of "Pro" in the Mac Pro -- the iMac will never be a multiple-processor machine.

That's not the meaning of "Pro" as it's used on the MacBook Pro with a single CPU and the 13" one with integrated graphics. It's just a marketing name. When you talk about dual processors, you're really just talking about more cores. A dual quad-core setup isn't any more powerful than a single 8-core chip.

People tend to forget what the core technology inside computers is and focus on the boxes. The iMac is slim and the Mini is small so they are weak. The Mac Pro is huge so obviously it crushes everything (especially your foot if you tip it over) but they all use processors that fit in the palm of your hand.

It comes down to efficiency (performance per watt). Electricity comes from the mains, goes through a power supply, and ends up in the tiny wires inside every one of these small chips. With more wires and higher power going in, more heat needs to be dissipated, so most of the bulk that you see is just cooling equipment. If they used better conductors to make the chips, or optical components, or found a way to use fewer internal wires to get the same performance, the Mac Pro wouldn't need to exist.
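The efficiency point can be made concrete with some back-of-envelope performance-per-watt arithmetic. The GFLOPS and wattage figures below are rough, illustrative guesses, not official specs:

```python
# Performance per watt for a hypothetical mobile vs desktop GPU.
# Figures are illustrative guesses, not measured or official numbers.

def perf_per_watt(gflops, watts):
    return gflops / watts

mobile = perf_per_watt(gflops=1900.0, watts=100.0)   # 680M-class part (assumed figures)
desktop = perf_per_watt(gflops=3100.0, watts=195.0)  # desktop 680-class part (assumed figures)

print(f"mobile: {mobile:.1f} GFLOPS/W, desktop: {desktop:.1f} GFLOPS/W")
```

The desktop part wins on absolute throughput, but the mobile part can come out ahead per watt, which is exactly why a slim chassis like the iMac leans on mobile silicon.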

They can probably make these advances right now but it's a business. They don't want to put a 50-core CPU in a laptop this year because you won't buy a new one until the thing dies, which could be 10 years away. They don't mind you buying a 12-core machine so long as you pay $6000+ for it because they aren't going to be seeing you for a while. This is obviously pretty expensive for most people so the volume of buyers isn't there and that's how they like it - they'd rather that people are left wanting more because it's the only way they won't go out of business.

[I] We'll see a new dual-CPU Ivy Bridge Mac Pro this year starting at about $3299. The base configuration will have a Fusion drive. It will hold up to four 3.5-inch hard drives. The design will also leverage the difference in size between an SSD and a 3.5-inch HDD. You'll be able to put up to eight SSDs in it -- SSD prices in the future may fall to the point where people would be able to do that. I think any new Mac Pro design has to recognize this changing technology.

Great post, but going from the top-end iMac at $1999 to a starting price of $3299 leaves a pricing gap big enough to drive a truck through.

It would be nice if Apple put dual CPUs in all Pros, but the reality is that there needs to be an option at least around the $2200 mark.

Apple previously had a Pro machine at $1799 that sold extremely well. Whilst that's probably dead to protect the 27" iMac, if the Pros start at $3300 Apple may as well cancel the line, because only really big business, scientific, and government clients can afford that kind of pricing in the present economic climate.

Assuming you bought the Mac Pro new, it was already at least $200 more expensive than the iMac without a display. If you bought everything used, you haven't paid Apple a single penny for the entire setup and yet you claim that Apple should make a new one. Presumably so you can wait until 2016 to buy someone's old 2013 model, not pay Apple any money and claim how superior it is to the 2016 iMac despite not having any warranty.

So by now your 2009 Mac Pro and 2011 processor must be getting pretty obsolete now that it's 2013. What quality display are you using btw?

Why would I buy a new Mac Pro? Those things are a waste of money! When Apple makes a consumer desktop I'll buy it, but until then I'm not blowing $2500 on a stripped down desktop computer. You're damn right I'll be buying a used 2013 Mac Pro in 2015 or so!

As for my Mac Pro being obsolete, it would be if I were using it for Xeon workstation tasks. Instead I use it primarily for Lightroom and Photoshop. And make no mistake, it is better than the current iMac, and it will be better than the next iMac as well. Try adding 7 TB of fast storage to an iMac for about $350. Once you add the cost of TB enclosures like a Pegasus, you're looking at $1000+, and then that iMac isn't so thin anymore - might as well have been a tower if you have to add a storage tower next to it. Ive may be a brilliant designer but he's not so bright about practical matters.
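The storage comparison above is just cost-per-terabyte arithmetic, using the post's own round dollar figures:

```python
# Cost per terabyte for the two options described in the post,
# using its approximate figures ($350 bare drives vs $1000+ with
# a Thunderbolt enclosure for the same 7 TB).

def cost_per_tb(total_cost_usd, terabytes):
    return total_cost_usd / terabytes

internal = cost_per_tb(total_cost_usd=350.0, terabytes=7.0)    # bare drives in Mac Pro bays
external = cost_per_tb(total_cost_usd=1000.0, terabytes=7.0)   # same capacity in a TB enclosure

print(f"internal: ${internal:.0f}/TB, external: ${external:.0f}/TB")
```

On those figures the enclosure roughly triples the cost per terabyte, which is the crux of the internal-bays argument.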

My display is just a low-end Dell IPS model, but it's got plenty of years left and it calibrates nicely with a hardware calibrator. I'd like an Eizo but it's hard to justify the price since for me photography is only a hobby.

Quote:

They don't mind you buying a 12-core machine so long as you pay $6000+ for it because they aren't going to be seeing you for a while. This is obviously pretty expensive for most people so the volume of buyers isn't there and that's how they like it - they'd rather that people are left wanting more because it's the only way they won't go out of business.

That's the problem with Apple's business model. It makes sense for the sub-$1000 market, but Apple insists on making everything below $2500 a disposable computer. As a result they alienate many serious users ("Power Users") who don't have $5000 to blow on a tower computer.

I personally know of several former iMac owners who loved Mac OS X but switched back to Windows after they upgraded the iMac's HDD. The iMac's hardware design is insulting to anyone who knows even a little bit about computers.

Originally Posted by Junkyard Dawg
Why would I buy a new Mac Pro? Those things are a waste of money! When Apple makes a consumer desktop I'll buy it, but until then I'm not blowing $2500 on a stripped down desktop computer. You're damn right I'll be buying a used 2013 Mac Pro in 2015 or so!

So you want an iMac without the display, not a Mac Pro.

Quote:

Originally Posted by Junkyard Dawg
Try adding 7 TB of fast storage to an iMac for about $350. Once you add the cost of TB enclosures like a Pegasus, you're looking at $1000+, and then that iMac isn't so thin anymore - might as well have been a tower if you have to add a storage tower next to it.

You can get 6TB of eSATA storage for $485 and add it to the 1-3TB Fusion drive inside:

Keep in mind you are saving at least $300 vs the entry Mac Pro. Over $500 once you include a cheap Dell display so it's practically free storage.

Quote:

Originally Posted by Junkyard Dawg
Ive may be a brilliant designer but he's not so bright about practical matters.

Practical matters like making money? If they sold a $2200 iMac without the display for maybe $1500, all you'd do is buy the display from Dell. Very few people will go the route you chose of buying a used Mac Pro.

Quote:

Originally Posted by Junkyard Dawg
My display is just a low-end Dell IPS model, but it's got plenty of years left and it calibrates nicely with a hardware calibrator

No problem with doing that but the iMac offers a better quality display. If they sold the iMac without a display for $1499, you could buy a Dell now for $323:

It's 1080p but good value and you'd save over $350 but a lot of people don't want to shop around and Apple makes it easy by bundling a color-corrected, high quality display so they don't have to.

Quote:

Originally Posted by Junkyard Dawg
The iMac's hardware design is insulting to anyone who knows even a little bit about computers.

I don't like the internal storage or fixed RAM but I understand why they do it and I don't think they'll lose many sales because of it. I think they'd lose more money by building a low priced tower because everyone who's doing it isn't making a lot of profit.

You can see where computers are going. They will all end up with a single SSD boot drive, soldered RAM, an SoC chip and everything just plugs into it. That goes for computers from the low-end right to the high-end. They have to go this route.

It's 1080p but good value and you'd save over $350 but a lot of people don't want to shop around and Apple makes it easy by bundling a color-corrected, high quality display so they don't have to.
I don't like the internal storage or fixed RAM but I understand why they do it and I don't think they'll lose many sales because of it. I think they'd lose more money by building a low priced tower because everyone who's doing it isn't making a lot of profit.

I'm not sure that most Mac Pro owners fall into this. None of the ones I've seen (which is quite a few) went this route on displays. I don't completely hate the iMac, but you really need to understand how much of that "color-corrected" statement is marketing. They all color correct to some degree. You can talk about what devices are used, and it really doesn't mean anything; it explains so little. I'm not getting into all of it. Dell had an issue with the U2410 a couple of years ago: they calibrated the displays at the factory, but their measurements were conducted solely over a small patch in the middle.

It doesn't matter what devices you claim to use. Radiometers have remarkably fine tolerance, yet this doesn't tell us the pass/fail tolerance on any criterion (reproduction, uniformity, color temperature) for each display, or how many points they measure to that tolerance. It also doesn't tell you if these displays are given time to warm up. You can't confuse marketing blurbs with quantifiable details. The other thing to consider is that all displays drift over time, so a lot of engineering time is often dedicated to minimizing this and providing a suitable method to modify or stabilize the output over time to a known target. This is why software like basICColor, ColorNavigator, i1Profiler, and SpectraView exists. I've spent more time reading up on and testing these things than you can possibly imagine, sad as it is.
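For a sense of what a quantifiable pass/fail tolerance looks like: calibration reports usually quote a delta-E figure per measured patch. A minimal sketch using the simple CIE76 formula (the target and measured values below are invented):

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

target = (50.0, 0.0, 0.0)      # neutral gray patch the factory aims for (hypothetical)
measured = (51.0, 1.0, -1.0)   # hypothetical probe reading on that patch

de = delta_e_76(target, measured)
print(f"delta-E: {de:.2f}")    # a spec might demand, say, < 2 on every patch
```

CIE76 understates some perceptual differences (later formulas like CIEDE2000 weight the terms), but it shows the kind of number a real tolerance would be stated in, and why "how many points, to what tolerance" matters.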

Quote:

You can see where computers are going. They will all end up with a single SSD boot drive, soldered RAM, an SoC chip and everything just plugs into it. That goes for computers from the low-end right to the high-end. They have to go this route.

The high end will take longer due to the nature of the market. I pointed out already how Intel backed off on the idea of forcing everything to BGA due to resistance from OEMs and motherboard manufacturers. It loads a lot of extra support costs downstream, thus the resistance. At the higher tiers, you have CPU packages that are shared between workstations and servers. Having servers in the mix there in EP configurations will keep those models socketed for some time. It would drive service costs up either way unless Intel just absorbs board development and sells it as a single package. This obviously wouldn't allow for much flexibility.

Keep in mind you are saving at least $300 vs the entry Mac Pro. Over $500 once you include a cheap Dell display so it's practically free storage.

My views on external storage shift with the application. Even today it isn't always the right solution.

Quote:

Practical matters like making money? If they sold a $2200 iMac without the display for maybe $1500, all you'd do is buy the display from Dell.

Or from a number of manufacturers.

Quote:

Very few people will go the route you chose of buying a used Mac Pro.

Actually the market for used Apple products is very strong. I've considered it in the past myself. At this point in time used Mac Pros might not be the best value but historically they could be seen as a good deal.

Quote:

No problem with doing that but the iMac offers a better quality display. If they sold the iMac without a display for $1499, you could buy a Dell now for $323:

It's 1080p but good value and you'd save over $350 but a lot of people don't want to shop around and Apple makes it easy by bundling a color-corrected, high quality display so they don't have to.
I don't like the internal storage or fixed RAM but I understand why they do it and I don't think they'll lose many sales because of it. I think they'd lose more money by building a low priced tower because everyone who's doing it isn't making a lot of profit.

The need for an affordable desktop Mac has many parameters of which the monitor is just one.

Quote:

You can see where computers are going. They will all end up with a single SSD boot drive, soldered RAM, an SoC chip and everything just plugs into it. That goes for computers from the low-end right to the high-end. They have to go this route.

Yes at the low end this is true. The Mac Pro however serves a different market. While I expect a smaller more integrated machine giving up configurability in a pro machine is hard to swallow.

Originally Posted by wizard69
What are we trying to do here, incite rage? What people want is far removed from the iMac.

The post I was replying to was talking about a consumer desktop. That means the iMac internals in a tower form factor, which means a motherboard that doesn't accommodate a CPU with over 4 cores.

It can obviously have expansion slots, which I suspect is what you mean but performance-wise, it really wouldn't differ from the iMac.

Quote:

Originally Posted by wizard69
The need for an affordable desktop Mac has many parameters of which the monitor is just one.

These threads always end up here though. It starts out talking about the need for the Mac Pro and how many pros are out there who need all the power of a Xeon and the bandwidth of dual processors and eventually it becomes about price. People want a cheap tower like Dell and HP offer so that they can buy a quad-core i7, put in their own RAM and SSD and a fast desktop gaming card at a very low price and attach a cheap Dell display to it.

That isn't going to happen. Ever since Apple introduced the iMac in 1998, it has outsold the desktop towers because it offers a simple, high-quality solution in the price range it occupies. That appeals to a lot of people who don't want to source their own peripherals, and it works better for Apple because they make a profit on more components.

The iMac was frustrating for years because they did miss out on high performance parts. They missed out the entire Core 2 Quad range of processors and the GPUs and video memory didn't go high enough in any configuration. They didn't offer expansion ports for fast storage options and PCI peripherals. They bundled a display which wasn't good value vs 3rd party options.

This isn't the case now. They offer the fastest desktop i7s, they offer GPUs with video memory that is in league with the fastest desktop GPUs you can buy, they offer SSD boot drives, you can connect fast peripherals over Thunderbolt and USB 3, the RAM limit isn't a problem and the display is good value.

It would be nicer if they didn't glue it shut, it would be nicer if they could cut the price down a bit - at the very least on the BTO options, but there is no longer an immediate need to have a cheaper alternative to the iMac in a tower form factor. The $1000-2000 tower is out for good.

Quote:

Originally Posted by wizard69
Yes at the low end this is true. The Mac Pro however serves a different market. While I expect a smaller more integrated machine giving up configurability in a pro machine is hard to swallow.

I know this isn't a comparison people will like but when you use an iPad, at no point do you think about opening it up and upgrading the internals. That's the way all computers will be one day from low-end to high-end.

"Intel remains committed to the growing desktop enthusiast and channel markets,' Intel's Daniel Snyder told the site, and will continue to offer socketed parts in the LGA package for the foreseeable future for our customers and the enthusiast DIY market.'

Snyder went on to explain that he would not be commenting on 'long-term product roadmap plans,'"

It won't be for a few years but it will happen one day. In 2020, you just won't put a PCI graphics card in your machine because you won't have to. That's why NVidia is going the SoC route right now.

The Mac Mini has the MacMate, which added ports and hard drive access.

Perhaps if some enterprising third party created an iMac-specific companion, Thunderbolt-based with multiple (bootable) drives and PCI expansion, much of the midrange Mac market would be satisfied with that option.

Originally Posted by Junkyard Dawg
No, the iMac is a weird desktop/laptop hybrid. If Apple just offered a headless iMac that would be an outrage. Mobile graphics on a desktop? Apple is mocking their users.

It comes down to how people use them. Desktop cards can offer better compute power but that comes at the price of a high power draw and heat output. Certainly the 640M/650M on the entry models aren't all that powerful but the higher ones are fine.

Quote:

Originally Posted by Junkyard Dawg

Quote:

Very few people will go the route you chose of buying a used Mac Pro.

That is not true.

You think a lot of people are lining up to drop nearly $2000 on an old tower workstation that is a few years old from someone on eBay?

Quote:

Originally Posted by Junkyard Dawg

Quote:

You can get 6TB eSATA for $485 add it to the 1-3TB Fusion drive inside

A far inferior solution to the Mac Pro's four internal SATA bays.

If you get the same amount of storage, it's not really any different. One big advantage is that it's easy to upgrade the computer. Unplug it from one and plug it into the other. Otherwise you have to transplant the raw drives and if you have any RAID system, you sometimes have to get the order right and always risk losing the RAID.

Look at how servers work these days: it's not CPUs and HDDs together, because they scale differently. If you have massive amounts of video footage, 4 drive bays won't be enough, and a large multi-bay storage array won't fit inside a Mac Pro anyway.

If you exceed the storage of an external consumer unit, you can get another one.

Quote:

Originally Posted by Frank777
Perhaps if some enterprising third party created an iMac-specific companion, Thunderbolt-based with multiple (bootable) drives and PCI expansion, much of the midrange Mac market would be satisfied with that option.

You'd be able to partition any Thunderbolt storage for multiple boot options.


Older computers are only scary to computer-illiterate folks. If you know how to troubleshoot hardware and hang new parts in a computer, then it rarely makes sense to buy new. The money saved on a used computer is nearly always more than any repairs that computer may need in the future. Obviously if you buy a lemon you're screwed, but eBay's policies now guarantee that a buyer can return an item. Just be sure to test it thoroughly. Of course a studio is going to buy new because they don't have time to waste on repairs, but at this point Apple is making it hard for them to buy new.

Regarding external storage, it's often a better solution, but nothing is easier or cheaper than sliding a couple of bare drives into a Mac Pro. The beauty of the Mac Pro is that you have the choice to add internal drives AND external drives. Part of what makes the Mac Pro a "pro" solution is its wide breadth of options.

I noticed later in your post you argued that mobile graphics in the iMac are just fine? Whoa, that's some serious denial man. Having the option to upgrade a computer's video card can add years of life to it, or simply make using it more enjoyable. Even some Adobe Lightroom plugins are GPU accelerated. When I upgraded from an Nvidia GT120 to a Radeon HD 6870 the difference was stunning, and I'm not even doing video editing.

The argument in favor of the iMac always seems to boil down to "it's good enough". That's a fine argument for something like the Mini, but most people I know who drop $2000+ on a computer want something more than "good enough". That an upgraded 2009 Mac Pro can still smoke a current high end iMac only illustrates how great the Mac Pro is and how crippled the iMac is.

It's rather funny, the two options people have when purchasing a Mac Pro. There are two:

You can invest the $3,500+ and buy a really nice model. Then after 3 years, sell it on eBay and get $2,000 back after eBay and PayPal fees, losing approximately $1,500. After this you can buy a new Mac Pro (or whatever top-notch computer specs you want) for $3,500 again, in effect only costing you $1,500 to get that new Mac Pro-ish machine. With this method you have the current best of the best; your power level goes down year after year but is supercharged when you sell and re-up.

<or>

You can wait for a model to drop in price over the years, like some do with the 2009. If you shop around you can probably get a 2009 for $1,500. So now you have a machine that is 3 years down the slope, but it only costs you $1,500. After your 3-year period, the machine (now 6 years old) is probably worth selling on eBay for $500. In effect this rotation will cost you $1,000 every 3 years to get your "new" 3-year-old machine.

So yeah, at some point you have to drop a significant investment up front, but you can always have the fastest and latest machine for $1,500 every three years, or you can have something slower by a 3-year factor for $1,000 every 3 years.
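The two rotation options above work out as a quick back-of-the-envelope calculation. This is just a sketch using the rough figures quoted in the post, not real market prices:

```python
# Rough 3-year cost-of-ownership for the two Mac Pro buying rotations
# described above. All figures are the post's own approximations.

def cost_per_cycle(buy_price, resale_after_3yr):
    """Net cost of owning the machine for one 3-year cycle."""
    return buy_price - resale_after_3yr

# Option 1: buy new at ~$3,500, resell for ~$2,000 after fees
new_route = cost_per_cycle(3500, 2000)    # $1,500 per cycle

# Option 2: buy a 3-year-old model at ~$1,500, resell at ~$500
used_route = cost_per_cycle(1500, 500)    # $1,000 per cycle

print(f"New-every-3-years route:   ${new_route} per cycle")
print(f"Used-every-3-years route:  ${used_route} per cycle")
print(f"Premium for staying current: ${new_route - used_route} per cycle")
```

So staying current costs roughly $500 extra per 3-year cycle, which is the trade-off the rest of the post argues about.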

If you are using the machine for work, I for one don't believe the 3-year-old route is wise: for roughly $500 more per cycle you stay current, and the speed over the course of 3 years pays for itself in the first cycle.

You just have to decide what you want to do and, I guess, what you can do. I have been doing the old wait-for-the-Mac-Pro (tower) but buy-the-current-MacBook-Pro (laptop) routine; this way my desktop and laptop are around the same speed and I don't feel any sluggishness when working with either one. I am changing this approach though, because of the making-money aspect, and I am not selling my old Mac Pros.

Good Luck...

But I mean honestly, isn't Apple GIVING us ample time to save for the Mac Pro 2013? And why wouldn't you want to... :D

I hate to reply to my own message but after typing this out I just realized something that PC guys have been doing for years.

Most PC guys, instead of dropping:

$1000 for the Case and Dual Socket Motherboard

$1000 for the 1st Processor

$1000 for the 2nd Processor

$500 for the RAM, HD and misc

$300 for the Video Card

----------

$3800

They wait a year, and buy

$350 processor with the fastest GHz

$100 for the Motherboard

$50 for the Case

$200 for RAM, HD and misc

$100 for a Video Card (Year Old PC Version, but fastest)

----------

$800

and for that $800 they say to themselves they have a fast machine.

But in all sincerity it's a "year-old parts, waited on to drop a little" machine, and not the fastest, latest, and greatest, especially in a cores conversation...

Then they sit back and say, why would I spend all that money on a Mac when I can build a PC, blah... If they wanted to build the latest, fastest PC, it would only be around $500 less than a new Mac Pro at best, but that's it.
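Summing the two parts lists above makes the comparison explicit. These totals use only the post's own rough part prices, not current market figures:

```python
# Totals for the two PC builds described above, using the post's own
# approximate part prices.

latest_dual_socket_build = {
    "case + dual-socket motherboard": 1000,
    "1st processor": 1000,
    "2nd processor": 1000,
    "RAM, HD and misc": 500,
    "video card": 300,
}

year_old_parts_build = {
    "processor (fastest GHz)": 350,
    "motherboard": 100,
    "case": 50,
    "RAM, HD and misc": 200,
    "video card (year-old, fastest)": 100,
}

print("Latest dual-socket build:", sum(latest_dual_socket_build.values()))  # 3800
print("Year-old-parts build:   ", sum(year_old_parts_build.values()))       # 800
```

The gap ($3,000) is what the "wait a year" crowd is trading away in cores and top-end performance.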

Originally Posted by Junkyard Dawg
Older computers are only scary to computer illiterate folks.

You don't know what a computer has been through, though. A computer could be water-damaged but still work, and then fail after a month of use. You could get a machine like a 2008 MacBook Pro with a defective GPU that has just been repaired, and it fails after a couple of months. There is the potential to lose a lot of money on a used machine.

Quote:

Originally Posted by Junkyard Dawg
I noticed later in your post you argued that mobile graphics in the iMac are just fine? Whoa, that's some serious denial man. Having the option to upgrade a computer's video card can add years of life to it, or simply make using it more enjoyable. Even some Adobe Lightroom plugins are GPU accelerated.

The GPUs are still fast when you get them though. Here is a test between the lower-end mobile GPUs and the desktop ones:

The 650M is about 1/3 the speed of the 570/580, and the 680MX is about 3x faster than the 650M, so it's in the same league as the desktop cards. You're right that you can't upgrade the GPU easily, as it's an MXM module and the screen is glued in, but how often would you upgrade a GPU? Not likely before 3 years, at which time it's perfectly OK to get a new iMac.

Quote:

Originally Posted by Junkyard Dawg
The argument in favor of the iMac always seems to boil down to "it's good enough". That's a fine argument for something like the Mini, but most people I know who drop $2000+ on a computer want something more than "good enough". That an upgraded 2009 Mac Pro can still smoke a current high end iMac only illustrates how great the Mac Pro is and how crippled the iMac is.

It doesn't smoke a high-end iMac; it's pretty much the same speed, and it's not really a 2009 Mac Pro when you have a 2011 processor in it. All you've demonstrated is that you can buy a 4-year-old desktop and a 2-year-old processor, pay roughly the same price to get around the same performance as an attractive AIO with a 27" IPS display, and instead end up with a giant 40 lb metal box with no warranty and a cheap Dell display attached.

It clearly is a setup you prefer, which is fine, and I'm with you to a degree on the upgradability, but you've shown it's not a route Apple should go down. It's great for you as a buyer, but it made them zero profit. Even if they sold a quad-i7 desktop tower with PCI slots for $1,499, it would be an option for a few people, but the volumes would be so low, and they only make margins on the parts in the box because people won't spend another $1,000 on a display. If they do, they'd have been better off with the iMac as it's the same panel.

People (including me) always argue that Apple should do something they want, under the assumption that this want represents a large volume of people, but you have to look at the sales data that is all over the place now. The volume in this market is so low: workstation shipments are 1 million per quarter worldwide. Consumer desktop growth is flat-lining, the biggest manufacturers want to jump ship, and you say to Apple, hop on board? They've been at this a long time and they're going to do what they've always done: take the path that stretches out the furthest.