More than a year and a half passed between the introduction of Apple's 2011-model iMacs and the refresh that replaced them late last year, but the changes you got for waiting were reasonably substantial. The computer got much thinner, lost a few pounds, and ran much cooler and quieter than previous models, and it also got a decent internal upgrade courtesy of new Ivy Bridge CPUs from Intel and dedicated Nvidia GPUs.

Less than a year passed between the introduction of the 2012 iMacs and this year's quiet refresh, and the changes are accordingly much smaller. The 2013 iMac's new changes are all internal—slightly upgraded CPUs and GPUs, a new 802.11ac Wi-Fi adapter, and a switch from SATA to PCI Express solid-state drives round out a refresh that makes absolutely no external changes to last year's chassis. If you were waiting for a Retina iMac to be released this year, your best bet is to keep on hoping.

Still, we've got the $1,299 base model in for testing. If you didn't buy a 2012 model, is there any one upgrade that will encourage you to buy a 2013 model instead, or should you wait for a more drastic upgrade?

Body and build quality

The 2013 iMac is externally identical to the 2012 model right down to its odd trapezoidal box and its wireless mouse (or trackpad) and keyboard, but we'll recap for those of you with older models. Like last year's iMac, the new model is extremely thin around the edges and bulgy in the back. All of the back-mounted ports remain as annoying as ever to reach around and find, but you get the same number of them as you did last year: one gigabit Ethernet port, two Thunderbolt ports (not Thunderbolt 2, mind), four USB 3.0 ports, an SD card slot, and a headphone jack that can also accept input from headsets. FireWire has been dumped entirely from these newer Macs, but an adapter exists if you still need that particular interface. The optical drive is also out.

The presence of two Thunderbolt ports (and the capabilities of Intel's and Nvidia's GPUs) means that it's easy to connect up to two external displays to the smaller iMac, something that was difficult-to-impossible back when the computers only featured a single mini DisplayPort or Thunderbolt jack. The 12.5 pound weight—about eight pounds lighter than the 2009-through-2011-era bodies—also makes the computer much easier to carry and to tilt on its stand. This stand is stable and reasonably elegant looking, but it's also still limited—you can tilt the display up and down but you can't raise it, lower it, swivel it, or pivot it.

The computer's 1920×1080 display (and the 2560×1440 display in the larger model) looks like the same panel that Apple has been using in the iMacs since they switched to the 16:9 aspect ratio. It's a bright, clear IPS panel that at 102 PPI is far from Retina-class, but it still looks decent from what most people would consider to be a normal desktop viewing distance (somewhere around two or three feet away from your face). Colors are bright, contrast is good, viewing angles are excellent, and the glass is much less reflective than in the pre-2012 models. The 2012 and 2013 models fuse the LCD panel with the glass layer that covers it, which is something of a double-edged sword—on the one hand, it enables a thinner display assembly that puts the panel closer to the surface of the glass. On the other, cracking that glass means you're looking at replacing the entire screen, and that's an expensive repair.
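For reference, the PPI figures quoted here fall out of simple geometry. A quick sketch (the panel sizes are the ones discussed in this review):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 21.5)))  # 102 (21.5-inch iMac)
print(round(ppi(2560, 1440, 27.0)))  # 109 (27-inch iMac)
```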

Like the 2013 MacBook Airs, these late-model iMacs also include two built-in microphones that help with noise reduction. Last year we found them to be a modest improvement over the 2011-and-older models' single-mic setups when chatting or using OS X's dictation feature, and this year's iMac performs similarly. Finally, just like last year, the 27-inch iMac is the only model that retains user-accessible RAM slots (it has four, which support up to 32GB of RAM when fully populated). The 21.5-inch has two RAM slots on the inside (capable of supporting up to 16GB of RAM) that can't be accessed without tearing the machine apart, so if you think you'll want the memory eventually you'll probably want to cough up $200 for the upgrade when you buy the computer.

The 2013 iMac's innovations are entirely interior, and there are four upgrades of consequence: the Ivy Bridge CPUs have been swapped for newer Haswell versions; the 6-series Nvidia GPUs have been switched for an Intel integrated GPU on the lowest-end iMac and 7-series Nvidia GPUs everywhere else; the SATA solid-state drives in the SSD- and Fusion Drive-equipped models have been switched for a PCI Express version; and the dual-band 802.11n Wi-Fi has been upgraded to the better-performing 802.11ac.

The CPU

Intel's Haswell architecture increases performance relative to Ivy Bridge running at the same clock speed, but Intel's latest architecture is much more focused on battery life improvements than performance improvements. The huge battery life boost was the most impressive thing about the 2013 MacBook Air and we're hoping for similar gains in the MacBook Pros, but the benefits of Haswell for desktop users are less readily evident (aside from a perhaps slightly lowered electricity bill).

This is the third year in a row in which all the iMacs that Apple offers have come with quad-core Intel processors. Our base 21.5-inch model comes with a Core i5-4570R, which runs at 2.7GHz but can Turbo Boost up to 3.2GHz. The R in the model number indicates both that this CPU includes Intel's Iris Pro 5200 integrated GPU (more on that soon), and that it's not a socketed CPU—R-series chips are all soldered to the motherboards and can't be replaced or upgraded by the end-user. We suspect that the Venn diagram of "people buying iMacs" and "people who upgrade their CPUs" looks like two circles that aren't touching anywhere, but it's worth noting. The higher-end Macs include dedicated graphics and use more conventional socketed CPUs.

Comparing this year's base model to the base models from the last two years, the story is very much the same as last year's: Haswell is a performance upgrade over an Ivy Bridge or Sandy Bridge CPU running at a similar clock speed, but not really so much that you'll notice for most tasks.

Thanks for including the 2009 iMac in these specs graphs! I'm still on one of those, but always thinking about upgrading. I bought a 2012 27'' iMac for my workplace and love it. Couldn't care less about the missing ODD (I've used mine in the 2009 fewer times than I have fingers) or lack of expandability. Also agree about the SSD--it makes such a huge difference that my 2009 27'' at home sits unused while I'm on the adjacent couch on my MBA.

Also disappointed by the lack of included Fusion Drive. If I could go into a store and just buy the one I wanted, I'd've sprung for one already, even though I can't really justify spending $2k on a machine I don't need at all (I'm a console gamer, and any video editing / high computation processing I do at work anyway).

Yeah, agreed. I like the older model. Apple's obsession with "thin" is getting sick. And the bulge in the back is ugly. I much prefer the mostly uniform thickness of the older model. Yeah, I know: I'm looking at it wrong, eh...

Plus:

- the SD card slot needs to be relocated to the side
- more USB ports need to be added
- needs two USB ports on the side

It's a pain to reach behind the computer to insert a flash drive or SD card. Form over function all over again.

I attached a compact USB hub and a tiny USB card reader under my iMac with dual sided tape so I have front access without cables hanging around. Lots of people around copied my idea. But now with the thin iMacs you can't even do that.

I guess that means no Retina Apple monitor for one more year... Unless they decide to introduce one alongside the Mac Pro at their October event, but that seems unlikely.

I wonder what Apple's decision criteria are in this regard, because it's getting pretty ridiculous at this stage. My iPad has way more pixels than all models but the 27", but they fit within much less than one third of the surface area. Even one meter away these monitors' pixels look atrociously large. Why is it that Apple thinks desktop users are not discriminating enough to elect to purchase a Retina option on desktops?

It makes no sense that mobile users are allowed this choice but desktop ones aren't. I know I would gladly pay extra in terms of LCD and GPU to enjoy perfect typography and razor sharp graphics. I guess they think I must be alone there.

I have last year's 27" model with SSD, upgraded RAM, CPU and GPU, and I am in love with this machine. After going four years per desktop with my own hand built computers or HP workstations for the past 12 years I can easily see myself enjoying the next four years with this beauty. It's been flawless and a pleasure to look at as I work at my desk for hours at a time each day. The hard part is not getting all riled up about new CPUs and such when I can't easily upgrade them myself.

Quote:

I have last year's 27" model with SSD, upgraded RAM, CPU and GPU, and I am in love with this machine. After going four years per desktop with my own hand built computers or HP workstations for the past 12 years I can easily see myself enjoying the next four years with this beauty. It's been flawless and a pleasure to look at as I work at my desk for hours at a time each day. The hard part is not getting all riled up about new CPUs and such when I can't easily upgrade them myself.

Why get riled up over such a small boost? Heck, I've got an upgradable PC with an old-ass i7-920, and every year I look at the tests for the latest CPUs and think it still isn't enough improvement to make upgrading worth the money, or the effort of tearing my computer apart.

"The iMac uses a three-stream (3x3:3) configuration, adding another antenna for a theoretical maximum transfer speed of 1.3Gbps [...]. This is even higher than is possible with wired gigabit Ethernet [...]"

Gigabit Ethernet can transfer 1Gbps up and downstream simultaneously. Wireless speeds are typically expressed as the sum of the up and downstream speeds; so 1.3Gbps would equate to a maximum transfer rate of about 0.65Gbps in either direction. Take into account overheads and such (which you did mention), and you get about 80% of that, which is about 60MBps.

So actually, getting 50MBps in your tests is getting quite close to the theoretical maximum! You certainly shouldn't expect it to match or outperform gigabit ethernet.
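The commenter's arithmetic can be written out step by step (the 50/50 duplex split and ~80% efficiency figures are the comment's own rough assumptions, not measured values):

```python
# Rough 802.11ac throughput estimate, following the comment's reasoning.
link_rate_gbps = 1.3                      # three-stream theoretical maximum
per_direction_gbps = link_rate_gbps / 2   # rate is shared between directions
usable_gbps = per_direction_gbps * 0.8    # ~80% after protocol overhead
usable_mbytes = usable_gbps * 1000 / 8    # convert Gbit/s to Mbyte/s

print(round(usable_mbytes))  # 65 -- close to the commenter's "about 60MBps"
```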

Quote:

I guess that means no Retina Apple monitor for one more year... Unless they decide to introduce one alongside the Mac Pro at their October event, but that seems unlikely.

I wonder what Apple's decision criteria are in this regard, because it's getting pretty ridiculous at this stage. My iPad has way more pixels than all models but the 27", but they fit within much less than one third of the surface area.

Presumably, decision criteria as simple as it being unfeasible.

A 5120x2880 panel at 27 inches, if it exists at all, may not be available in mass quantities yet.

Thunderbolt 2 includes DisplayPort 1.2. That can do 3840x2160 fine, but not the 78% more required for the hypothetical Retina resolution.

That's not even getting at the GPU power required to push meaningful content to the thing.
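The pixel counts and link budget behind that reply check out with quick arithmetic. A sketch, with assumed DisplayPort 1.2 figures (4 lanes at 5.4 Gbps, 8b/10b coding) and blanking intervals ignored, so real requirements are slightly higher:

```python
uhd = 3840 * 2160        # 8,294,400 pixels
retina27 = 5120 * 2880   # 14,745,600 pixels
print(round((retina27 / uhd - 1) * 100))  # 78 (percent more pixels)

dp12_gbps = 4 * 5.4 * 0.8  # ~17.28 Gbps usable after 8b/10b coding
for name, px in [("3840x2160", uhd), ("5120x2880", retina27)]:
    needed = px * 60 * 24 / 1e9  # 60 Hz, 24 bits per pixel, before blanking
    print(name, round(needed, 1), "Gbps,",
          "fits" if needed < dp12_gbps else "exceeds DP 1.2")
```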

Quote:

Even one meter away these monitors pixels look atrociously large. Why is it thatApple think that desktop users are not discriminating enough to elect to purchase a Retina option on desktops?

Huh? Are you advocating for a smaller screen? You can have that in the form of a MacBook Pro, a 21-inch iMac, or simply a Mac mini with your own screen.

Agreed with the explanation of why retina 27" screens aren't available. These things aren't magic (or is it Apple, and they perform miracles on a regular basis and are expected to?). If one looks at the horsepower required to push as many pixels as a 4K display, it's bringing the most advanced dual-GPU setups to their knees. And you expect more pixels pushed from a laptop GPU? Really? You think it's a viable thing? Your expectations are a little out of whack with reality.

It's worth noting that the 21" iMacs use a 2.5" HDD, so they REALLY need that SSD boost (or Fusion Drive). The 27"ers use full 3.5" drives, which do much better in the speed department. When Apple moved the 21" to 2.5", I thought for sure that they'd make Fusion Drives standard soon enough, so this is a disappointment.

Quote:

Microsoft and AMD have taken a similar route with the Xbox One, integrating 32MB of eDRAM into the console to help compensate for its slower (but cheaper) DDR3 memory.

ObNitpick: The Xbone has integrated eSRAM, not eDRAM. It serves the same purpose, and the differences aren't exactly relevant in a piece like this, but the text is wrong. I'd change it to something more general, like "memory integrated in the CPU" or something.

I think the price to pay for a cute design is very high:
- literally, those things are very expensive!
- the monitor base is lackluster; height and swivel, not just tilt, should be a given at that price, though it would mar the looks, I guess
- very expensive factory upgrades (RAM, disk, SSD...)
- quasi-impossible end-user upgrades
- very low gaming potential

The machine is nice and all, but that's too many limitations for too little.

Quote:

I guess that means no Retina Apple monitor for one more year... Unless they decide to introduce one alongside the Mac Pro at their October event, but that seems unlikely.

I wonder what Apple's decision criteria are in this regard, because it's getting pretty ridiculous at this stage. My iPad has way more pixels than all models but the 27", but they fit within much less than one third of the surface area.

Presumably, decision criteria as simple as it being unfeasible.

A 5120x2880 panel at 27 inches, if it exists at all, may not be available in mass quantities yet.

Thunderbolt 2 includes DisplayPort 1.2. That can do 3840x2160 fine, but not the 78% more required for the hypothetical Retina resolution.

That's not even getting at the GPU power required to push meaningful content to the thing.

I think when Apple makes a Retina Thunderbolt Display, it's gonna be a 27" 4K display. Even the HD5000 series can output 4K res, and it should appear "Retina" at a distance of about 26".

Quote:

I think the price to pay for a cute design is very high:
- literally, those things are very expensive!
- the monitor base is lackluster; height and swivel, not just tilt, should be a given at that price, though it would mar the looks, I guess
- very expensive factory upgrades (RAM, disk, SSD...)
- quasi-impossible end-user upgrades
- very low gaming potential

The machine is nice and all, but that's too many limitations for too little.

It's one of the cheapest high-quality AIOs out there. Don't buy it!

You can get a box for cheaper that you can upgrade, with your own "high quality" components.

Quote:

Why get riled up over such a small boost? Heck, I've got an upgradable PC with an old-ass i7-920, and every year I look at the tests for the latest CPUs and think it still isn't enough improvement to make upgrading worth the money, or the effort of tearing my computer apart.

For what little it is worth, I upgraded from an i7-920 to a Haswell i7-4770 (not the K) and was quite happy with the added speed. I'll certainly grant that the i7-920 is still quite a decent CPU, but the incremental benefits over the past few years finally amounted to enough for me to spend the money on an upgrade.

The i7-920 is lacking the AES instructions, while the 4770 has some additional instructions you likely don't care about (AVX, AVX2, F16C, FMA3). http://www.cpu-world.com/Compare/386/In ... 7-920.html claims the 4770 is 1.68x the speed in single-threaded performance and 1.43 in multi-threaded performance. I can't really understand the difference there but hey, it's just a rough measure. It does more or less match up to my experiences, and does so while using less power and (with my specific CPU and case cooling) running quieter.

You may not care to upgrade for a 40%-70% performance boost, and I wouldn't blame you. Just giving you a single data point on someone who did.

Quote:

Why get riled up over such a small boost? Heck, I've got an upgradable PC with an old-ass i7-920, and every year I look at the tests for the latest CPUs and think it still isn't enough improvement to make upgrading worth the money, or the effort of tearing my computer apart.

Funny you say that. I've also always had PCs, but by the time I get around to realizing I need to upgrade it, -everything- is so out of date I just buy a new one from scratch... I treat my PCs basically like iMacs, and I'm willing to bet that a vast majority of PC box buyers do the same (particularly when you consider most box PCs are bought by non-techies + companies + schools + etc, and not hardcore computer gamers).

I've updated year-over-year a handful of times and I actually enjoy waiting 3-4 years between upgrades more--then I really notice the boost and I'm excited. I've gotten a new MBA for the last three refreshes, since it's my main day-to-day computer, and each time has been underwhelming compared to my other computer purchases (although the body improvement was very nice).

The first computer my family owned was a Macintosh Performa 6200. My first experience with MacOS X was when we upgraded our Sawtooth G4 Tower to 10.1. The first laptop I owned was a chicklet iBook that my parents bought me as a gift for a 4.0 GPA my freshman and sophomore years of high school.

I've been using my mid-2007 iMac as my main computer since I bought it when I began college. It has done very well for the past six years, but it's definitely starting to show its age. Lion absolutely wrecked my system speed-wise, and it looks like, while Mavericks is going to be an option, there's a laundry list of new features that simply aren't supported on a system as old as mine. The GPU is an ancient Radeon 2600 that struggles even with games that aren't graphics intensive, and the AirPort card has some issues with my new router. But I love the size of it - fitting perfectly even on my tiny desk - and OS X remains my favorite operating system.

So I've been in the market for a new desktop for a bit over a year now. I held off on buying one last spring because I knew Apple would be refreshing the lineup of iMacs. When the announcement came, my first thought was "pretty!", and my second was "I bet they have laptop GPU's". Sure enough, to make their ultra-sleek computers Apple has been shoving underpowered components into them. And that was a dealbreaker for me. I'm not a tinkerer, but I have replaced RAM and the HDD in my current iMac to keep it running relatively smoothly over the years. The lack of something as basic as user-accessible RAM in the lower-end model is an utter shame. Add to that GPU's which barely qualify as "mid range" and lack of even the option of integrated optical drives, and the new iMac line is not even an option for me.

Apple is driving me to Windows because they're prioritizing form over function. I don't buy a desktop computer the way I buy a tablet - I expect it to last a while instead of replacing it every other year. I need to be able to bump up the RAM as the OS requirements increase, and I need a GPU that will be able to handle things five years from now. I really prefer the all-in-one form factor, but not at the expense of usability. I really prefer OS X, but I will go to Windows if Apple can't deliver what I need in a desktop.

Come on Apple, make a desktop that gives us some flexibility. My money is waiting.

Nice to see others pissed about the lack of a new cinema display. I know I keep busting Apple on it here on Ars, but seriously where the hell is our new monitor? It's been 2+ years (which isn't eternity, I know, but for Apple it's pretty damn abnormal).

Quote:

The first computer my family owned was a Macintosh Performa 6200. My first experience with MacOS X was when we upgraded our Sawtooth G4 Tower to 10.1. The first laptop I owned was a chicklet iBook that my parents bought me as a gift for a 4.0 GPA my freshman and sophomore years of high school.

I've been using my mid-2007 iMac as my main computer since I bought it when I began college. It has done very well for the past six years, but it's definitely starting to show its age. Lion absolutely wrecked my system speed-wise, and it looks like, while Mavericks is going to be an option, there's a laundry list of new features that simply aren't supported on a system as old as mine. The GPU is an ancient Radeon 2600 that struggles even with games that aren't graphics intensive, and the AirPort card has some issues with my new router. But I love the size of it - fitting perfectly even on my tiny desk - and OS X remains my favorite operating system.

So I've been in the market for a new desktop for a bit over a year now. I held off on buying one last spring because I knew Apple would be refreshing the lineup of iMacs. When the announcement came, my first thought was "pretty!", and my second was "I bet they have laptop GPU's". Sure enough, to make their ultra-sleek computers Apple has been shoving underpowered components into them. And that was a dealbreaker for me. I'm not a tinkerer, but I have replaced RAM and the HDD in my current iMac to keep it running relatively smoothly over the years. The lack of something as basic as user-accessible RAM in the lower-end model is an utter shame. Add to that GPU's which barely qualify as "mid range" and lack of even the option of integrated optical drives, and the new iMac line is not even an option for me.

Apple is driving me to Windows because they're prioritizing form over function. I don't buy a desktop computer the way I buy a tablet - I expect it to last a while instead of replacing it every other year. I need to be able to bump up the RAM as the OS requirements increase, and I need a GPU that will be able to handle things five years from now. I really prefer the all-in-one form factor, but not at the expense of usability. I really prefer OS X, but I will go to Windows if Apple can't deliver what I need in a desktop.

Come on Apple, make a desktop that gives us some flexibility. My money is waiting.

Buy a USB optical drive. Works good I hear. I haven't used an optical drive in a few years.

First you talk about how your 6-year-old iMac that you are still using stands the test of time, but then complain about how the RAM isn't upgradable. You need a GPU that can stand up for 5 years; your last one did for 6. If you want a gaming rig, this probably isn't the one for you. Otherwise, the 27" model, upgradeable to 32 gigs, probably has sufficient RAM for the future.

If you want some comfort and peace of mind: most of the pain of advancement is probably gone (64-bit, large RAM, PCIe, advanced firmware). It should be more futureproof than your last iMac. The only way this would change is if Apple decided to make their own ARM chips to work on their computer lines, but that transition would take 5 years or so, so again this machine would still work for you for your intended time.

The lack of an off-the-shelf SSD iMac is rather odd, but it's worth remembering that Apple offers an excellent CTO service. When I order Macs for work they're almost always custom configured, and they still get delivered in a week or so.

Quote:

Apple is driving me to Windows because they're prioritizing form over function. I don't buy a desktop computer the way I buy a tablet - I expect it to last a while instead of replacing it every other year. I need to be able to bump up the RAM as the OS requirements increase, and I need a GPU that will be able to handle things five years from now. I really prefer the all-in-one form factor, but not at the expense of usability. I really prefer OS X, but I will go to Windows if Apple can't deliver what I need in a desktop.

Come on Apple, make a desktop that gives us some flexibility. My money is waiting.

I also buy a desktop and then use it for five years or so, so I don't even need flexibility, just a decently fast GPU to begin with. The best available GPU in the iMac, 780M, has a notch lower level of perf than I'd like, but I would probably settle for that level if Apple managed to deliver it at a reasonable price and adequate ergonomics. The iMac is not even close to either. It's mindboggling that their usability people have allowed the visual designers (Ive?) to get away with the display foot of the iMac, especially on the 27" model. Powerful (or reasonably affordable) GPUs are obviously out of the question as long as case thickness is their #1 design priority on a desktop computer.

Beautiful looking machine, but you hit the nail on the head with the SSD issue. At normal prices, a 1TB laptop drive is $75.00, and a 256GB SSD is $200. This is at full retail, and I'm sure even better deals could be had by Apple with volume pricing.

Why don't they include it in the machines right off the bat? Not a big surprise after seeing what they charge for upgrading. And I would dare say most people would be more than happy with 256GB storage. For bulk storage, use a USB drive or Thunderbolt drive.

Quote:

And I would dare say most people would be more than happy with 256GB storage.

I don't think they would. With the amount of music, photos, videos and the like that most home users have these days, 500GB is probably a working minimum. When I installed our first SSD at work in 2010 I chose a 128GB for cost reasons ($400!), but it took constant vigilance to stop it from filling up — even though all our files live on a server. Now we use 256GB SSDs, which of course only run about $200, for our boot/app drives.

I think the iMac should come with a 1TB Fusion drive (128GB SSD + 1TB mechanical) as its standard configuration.

Quote:

And I would dare say most people would be more than happy with 256GB storage.

I don't think they would. With the amount of music, photos, videos and the like that most home users have these days, 500GB is probably a working minimum. When I installed our first SSD at work in 2010 I chose a 128GB for cost reasons ($400!), but it took constant vigilance to stop it from filling up — even though all our files live on a server. Now we use 256GB SSDs, which of course only run about $200, for our boot/app drives.

I think the iMac should come with a 1TB Fusion drive (128GB SSD + 1TB mechanical) as its standard configuration.

Apple probably wanted to keep the price point of the unit the same. Adding a Fusion Drive would increase the price by $100, and you would have people complaining how they don't need something like this and would like to save the money. Maybe next year, when a few other things go down in price, they can stick to their price point and add Fusion for nearly the same markup.

For people wanting a retina iMac, I don't think the GPU's are there yet. The retina Macbook uses Tesla GPU's and they still had issues with things like scrolling at full speed.

I understand that a software update has since fixed it at the retina Macbook's resolution, but a retina iMac would just be piling on the pixels.

Can you imagine what Apple would charge for a dual GPU iMac and the sheer amount of butthurt that price would cause?

Apple has consistently increased the resolution of the iMac displays over time as higher resolution panels have become available. With OS X, there is less reason to simply pixel double the way they did for iOS.

The 15-inch MacBook Pro with Retina display uses a Kepler (GK107) based NVIDIA GeForce GT 650M, but yes, it was barely enough horsepower to get the job done.

Assuming that they work within the limitations of DisplayPort 1.2, they could have gone with a 3840x2160 or 3840x2400 panel, and squeaked by with a GTX 760M or 765M for the base model. Sure, you wouldn't be gaming at native resolution, but the UI would be fluid enough.

27-inch panels at those resolutions are just not available for any remotely reasonable price yet, i.e. sub $1000, and I'm not sure of anyone making a native eDP panel that supports a single HBR2 main link yet.

Edit: After doing some quick math, a 27" 3840x2160 display would be around 163 ppi and would become Apple's 1 arc minute definition of "Retina" at a viewing distance of around 22 inches or 55 cm, which I would imagine is a pretty typical viewing distance for these machines.
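That quick math can be reproduced from the 1 arc-minute criterion; a sketch (the small difference from the ~22 inch figure above is just rounding):

```python
import math

def retina_distance_in(width_px, height_px, diagonal_in):
    """Distance at which one pixel subtends 1 arc minute of visual angle."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    pixel_in = 1 / ppi  # physical size of one pixel, in inches
    return pixel_in / math.tan(math.radians(1 / 60))

d = retina_distance_in(3840, 2160, 27)
print(round(d), "inches /", round(d * 2.54), "cm")  # 21 inches / 54 cm
```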

1. NO PERFORMANCE LEAPS OVER 2012 MODEL: Blame it on Intel. The Haswell CPUs are hardly any faster than the ones they replaced. There is nothing Apple can do about that.

2. NO SSD STANDARD: At least the SSD IS AVAILABLE. The only reason for it being an option is that the iMac is for price-sensitive consumers. The standard hard drive allows the iMac to be as inexpensive as possible. Thus, this is not a weakness or bad feature as much as a feature allowing the consumer to pay as little as possible for an iMac.

Why would anyone buy this over an HP Touchsmart? The last thing you want for a kitchen PC is to mess with a mouse or trackpad. A touchscreen-enabled all-in-one PC with even a rudimentary front end is all most people need to watch videos or look up recipes. An iMac with a touchscreen that can run iOS as an overlay would be killer.

Quote:

Why would anyone buy this over an HP Touchsmart? The last thing you want for a kitchen PC is to mess with a mouse or trackpad. A touchscreen-enabled all-in-one PC with even a rudimentary front end is all most people need to watch videos or look up recipes. An iMac with a touchscreen that can run iOS as an overlay would be killer.

Because they want to run OS X? Because they like the look of it? Because they don't want to get arm ache using a computer? Because most PCs don't live in the kitchen?

There's already an Apple product sporting a touchscreen that runs iOS. It's called an iPad.