In what’s turning out to be an oddly GPU-centric week for Apple, this morning the company has revealed that they will finally be giving the long-neglected Mac Pro a major update in the 2018+ timeframe. Apple’s pro users have been increasingly unhappy with the lack of updates to the company’s flagship desktop computer, and once released, this update will be the machine’s first in over four years.

Getting to the heart of matters, Apple invited a small contingent of press – including John Gruber and TechCrunch’s Matthew Panzarino – out to one of their labs to discuss the future of the Mac Pro and pro users in general. The message out of Apple is an odd one: they acknowledge that they erred in both the design and handling of the Mac Pro (as much as Apple can make such an acknowledgement, at least), and that they will do better for the next Mac Pro. However, that Mac Pro won’t be ready until 2018 or later, and in the meantime Apple still needs to assuage their pro users – to prove to them that they remain committed to the Mac desktop and to professional use cases.

Both of these articles are very well written, and rather than regurgitate them, I’d encourage you to read them. It’s extremely rare to see Apple talk about their future plans – even if it’s a bit vague at times – so this underscores the seriousness of Apple’s situation. As John Gruber puts it, Apple has opted to “bite the bullet and tell the world what your plans are, even though it’s your decades-long tradition — a fundamental part of the company’s culture — to let actual shipping products, not promises of future products, tell your story.”

However, neither story spends much time on what I feel is the core technical issue – Apple’s GPU options – so I’d like to spill a bit of ink on the subject, if only to provide some context for Apple’s decisions.

Analysis: GPUs Find Their Sweet Spot at 250 Watts

From a GPU perspective, the Mac Pro has been an oddball device from day one. When Apple launched it, they turned to long-time partner AMD to provide the GPUs for the machine. What AMD provided them with was their Graphics Core Next (GCN) 1.0 family of GPUs: Pitcairn and Tahiti. These chips were the basis of AMD’s Radeon HD 7800 and HD 7900 series cards launched in early 2012. And by the time the Mac Pro launched in late 2013, they were already somewhat outdated, with AMD’s newer Hawaii GPU (based on the revised GCN 1.1 architecture) having taken the lead a few months earlier.

Ultimately Apple got pinched by timing: they would need to have chips well in advance for R&D and production stockpiling, and that’s a problem for high-end GPU launches. These products just have slow ramp-ups.

Compounding matters is the fact that the Mac Pro is an intricate device. Apple favored space efficiency and low noise over standard form factors, so instead of using PC-standard PCIe video cards for the Mac Pro, they needed to design their own cards. And while the Mac Pro is modular to a degree, this ultimately meant that Apple would need to design a new card for each generation of GPUs. This isn’t a daunting task, but it limits their flexibility in a way they weren’t limited with the previous tower-style Mac Pros.

Mac Pro Assembled w/GPU Cards (Image Courtesy iFixit)

The two issues above have been known since the launch of the Mac Pro, and have commonly been cited as potential reasons holding back a significant GPU update all these years. However, as it turns out, this is only half of the story. The rest of the story – the consequences of Apple’s decision to go with dual GPUs sharing a heatsink via the thermal core – has only finally come together with Apple’s latest revelation.

At a high level, Apple opted to go with a pair of GPUs in order to chase a rather specific use case: using one GPU to drive the display, and using the second GPU as a co-processor. All things considered this wasn’t (and still isn’t) a bad strategy, but the number of applications that can use such a setup is limited. Graphical tasks are hit & miss in their ability to make good use of a second GPU, and GPU-compute tasks still aren’t quite as prevalent as Apple would like.

The drawback to this strategy is that if you can’t use the second GPU, two GPUs aren’t as good as one more powerful GPU. So why didn’t Apple just offer a configuration with a single, higher power GPU? The answer turns out to be heat. Via TechCrunch:

I think we designed ourselves into a bit of a thermal corner, if you will. We designed a system that we thought with the kind of GPUs that at the time we thought we needed, and that we thought we could well serve with a two GPU architecture… that that was the thermal limit we needed, or the thermal capacity we needed. But workloads didn’t materialize to fit that as broadly as we hoped.

Being able to put larger single GPUs required a different system architecture and more thermal capacity than that system was designed to accommodate. And so it became fairly difficult to adjust.

The thermal core at the heart of the Mac Pro is designed to be able to cool a pair of moderately powerful GPUs – and let’s be clear here, at around 200 Watts each under full load, a pair of Tahitis adds up to a lot of heat – however it apparently wasn’t built to handle a single, more powerful GPU.

The GPUs that have come to define the high-end market, like AMD’s Hawaii and Fiji GPUs or NVIDIA’s GM200 and GP102 GPUs, all push 250W+ in their highest performance configurations. This, apparently, is more than Apple’s thermal core can handle. In terms of total wattage, just one of these GPUs would be less than a pair of Tahitis, but it would be 250W+ concentrated over a relatively small surface area, as opposed to roughly 400W spread over nearly twice the surface area.
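To put rough numbers on that power-density argument, here is a back-of-the-envelope sketch. The heatsink contact area is a placeholder assumption (Apple has never published the thermal core’s dimensions); only the ratio between the two configurations matters:

```python
# Back-of-the-envelope power-density comparison for the Mac Pro's
# thermal core. The card area below is an illustrative placeholder,
# not an Apple figure; the point is the ratio, not the absolute value.

def power_density(watts: float, area_cm2: float) -> float:
    """Heat flux in watts per square centimeter."""
    return watts / area_cm2

# Assume each GPU card spreads its heat over ~100 cm^2 of the core.
CARD_AREA_CM2 = 100.0

# Two Tahiti-class GPUs: ~200 W each, spread across two card faces.
dual_tahiti = power_density(2 * 200, 2 * CARD_AREA_CM2)

# One high-end GPU: ~250 W concentrated on a single card's footprint.
single_highend = power_density(250, CARD_AREA_CM2)

print(f"Dual Tahiti:     {dual_tahiti:.1f} W/cm^2")    # 2.0 W/cm^2
print(f"Single 250W GPU: {single_highend:.1f} W/cm^2")  # 2.5 W/cm^2
```

Even though the single GPU dissipates less total heat, its heat flux at the heatsink is some 25% higher – which is the corner Apple describes designing themselves into.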

Video Card Average Power Consumption (Full Load, Approximate)

GPU (Card)                      Power Consumption
AMD Tahiti (HD 7970)            200W
AMD Hawaii (R9 290X)            275W
AMD Fiji (R9 Fury X)            275W
NVIDIA GM200 (GTX Titan X)      250W

It’s a strange day when Apple has backed themselves into a corner on GPU performance. The company has been one of the biggest advocates for more powerful GPUs, pushing the envelope on their SoCs, while pressuring partners like Intel to release Iris Pro-equipped (eDRAM-backed) CPUs. However what Apple didn’t see coming, it would seem, is that the GPU market would settle on 250W or so as the sweet spot for high-end GPUs.

Mac Pro Disassembled w/GPU Cards (Image Courtesy iFixit)

And to be clear here, GPU power consumption is somewhat arbitrary. AMD’s Fiji GPU was the heart of the 275W R9 Fury X video card, but it was also the heart of the 175W R9 Nano. There is clearly room to scale down to power levels more in line with Apple’s thermal budget, but performance is lost in the process. Without the ability to cool a 250W video card, it’s not possible to offer GPU performance that rivals the powerful PC workstations Apple is still very much in competition with.

Ultimately I think it’s fair to say that this was a painful lesson for Apple, but hopefully one they learn from. The lack of explicit modularity and user-upgradable parts in the Mac Pro has always been a point of concern for some customers, and this has ultimately made the current design the first and last of its kind. Apple is indicating that the next Mac Pro will be much more modular, which would get them back on the right track.

Source: Daring Fireball

Comments Locked


  • Meteor2 - Tuesday, April 4, 2017 - link

    I really like the trash can Mac. Part of me thinks that's how a computer _should_ look, all custom components like a 1980s Cray. I liked the Cube too. But the G3 and G5 Power Macs were specifically designed to be upgradable and were much better designs. It's amazing that Apple made the Cube mistake twice.

    Blaming heat is a red herring. As Anandtech's basic maths showed, the design could handle the heat rejection. The cost was redesigning those complex, custom mechanical designs after they'd misjudged the direction of software development. They didn't want to write off their costs or risk the same mistake twice, so they've just sat on it.

    Also, from the Daring Fireball article, 'Ternus put it plainly: “Some of our most talented folks are working on [the Mac]. I mean, quite frankly, a lot of this company, if not most of this company, runs on Macs. This is a company full of pro Mac users.”' -- Er, so, Macs aren't good enough for some or *most* of Apple's workforce? That says a lot about the adequacy of Windows today...
  • Morawka - Tuesday, April 4, 2017 - link

    Too little, too late, Apple. Good luck getting those customers back. Why would they trust you again and go above and beyond to change workflows when Microsoft is working great?
  • osxandwindows - Tuesday, April 4, 2017 - link

    You sure they didn't remove windows and install linux on it?
  • TEAMSWITCHER - Tuesday, April 4, 2017 - link

    Because Microsoft isn't the end all and be all anymore. I simply can't do my job anymore without a Mac. You can blame iPad and iPhone for that.
  • Meteor2 - Wednesday, April 5, 2017 - link

    That suggests to me that you're doing it wrong...
  • zodiacfml - Tuesday, April 4, 2017 - link

    As someone mentioned already, it is not because of thermals or anything technical. It is because they can't justify the time and cost to design custom parts for such a low-volume product compared to their cash cow, iPhones.
    Now, they are hoping for a design that can take off-the-shelf PC parts and won't require a redesign for the next 4 to 6 years.
  • ABR - Tuesday, April 4, 2017 - link

    I knew that without any easy path for upgrading GPUs this product was dead in the water. That's where performance evolves today and is why plenty of people are still using tower Mac Pros happily that perform as well as or better than the trash cans for compute-intensive tasks.
  • Wolfpup - Tuesday, April 4, 2017 - link

    That's really interesting that there's a thermal limit...and one just below what it needs to be.

    I still wish they'd release a "consumer" version with... well, using this design, let's say normal consumer CPUs + as high-end a GPU as you can cram into 200 watts, and bring the prices down $1000 for all configurations...
  • iwod - Tuesday, April 4, 2017 - link

    Do we have any other cooling solution that would fit within that space for a 50% increase in heat (300W)?
    Would a bigger fan have done it, or does Apple think the noise would be too loud?
    Would pros still have complaints if the design could fit two 300W GPUs inside?

    And finally, none of these – GPU thermals or modular design – has ANYTHING to do with the Mac Pro still being on 22nm Ivy Bridge, DDR3 memory, and an old FireGL GPU. The latest top-end Radeon Pro is only 175W.
  • cbm80 - Tuesday, April 4, 2017 - link

    Apple should do a Kickstarter campaign to raise funds for a new case design.
