AMD Radeon — is it 'pro' enough for the MacBook Pro?

As soon as Apple announced the new MacBook Pro, there were rumblings that the 15-inch version's AMD Polaris graphics were underpowered and insufficient for a "pro" machine, and that NVIDIA's lower-power Pascal options would've been a better choice. In the Windows world of PCs and DirectX, I would agree. But this is Apple's world. The question of whether or not a GPU is sufficient has to be looked at in the context of why this machine exists in the first place.

The MacBook Pro is not and has never been built for high-end gaming. EVER. Gaming demands different hardware priorities than what Apple targets with this machine. The MacBook Pro was designed for video editors, photo editors, and music producers who are not always at a desk and who want elevated performance in comparison to the MacBook (non-Pro).

The MacBook Pro is not and has never been built for high-end gaming.

Apple wants to become self-sufficient with the APIs that control its OS, as well as with the hardware architecture it employs. In Apple's perfect world, the software and silicon would be 100% designed in-house. AMD's GPUs give Apple the closest shot at achieving this, especially alongside Apple's own graphics framework, Metal.

And that's why it's highly unlikely that we will ever see another NVIDIA GPU in any Apple product again.

So, why no NVIDIA?

NVIDIA understands the customer base that provides the best opportunity to deliver value and to make money. Outside of the high-end Tesla/Titan range of GPUs, you have the GeForce line. Traditionally, as a developer, you would want to get the best performance out of DirectX 11/12 and, more recently, Vulkan.

These APIs are designed to run on thousands of configurations. Vulkan and DX12 are more optimized than ever, but they still don't match what a developer could do with more direct access to the hardware. Why does this matter? It's the basis for understanding why NVIDIA consumer GPUs behave the way they do.

NVIDIA GPUs are awesome for gaming on Windows, which is primarily what they were designed for. Apple did not design the MacBook Pro for gaming. So, NVIDIA spends a lot of time and money optimizing its business for a scenario that is the opposite of Apple's goals. The additional areas where NVIDIA excels, such as data centers and deep learning, are the result of its hard work in the gaming arena over the last 20 years. As a whole, modern consumer NVIDIA GPUs (and their driver stack) are designed to run DirectX games and CUDA applications as fast and efficiently as possible in a Windows environment.

And Apple has Metal?

Yup. Because macOS doesn't have DirectX, but instead has Apple's own Metal, the playing field changes. Metal was developed from the ground up by Apple to be fast and efficient in the way that it feels is beneficial to its customers. While I'm unable to disclose their names, a few colleagues who are game developers at large organizations have shared their experiences working with NVIDIA.

Metal was developed from the ground up by Apple to be fast and efficient in the way that it feels is beneficial to its customers.

NVIDIA is a class organization and is very supportive of its customers and developers. If you decide to leave the umbrella of CUDA (the programming model for NVIDIA GPUs), though, then you're on your own. Some features that are afforded to developers are not actually built into NVIDIA GPUs to be used directly, in the traditional way. NVIDIA's world-class engineers bind those pieces together with CUDA and, for games, GameWorks.

NVIDIA has pushed many resources into CUDA, and developers who ignore it lose those features and that functionality. Its hardware is designed in such a way that CUDA complements it, and that's a good thing... for everyone but Apple.

If you want to get the most out of an NVIDIA GPU, you need to use CUDA. I'm told that no matter who you are, NVIDIA will not help you if you decide to go "lower than CUDA," and will not technically support that journey.

How does AMD help?

AMD has similar goals to NVIDIA but is in a different situation financially and competitively. While AMD may not have the fastest consumer GPUs in raw speed, it won't stop you or discourage you from programming straight to the absolute lowest level of the hardware. You know, to the metal! As a matter of fact, it's encouraged.

AMD won't stop you or discourage you from programming straight to the absolute lowest level of the hardware.

AMD's Graphics Core Next (GCN) architecture is more flexible and open than any NVIDIA architecture from Fermi through Pascal. When I say "flexible," I mean the ability to develop your own path for how you want to use the GPU. Because of this flexibility, a developer, in this case Apple, can build an API on top of a GCN-based GPU with very few restrictions, while using the silicon as efficiently as possible.

That brings us back to Metal?

Exactly. Apple's goal is to deliver power-efficient, fast, light machines, tuned exactly to their specific use cases. AMD's GCN is the only hardware on the Mac that frees Apple from fixed functions. Intel delivers fixed-function x86 chips, and NVIDIA would deliver highly optimized CUDA chips, but AMD delivers a chip with few fixed functions and an open platform. Apple can have its way with AMD GPUs, coaxing high-end performance out of what I call "mid-range Windows parts." A sketch of what that looks like in code follows below.
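To make "having its way" concrete, here is a minimal sketch of a Metal compute dispatch in Swift. This is my own illustration against the public Metal API, not Apple's code; the kernel name double_values and the sample data are hypothetical. The point is how little sits between the app and the GPU: the app compiles the kernel, builds the pipeline state, places the memory, and schedules the command buffer itself.

```swift
import Metal

// A trivial Metal Shading Language kernel, compiled at runtime for brevity.
// (A real app would precompile it into the default library.)
let source = """
#include <metal_stdlib>
using namespace metal;
kernel void double_values(device float *data [[buffer(0)]],
                          uint id [[thread_position_in_grid]]) {
    data[id] = data[id] * 2.0;
}
"""

guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal-capable GPU found")
}
let library = try! device.makeLibrary(source: source, options: nil)
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "double_values")!)

// .storageModeShared maps one allocation into both CPU and GPU address
// spaces -- the app, not the driver, decides where memory lives.
var input: [Float] = [1, 2, 3, 4]
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// Encode and submit the work explicitly.
let queue = device.makeCommandQueue()!
let commands = queue.makeCommandBuffer()!
let encoder = commands.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
encoder.dispatchThreadgroups(MTLSize(width: 1, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: input.count,
                                                            height: 1, depth: 1))
encoder.endEncoding()
commands.commit()
commands.waitUntilCompleted()

// Read the results straight back out of the shared allocation.
let out = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
print((0..<input.count).map { out[$0] })  // [2.0, 4.0, 6.0, 8.0]
```

Every step here is visible to, and controlled by, the developer. That explicitness is what "thin API over flexible silicon" buys you.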

But wait! Doesn't Metal work on NVIDIA GPUs right now?

Yep. But there's only so far Apple would likely be willing to go with any NVIDIA GPU, because of the "getting to the metal" issue described above.

Will the 2016 MacBook Pro with AMD Polaris graphics be able to perform at a "high-end" level?

Think of it like this: For a long time, Apple licensed designs from PowerVR for the GPUs inside its iOS devices. Those devices were usually at the top of their class, outperforming most Android devices that used the same or similar designs. This was due to Apple's solid implementation. Anytime you have hardware as flexible as AMD's, building the stack on top of that hardware is essentially like developing a custom GPU (in theory). You have the ability to control nearly everything going in and out of that chip, as well as memory manipulation at the lowest level.

That's what AMD gives Apple on the Mac.
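As a rough sketch of what that lowest-level memory control looks like through Metal (again my own illustration of the public API, with hypothetical buffer names): the developer explicitly chooses whether a buffer lives in shared system memory or in a discrete GPU's private VRAM, and explicitly schedules any copy between the two.

```swift
import Metal

guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal-capable GPU found")
}
let queue = device.makeCommandQueue()!
let length = 1024 * MemoryLayout<Float>.stride

// Shared: one allocation visible to both CPU and GPU (system memory).
let staging = device.makeBuffer(length: length, options: .storageModeShared)!

// Private: lives in VRAM on a discrete GPU. The CPU can never touch it,
// which lets the GPU use its fastest memory path.
let gpuOnly = device.makeBuffer(length: length, options: .storageModePrivate)!

// The transfer between the two is an explicit, developer-scheduled blit,
// not something a driver does behind your back.
let commands = queue.makeCommandBuffer()!
let blit = commands.makeBlitCommandEncoder()!
blit.copy(from: staging, sourceOffset: 0,
          to: gpuOnly, destinationOffset: 0, size: length)
blit.endEncoding()
commands.commit()
commands.waitUntilCompleted()
```

In a driver-heavy stack these placement and transfer decisions are made for you; here they're part of the application's own design, which is exactly the kind of control Apple built Metal around.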

So, yes, the new MacBook Pro will deliver high-end performance: in video editing, Final Cut Pro on this machine will outperform a Windows machine running Adobe Premiere by a large margin. It will also drive 4K and 5K displays, even dual 5K displays. The GPU will not be a bottleneck, and this machine will perform like a pro.

If you don't believe me, go try one out for yourself.

Dexter Talbert
23 Comments
  • Apple does not develop MacBook Pros for gaming. Why would they? There is already a mature and established game console market, and those devices are pretty good in terms of price/performance. Yet MacBook Pros are the only *portable* gaming solution for both platforms. Buyers are not confined to a single platform and can choose the best games from Windows or macOS. That matters because macOS implementations of many games may be somewhat better in terms of packaging/DLC/price than their Windows counterparts. The new MacBook Pros especially excel in game/level load times thanks to the fastest SSD implementation in the industry (NVMe over PCIe instead of SATA). The MacBook Pro's Force Touch trackpad may even eliminate the need to carry a separate mouse or joystick; it is that fluent and pleasant to play a game with. The number of games supported by Intel HD graphics grows with every iteration, and where it isn't enough, AMD Polaris comes to the rescue. That mobile GPU may not give the performance of a full-blown desktop GPU in terms of FPS, but at least you're able to carry your games along; FPS is not everything. And since MacBook Pros are built with low-power mobile components to cope with heat, you can play your games as long as you want without fear of burning your laptop's motherboard or running into processor throttling.
  • The only game running under WINE that my Mac can handle without the fans spinning at full speed is Duke Nukem: Manhattan Project, using Windows API calls. If you start playing Need for Speed: Undercover under WINE, your Mac turns into a 'toaster'. Could also be sloppy coding.
  • Good info, but totally irrelevant. You can install Windows on a separate partition, then you boot into Windows and your Mac becomes a Windows laptop. This is called Boot Camp. Although you can play Windows games under a Parallels VM, Parallels doesn't support more than 1GB of VRAM. This is why at some point Boot Camp becomes indispensable. Under Boot Camp, Windows has full native access to the MacBook's hardware, including the GPU.
  • Relating to the heat issue, I was having serious issues with kernel_task throttling my 2013 2.3GHz 15" MBP when running heavy ProTools sessions and driving 2 large monitors. The fans were constantly running at top speed and the machine was always running very hot. My 2016 2.9GHz 15" MBP with RP 460 runs the same sessions smoothly with the fans running between 2450-2650 rpm, and I can barely hear them. Informative article, Dexter!
  • Apple had a real issue with heat in the early-2011 MacBook Pro 15". The lead-free solder gave up and the GPU loosened, causing display artifacts or the display not working at all. Apple initiated a free logic board replacement program. That machine ran at 203°F / 95°C under heavy load. Last year's 15" Retina MBP runs at 147°F / 64°C; that is big progress, and I see from your experience that the new MBPs are even better. These examples show why the use of low-power mobile components in a laptop is such a crucial issue.
  • That's a huge improvement! Way to go Apple!
  • The problem (if it affects somebody) is that FCP is pretty much the only application where Apple has this performance benefit, since big players like Adobe do support CUDA but barely support the choices Apple made. Large Photoshop automation jobs on a Windows PC with an NVIDIA GPU run circles around a Mac with an AMD card... As far as you can replace these products with alternatives (like Affinity), this works OK, but if you can't, Apple does not really offer anything. And while I do not suffer from this (I avoid Adobe like the plague and do use FCP), I do think that Apple, making the highest profits in that segment, should offer those who need it a choice here.
  • Do those Photoshop automated jobs use the Mac's discrete GPU or Intel's integrated GPU? If they don't support every mainstream GPU because of lazy programming this is Photoshop's fault. Also, if it is a Macbook Pro that you mention, Photoshop may default to Intel graphics to preserve battery.
  • Does not make much of a difference (you can turn GPU switching off on the MBP and thus enforce the use of the dedicated GPU). The way it looks, some operations (like Gaussian Blur) seem to use CUDA or not offload to the GPU at all. Once a computationally heavy task takes place on the CPU only, the performance difference becomes huge. Sure, it is Photoshop's fault (assuming they could implement it properly using OpenCL, or whatever Apple is using for that particular card), but that does not really help the person using it. Well, maybe it is only a matter of time. Adobe has never been fast with anything, and since people pay them rent instead of choosing to pay for upgrades when they are worth it, it certainly did not get any better.
  • In this case, it would be Metal. Adobe originally planned (informally) to use Metal across CC apps on the Mac. Last year I contacted them about this and never received a response. Coincidentally, in March I bumped into a few Adobe employees in Austin, TX at SXSW and asked about Metal. I was told that they are indeed working on it, but it's not a priority. Earlier this year, when Adobe CC was updated to version 15.3, it included "some" Metal support for Premiere Pro CC. I'm not sure about its performance, as I stopped using Premiere about a year ago. Also, as far as Metal is concerned, Final Cut isn't the only app that utilizes it; macOS as a whole uses it. It also has the ability to use multiple GPUs for compute purposes, caching, extra memory, and tons of other flexible things. There are still some legacy functions lying around, and they fall back to OpenCL and OpenGL. I plan to do an in-depth Metal article and video that includes benchmarks and comparisons.
  • "It's highly unlikely that we will ever see another NVIDIA GPU in any Apple product again". I find that hard to believe, especially considering NVIDIA was hiring for engineers recently to work on future GPU's for use in Apple products.
  • While I wish this to be true based on some reports that I saw earlier this year (I'm an nVidia fanboy), nVidia would really have to open up a little as far as pricing and architecture to suit Apple's needs. Based on nVidia's roadmap we won't see a new architecture from them until 2018. So what's left? Mac Pro and iMac could use some Pascal parts in the near future. However, there have been AMD Zen rumors coming to iMac. I guess we'll see here in the coming months, Apple has shocked me before.
  • If nVidia never supports developers who don't use its proprietary middle-layer software CUDA, then it will never be used in Apple's products.
  • So.. is that (VERY) expensive GPU good for anything other than Final Cut Pro? It seems like this device is good for video editors who use FCP, but for nobody else. It's too expensive, doesn't have great sRGB or AdobeRGB coverage (OK, this can be forgiven since it has a different color target), and under load it lasts VERY little (55 min vs. the Dell XPS 15's 105 min). It's an expensive FCP editor that has good battery life only when you're using it for email and web browsing. The storage speed is very impressive, though!
  • You're comparing apples to oranges. The Dell XPS 15 has an Nvidia GTX 960M, and Nvidia is the whole point of the article. You're in the wrong article; this is not a laptop comparison. This is about why Nvidia has no future with Apple in its present state. If you have some comments about that, you're welcome.
  • Article: "is the AMD a good choice? Yes, because it allows Apple to come closer to the hardware to provide Metal." Other commenters: "is Metal good for anything?" "Yes, it's great for FCP! This is a video editing machine!" My comment: "yeah but it sucks at being a mobile workstation, and it's good for only one thing, FCP. Not enough!" I should've added that I think N-Vidia would be a better choice to make my comment more complete. That way it would become more than a FCP machine. For example, plenty of scientists around me use Unix machines, but rely on CUDA for protein interaction simulations. They all use Macs, and would benefit from this machine having an N-Vidia chip.
  • FCP is only an example of how Metal provides better performance than Nvidia's CUDA. Interpreting this as "AMD is only useful for FCP" is silly. Metal is an architecture and is available to every developer. Thanks to Metal, developers can fine-tune the GPU, make it sing to their wishes. This is exactly what CUDA does not provide and instead tries to prevent. And your protein interaction point is exactly the point of the article, but in the negative sense: protein interaction simulations cannot benefit from CUDA, since CUDA is optimized for *Windows games*, not for general-purpose GPU usage. In order to fine-tune the GPU for protein simulations, your obsolete off-the-shelf application developers must support AMD with Metal. That is the whole point of the article.
  • I can't find a single scientific article using Metal for simulations. Also do you have any examples of its usage outside of FCP? Maybe the system UI, it is pretty smooth on Macs..? And isn't CUDA meant specifically for non-game GPU uses?
  • Using Metal for simulations doesn't require scientific articles; all the info and techniques are given in Apple's developer documentation. Using that info is just a matter of insight and intent. As for your second point, according to the article, no, CUDA is not meant specifically for non-game GPU uses: "NVIDIA GPUs are awesome for gaming on Windows, which is primarily what they were designed for". To check, get both CUDA's and Metal's documentation and just compare. Metal, as defined by Apple: "Render advanced 3D graphics and perform data-parallel computations. Get fine-grained access to the GPU while minimizing CPU overhead." The key point is "fine-grained access to the GPU." In contrast, CUDA, as defined by Wikipedia: "The CUDA platform is a software layer that gives direct access to the GPU's virtual instruction set and parallel computational elements, for the execution of compute kernels." The key points here are "GPU's virtual instruction set" and "software layer". You write to a "layer", not to the GPU. The existence of some scientific applications for CUDA simply reveals that CUDA predates Metal. Such precedence doesn't mean that those scientific applications couldn't be better implemented in Metal.
  • Perhaps you are right and Metal is just a little too young. However, perhaps it is not used because of the extremely limited availability of the hardware: it exists only on a limited number of relatively weak GPUs, all sold by Apple at a premium price inside their machines. As a result, Metal might stay a great tool for optimizing FCP and not much else, and the choice of GPU was poor: there are more powerful cards out there with a broader range of applications (Vulkan, CUDA, yes, even gaming). To comment on your CUDA-for-games point: I looked (briefly) at the CUDA documentation, and I did not see any games being mentioned.. http://docs.nvidia.com/cuda/cuda-c-programming-guide/index.html#introduc... Edit: I am aware of Metal on iOS and its importance there. Still, Vulkan might be the future...
  • Anyway, competition will always exist, whether between APIs or processors. AMD isn't necessarily weak, nor does the existence of CUDA applications make Nvidia necessarily powerful. The point here is the architectural difference between the two GPU platforms and Apple's actual commitment to one. I believe Apple's choice is the right one, as we're collecting its fruits not only with FCP but also with many Metal games on iOS. Metal for macOS is very recent too, and thanks to the raw power of the new MacBook Pros, its applications will proliferate in universities as well. Remember, Apple is also a chip producer. CUDA for iOS? Nil... Protein simulation on an iPad? Right there, thanks to Metal.
  • Another iMore article on Metal: http://www.imore.com/metal-os-x-so-huge-i-no-longer-need-mac-pro
  • "on a 'high-end level' in video editing with Final Cut Pro in comparison to a 'Windows machine' running Adobe Premiere by a large margin." LMAFO. How blind you are! With the 2799$ for the highest spec Macbook pro 15inch, I can build a "windows machine" which have the performance that makes that Macbook Pro looks like a joke. And talk about the "high-end" level. If Disney uses Macbook Pro for their Big Hero 6 project, it would take a trillion years for rendering =)) =)))