Over the past few years, you wouldn’t need to go far to find pro-level users upset at Apple over the dearth of high-end Mac hardware from 2013ish until nowish.
Things are getting better. The iMac Pros are beasts, the rumors about the next MacBook Pro update are promising, and the soon-to-be-released Mac Pro is more computer than the average person will ever use or need. But while Apple’s recent efforts have all but overcome the weaknesses in their line-up, a recent article that happened across my desk has raised some red flags.
In the Visual Effects Roundtable over at postPerspective, the following question was asked of almost¹ every industry professional: “How will real-time ray tracing play a role in your workflow?”
Currently, the only graphics cards with real-time ray tracing hardware are made by Nvidia.
Apple doesn’t use Nvidia cards. Apple doesn’t support Nvidia cards. Apple won’t even sign drivers that Nvidia made on their own². Turns out the ol’ Apple–Nvidia blood feud is still alive and kickin’ at Apple Park.
Admittedly, I have some skin in the game — my Hackintosh uses an Nvidia graphics card, and the lack of driver support in both Mojave and Catalina has negatively impacted my editing workflow, as I am no longer able to run the most up-to-date version of Final Cut Pro X. While getting official Nvidia support would be a boon to me, that’s beside the point. Why make one of the most powerful desktops on the market if it is unable to use the most advanced graphics cards? For a VFX house looking at investing in new technology, if they are even thinking about real-time ray tracing, all Apple devices are immediately out of contention. The lack of Nvidia cards will undermine the Pro in Mac Pro.
¹ Notably, they didn’t ask the AMD representative this question.
² While nothing technically stops Nvidia from writing and releasing their own drivers, Apple won’t sign them, so the OS would be unable to verify the integrity or legitimacy of the files. Understandably, Nvidia is leery of releasing them this way.