Posts tagged Apple
Abyssinia, Hackintosh

After two years of service, my Hackintosh has been retired. Well, the Mac half has. The Windows half is still in the middle of Jedi: Fallen Order.

The story, for those unfamiliar (aka everybody): two years ago, the iMac that took me through film school and my first years in Los Angeles finally kicked the bucket. While I have no doubt that I killed it through a combination of multicam video editing, YouTube, and Overwatch, it doesn't change the fact that my iMac was how I looked for work, wrote my stuff, and edited the then-still-alive Blue Post Podcast.

I nabbed the first appointment I could get at a nearby Apple Store and had them take a peek inside. By the time they finished their tests, they determined that the logic board (Applespeak for motherboard) had failed. The cost to repair my six-year-old machine: almost $800.

So there I was, mostly broke, unemployed, and still in need of a computer. These were also the dog days of Apple's updates for the Mac, so machines on the low-end weren't great — the Mac mini of the time was basically an expensive paperweight with an HDMI port — and a full-on replacement was more than I could rationally spend on a new computer. After all, I still needed money for groceries. 

While my roommate was at work, he would let me borrow his laptop so I could research my options. Since Los Angeles is a land of entertainment professionals who swear by the Mac, I discovered a few local retailers who sold used devices. I could get an iMac that was only a few years older than the dead one on my desk. For $800.

Unless I sacrificed video editing and limited my video games to text-based adventures, I was going to spend $800 on an old machine¹. And buying an old computer that I would need to replace soon thereafter felt like throwing money away. And then I remembered hackintoshing.

Months earlier, while still gainfully employed, I had started planning to build my own gaming computer. Boot Camp on my iMac was fine, but the graphics card was getting old. And while my iMac could run circles around the average PC when I bought it, the fact that I couldn't upgrade my graphics card put games like Grand Theft Auto V or The Witcher 3 out of reach². Having a dedicated gaming machine that I could update over time would alleviate that problem, and would enable me to spend less money on a replacement for my iMac³.

I realized that with a bit of tweaking (and a few sacrifices around the CPU, memory, and hard drive), my planned gaming system could work as a hackintosh, and it would only cost me — wait for it! — $800.

I thought about it for a few days. There are some functions of MacOS that won’t work on a hackintosh, and spending $800 to build a new machine that should run MacOS is a tad riskier than buying an old machine that will run MacOS. But my financial sense won out. I had been planning to build a gaming computer and to replace my iMac with a new one at some point in the future. As much as it may have seemed foolish in the short term, in the long term I would end up spending less money overall — buying an old machine would have also cost me hundreds, and wouldn’t change either long-term goal. By building the machine, I would be spending money I was going to spend anyway, only earlier than intended and during a period of strained personal finances.

I loved it. I loved building my computer. I loved tinkering with my computer. I can’t wait until I can build another computer⁴.

I will never use a hackintosh as my primary system ever again.

Yes, the Hackintosh was sometimes fun to tinker with⁵ — especially when I was in-between jobs and had the time — but this past year was like watching the slow, gradual death of my love affair with my machine. After so many Final Cut crashes and Nvidia driver glitches, I knew that the death of the Hackintosh was inevitable. It's untenable to have work come to a standstill because your machine is, well, hacked. It was simply a matter of having the money on hand to buy a new iMac.

Which I did back in December.

As much as I enjoyed building my own computer, there is something rather pleasant about buying a computer, only having to worry about changing a few options — if any — and knowing that the machine will work⁶.


  1. I could've bought a piece-of-shit Windows machine, but unemployment was miserable enough on its own.

  2. Yes, my machine could run the games, but even with my shitty eyesight, 20fps at low settings isn't playable.

  3. While Final Cut Pro and Adobe Premiere will take advantage of a graphics card, video editing has ended up being more hobby than career, so spending the hundreds of dollars Apple would charge to upgrade the graphics card wouldn't be worth it if the computer weren't also being used for gaming.

  4. My current thought is to update the graphics card every 3rd generation, and to build a new computer every six years. With the slowdown in CPU innovation we saw this past decade (looking at you, Intel), I could probably use this system for more than six years. But, if part of the fun is building the machine, why not build a new one if I can afford it?

  5. I won’t say that I’ll never build another hackintosh (hello, media server!), but one will never again be my primary machine.

  6. To provide even more context on how replacing the Hackintosh with an iMac made my work easier: the Geekbench scores between the two machines were close. The Hackintosh won both graphics and single-core, and the iMac won multicore. You’d think that’d mean I’d see little improvement, but Final Cut imports and exports of Hashtag General broadcasts went from being measured in hours to minutes. The advantage of running MacOS on a machine built to run it makes a difference.

 
Apple's Mac Pro Pro Problem

Over the past few years, you wouldn’t need to go far to find pro-level users upset at Apple over the dearth of high-end Mac hardware from 2013ish until nowish.

Things are getting better. The iMac Pros are beasts, the rumors about the next MacBook Pro update are promising, and the soon-to-be-released Mac Pro is more computer than the average person will ever use or need. But while Apple’s recent efforts have all but overcome the weaknesses in their line-up, a recent article that happened across my desk has raised some red flags.

In the Visual Effects Roundtable over at postPerspective, the following question was asked of almost¹ every industry professional: “How will real-time ray tracing play a role in your workflow?”

Currently, the only graphics cards with real-time ray tracing tech are made by Nvidia.

Apple doesn't use Nvidia cards. Apple doesn't support Nvidia cards. Apple won't even sign drivers that Nvidia made on their own². Turns out that the ole' Apple-Nvidia blood feud is still alive and kickin’ at Apple Park.

Admittedly, I have some skin in the game — my Hackintosh uses an Nvidia graphics card, and the lack of driver support in both Mojave and Catalina has negatively impacted my editing workflow, as I am no longer able to run the most up-to-date version of Final Cut Pro X. While getting official Nvidia support would be a boon to me, that's beside the point. Why make one of the most powerful desktops on the market if it is unable to use the most advanced graphics cards? For a VFX house looking at investing in new technology, if they are even thinking about real-time ray tracing, all Apple devices are immediately out of contention. The lack of Nvidia cards will undermine the Pro in Mac Pro.


  1. Notably, they didn’t ask the AMD representative this question.

  2. While nothing is technically stopping Nvidia from writing and releasing their own drivers, because Apple won't sign them the OS will be unable to verify the integrity or legitimacy of the file. Understandably, Nvidia is leery of releasing them this way.

Tech · Logan Stoodley · Apple, Mac Pro
Apple’s Billion-Dollar Bet on Hollywood Is the Opposite of Edgy

Lucas Shaw for Bloomberg:

However, Apple isn’t interested in the types of shows that become hits on HBO or Netflix, like Game of Thrones—at least not yet. The company plans to release the first few projects to everyone with an Apple device, potentially via its TV app, and top executives don’t want kids catching a stray nipple. Every show must be suitable for an Apple Store. Instead of the nudity, raw language, and violence that have become staples of many TV shows on cable or streaming services, Apple wants comedies and emotional dramas with broad appeal, such as the NBC hit This Is Us, and family shows like Amazing Stories. People pitching edgier fare, such as an eight-part program produced by Gravity filmmaker Alfonso Cuarón and starring Casey Affleck, have been told as much.

So, a whole lotta 'meh'.

Notes on the Apple TV 4K HDR UHD with Super DolbyVision

Lots to digest in Apple's latest update to the Apple TV. But, first, we need to come up with something less tech nerdy than 4K. And HDR. And Ultra High Definition. I know what they all mean, and I still feel like I'm about to go cross-eyed. Just think, these are phenomenally worse than Super Retina Display, and Super Retina Display is bad. But, I digress.

As I noted on Twitter, if Apple truly needed to use a Dolby Vision theater projection system to show off the device's full capabilities, an Apple TV coupled with the right display would rival the screen at your average theater. Having seen a film projected in Dolby Vision, I can assure you that the image quality is immaculate. If the Apple TV is even close to delivering that same quality, the dream of a high-end home theater is becoming even more attainable.

Where the Apple TV might falter compared to the theater is a question of compression. Compare the 1080p HD iTunes version of a film to its Blu-ray counterpart, and the Blu-ray looks better. So the question now is how the iTunes 4K HDR films compare to the 4K HDR Blu-rays that have started to hit the market. If the quality is close, Blu-ray might be the format for cinephiles, and digital the format for the masses.

But two other things to note:

  1. Apple got (some) studios to agree to the $19.99 pricing, with Disney being the notable holdout.

  2. Films previously purchased in HD will be upgraded to 4K for free.

Personally, I think the biggest problem Blu-ray faced—and the problem 4K content will face soon—was that people didn't necessarily want to shell out and buy a new copy of a movie they already own. For many, a DVD is good enough. Hell, I still have a bunch of DVDs that I have no intention of shelling out the money to replace with Blu-rays. Good on Apple and the studios for putting this update into place. This alone makes purchasing movies through iTunes more compelling.

Now, if the same thing applies to Digital Copy, it'll be even better. Buy once, 4K everywhere.

Logan Stoodley · Apple TV, Apple, 4K, HDR
Apple, Amazon Join Race for James Bond Film Rights

As much as MoviePass wants to be the industry disruptor, the big disruptor right now is all the tech money pouring into content.

As a Bond fan, I'm also interested (and, to be honest, a little worried) by these lines in Tatiana Siegel and Borys Kit's piece for The Hollywood Reporter:

“In the world of Lucasfilm and Marvel, Bond feels really underdeveloped,” says someone familiar with the bidding process.

And:

Some observers feel that the franchise, by only limiting itself to theatrical movies, remains vastly under-utilized by 21st century standards...

The bidding war for Bond just got more interesting.

Apple Is Planning a 4K Upgrade for Its TV Box

Mark Gurman and Anousha Sakoui for Bloomberg:

Apple is planning to unveil a renewed focus on the living room with an upgraded Apple TV set-top box that can stream 4K video and highlight live television content such as news and sports, according to people familiar with the matter.

For the average person, 4K is overrated. Not only is there a lack of content, but given how most people situate their living rooms, they'll also be sitting too far away to truly take advantage of 4K.

The new box will also be able to play content optimized for TVs capable of playing High Dynamic Range (HDR) video, which produces more accurate colors and a brighter picture.

HDR is definitely the future of display technology. From TVs, to phones, to computer monitors, it'll soon be considered standard. 

Logan Stoodley · 4K, HDR, Apple, Apple TV