Late last week, Apple unveiled its most consequential product in at least a decade. 10 years, 9 months, 2 weeks, and 2 days after Steve Jobs took the stage to unveil the first iPad, Tim Cook announced the release of three new Macs powered by Apple’s own M1 system on a chip (SoC)… changing the industry forever.
Hyperbole? I don’t think so…
Author’s note: In the interest of time, I won’t jump into the differences between ARM and x86 processors or Apple’s CPU history here. If you’re not familiar with ARM processors or why Apple is making this change, 9to5Mac published a great explainer earlier this year that you may want to check out before reading on.
This Changes Everything
Credit where credit is due: nobody does a keynote like Apple. Who else could get on stage, release what is essentially the same computer in three different shells (at three different price points, no less), and somehow still manage to herald the beginning of a revolution in the PC industry?
There I go with the hyperbole again, but I stand by it. What’s important about these new Macs is not the computers themselves. Whether the first-generation M1 Macs are a cause for celebration or ridicule is inconsequential; their arrival matters because of what they represent: a whole new class of personal computer, one that is ideally suited for working photographers and poised to change the face of the PC industry for the better (more on that in our full review).
What Apple does better than anybody else is lead-by-perfecting. It takes an idea that already exists in the wild, and turns it into something that is both functional enough and beautiful enough that people actually want to use it. This isn’t a dig; it’s a compliment, and a necessary role in the tech ecosystem.
Apple wasn’t the first to create a smartphone, but it was the first to create one most people wanted to buy in droves; it wasn’t the first to create a tablet computer, but it was the first to perfect the concept and now the iPad practically owns that market; and it’s not the first company (by any stretch) to create an ARM-based computer, but it did more for this segment in the past week than Microsoft was able to do in years.
The new M1 Macs are already outperforming expectations by a wide margin, but that’s not what makes this announcement “a true game-changer.”
What makes this announcement special is that Apple just single-handedly legitimized a new computing category because—to borrow a word the company loves to bandy around from time to time—it was “brave” enough to jump in with both feet.
What Happens Next
The praise (and criticism) for now is focused almost entirely on what these new computers mean for Apple. You’ll see plenty of reports about Apple’s vague performance benchmarks, why you should or shouldn’t order one of these new M1 Macs, and the imminent release of Adobe’s ARM-compatible apps. I don’t really care about any of that right now.
As far as I’m concerned, this first generation of Macs has already served its purpose, and that purpose is to light a fire under the a** of every major software developer who has been dragging their feet on the ARM architecture. Great performance out of a first-generation product is just the cherry on top.
x86 is undeniably powerful. There are things that x86 processors can do that the pared-down ARM architecture may not be able to match for a long time. But it’s also bloated, inefficient, and increasingly difficult to work with inside the powerful-and-portable-and-efficient computers that most creatives rely on for the brunt of their work.
ARM is the perfect match for exactly the scenario I’m describing, and Apple just provided the “escape velocity” needed to get the damn thing off the ground and into the stratosphere.
I’m not all that interested in these Macs per se (okay, that’s a lie…) but I am dying to see what happens next. It might take a year. It might take two. But creatives—and especially photographers—will soon have a whole new class of personal computer that is tailor-made for their workflow. Computers with 20-hour battery life, silent fan-less designs, and more-than-enough CPU performance to go around… all packed into an impossibly portable shell.
These computers will be made by Microsoft, HP, Dell, ASUS, and just about every other PC maker out there. But like it or not, when this shift finally takes place, we’ll have Apple to thank.
Like any sane columnist, I’ll admit that I could be totally off my rocker here: there is a small-but-real possibility that ARM-based Macs will become one of Apple’s failed experiments. Someday these three computers might be quirky collector’s items, like the Pippin game console or the hockey puck mouse.
It’s also possible that PC makers stick with x86, opting for the increasingly efficient new AMD and (when they finally catch up) Intel chips. The AMD-powered ASUS Zephyrus G14 is a great example of that trend.
Specificity is the enemy of prediction, which is why I’m staying away from outlining exactly when and how ARM-based PCs are going to emerge from the primordial oceans of tech. But I do believe beyond a shadow of a doubt that this is going to happen, and that it’s going to happen pretty fast.
Unlike iTunes Ping or the Apple Newton, ARM-based laptops are neither unnecessary nor ahead of their time; if anything, they’re overdue. Apple is just the first company to hit that requisite combination of market-driven incentives (bye-bye RAM upgrades) and market cap ($2.05 trillion) to jump in with both feet and practically force the rest of the industry to follow along.
Now that it’s happened, I’m beyond excited to see how the PC industry adapts to compete. In five years’ time, nobody will be looking back at this specific Apple Event to criticize how Apple laid out its benchmarks, or to point out that the same computer in three different shells maybe shouldn’t range from $700 to $1,300. But we may very well revisit this keynote over and over, like we do with the reveal of the first MacBook Air, as a moment that changed the future of the PC industry forever.
A true game-changer, hyperbole not required.
About the author: DL Cade is an art, science and technology writer, and the former Editor in Chief of PetaPixel. When he’s not writing op-eds like this one or reviewing the latest tech for creatives, you’ll find him working in Vision Sciences at the University of Washington, publishing the weekly Triple Point newsletter, or sharing personal essays on Medium.