Apple announced its $3,499 Vision Pro headset yesterday, and among all of the flashy demos, it got me thinking… what does “pro” actually mean for Apple’s new headset? While the iMac Pro, Mac Pro, and MacBook Pro have all been targeted at high-level professionals in the past, the audience for the Apple Vision Pro is a lot less obvious.
It’s one of the first times since the MacBook Pro in 2006 that we’ve seen Apple launch a “pro” device without a corresponding entry-level equivalent. And just like the MacBook Pro, the Apple Vision Pro was a “one more thing” surprise at the end of an Apple keynote. But the original MacBook Pro was obviously designed primarily for professionals in a way the Vision Pro isn’t.
The MacBook Pro was one of Apple’s first Macs to switch to Intel, announced alongside an Intel-powered iMac that was targeted more at consumers with a built-in iSight camera, DVD burning capabilities, and a bundle of digital lifestyle apps. The MacBook Pro was all about justifying the switch to Intel for power and, in particular, performance per watt. Steve Jobs stood onstage and even showed off SPECint benchmarks for CPU integer processing power during the announcement. Apple didn’t use any benchmarks to justify its “pro” label on the Vision Pro.
That’s probably because the “pro” label has long lost its meaning across the industry since those early MacBook Pro days. OnePlus, Huawei, Xiaomi, and others started using “pro” monikers on phones before Apple decided to do the same with its iPhone 11 Pro in 2019. At the time, former Verge senior reporter Chaim Gartenberg (damn, I miss that nerd) asked what it even means for a phone to be “pro,” and here we are nearly four years later asking the same about a new headset.
While the “pro” label on iPhones has come to mean a better camera and screen, Apple hasn’t announced a regular Apple Vision headset without the “pro,” so this definition doesn’t apply here (yet). And the Apple Vision Pro isn’t clearly going after high-level creative professionals in the same way the MacBook Pro, Mac Pro, and iMac Pro have done in the past, either. In fact, Apple didn’t really show much content creation at all for the Vision Pro — it was mostly focused on content consumption, even in the work parts of its demos.
We saw the ability to drag and drop 3D content from Messages, but we didn’t see people creating that 3D content within the headset. There was a brief demo of using a virtual keyboard to send a message, but not the kind of complex “pro” interactions for manipulating text, documents, and images, using just your voice, hands, and eyes, that we’ve come to expect from pro devices with a traditional mouse and keyboard attached.
In fact, it looks like you’ll need a physical keyboard and mouse for that precise type of control on the Vision Pro, because, like with the iPad, developers will need to adapt their apps to this new input method. Apple demonstrated the ability to use Bluetooth accessories like its Magic Trackpad and Magic Keyboard for when you want to type up long emails or fill out a spreadsheet. You can even remotely connect to a Mac and turn its screen into a portable, private 4K display in the headset, running alongside apps built for the Vision Pro.
“This powerful combination of capabilities makes Apple Vision Pro perfect for the office or for when you’re working remote,” said Alessandra McGinnis, a product manager for Apple Vision Pro, during Apple’s WWDC 2023 keynote. We didn’t really see just how powerful these capabilities are or how well the voice, eye, and hand gestures let you control and manipulate documents. Instead, Apple showed a 10-second demo of team collaboration on a document from the headset wearer’s point of view. But it was just a static document, and we didn’t see how you can interact with or create one. What’s perfect for the office about this? We don’t really know yet.
One area where the Apple Vision Pro looks like it will excel is video calling. FaceTime looks slick, with virtual app sharing and a room-filling interface that expands as life-size people join the call. It’s not too dissimilar to what both Microsoft and Meta have been working on for immersive meetings, but once again, it’s all about consumption, not creation. Even Apple admitted that. “This is powerful for so many activities, like reviewing a presentation, sharing photos and videos, or watching a movie together,” said McGinnis. That’s still work, but what happens when you’re reviewing a presentation and want to make edits? Again, we don’t know.
The rest of Apple’s presentation focused on home and consumer uses, like turning the headset into giant virtual monitors or TV screens to watch movies or play games. “With Vision Pro you’re no longer limited by a display,” said Apple CEO Tim Cook when introducing the don’t-call-it-VR headset to the world. The idea of carrying a mobile triple-monitor setup with me while I’m traveling sounds great and would be a killer feature for many professionals, but it’s also largely the same thing VR headsets have been doing for years now.
I don’t doubt that Apple has probably nailed text legibility here and made this immersive environment more compelling to use as a mobile workstation, but at $3,499, that’s a lot compared to the many VR headsets that can also create giant virtual workspaces and TV screens for you.
A few demonstrations did go beyond consumption. The redesigned Djay app for Apple Vision Pro looks like it will offer some impressive interaction unlike anything else Apple demonstrated.
Microsoft was also quick to pledge its support for the Apple Vision Pro headset, enabling Apple to briefly demo Excel, Word, and Teams running on the headset. Adobe Lightroom also works on the Vision Pro and was shown being controlled with eye and hand gestures. Having these big names on board will undoubtedly push other developers to eagerly adapt their iPad and iPhone apps for Apple’s new headset.
Apple’s visionOS, the operating system that powers the Vision Pro headset, uses the same software frameworks available on iPadOS and iOS. “This means hundreds of thousands of iPad and iPhone apps will be available on Vision Pro at launch,” said Susan Prescott, Apple’s VP of worldwide developer relations, during the company’s WWDC 2023 keynote. Just how well developers are able to adapt them is key to whether Apple’s “spatial computing” can replace our existing “pro” tools or merely assist them.
We’ve seen Apple struggle to adapt the iPad for creation over the years, even after the company blurred the lines with the iPad Pro — a hybrid device much like the Surface Pro that blends laptop and tablet. Apple spent most of its time during the iPad Pro announcement in 2015 demonstrating productivity apps like Office and Photoshop, with a focus on professionals getting work done. Nearly eight years later, I still grab a laptop when I want to get work done because iPad apps and the OS still haven’t quite caught up to macOS or Windows for multitasking and creation.
I’m not convinced Apple even knows why the Vision Pro is pro, leaving it to developers to make the case over time. (It was unveiled at WWDC, after all.) Because, without their help, what we’re looking at is a professional content consumption device for so-called prosumers that has the potential to be so much more.