Over in America, the Apple Vision Pro headset has been out for over a week, giving early adopters plenty of time to get used to a new age of spatial computing — or, more likely, reflect on whether US$3,500 was a sensible investment.
Reviews of the headset are generally positive about its potential and sci-fi technology, but quite open about the many drawbacks, including buggy software and the inevitable discomfort of wearing it for extended periods.
Fortunately, the weak two-hour battery life means that the comfort issue isn’t as big a deal as it could have been. But early adopters may still find themselves mentally listing all the things they could have bought with US$3,500 during the 90 minutes the battery takes to recharge for round two.
Still, expense, bugs and hardware niggles are par for the course with first-generation products, and things will inevitably get better over time. But you may have a long wait ahead of you.
In his latest Power On newsletter, Bloomberg’s Mark Gurman outlines his experience with the Vision Pro — which really puts the “mixed” into “mixed reality” — but he also includes an interesting tidbit about the long-term direction of Vision Pro.
“Some people in the Vision Products Group (the team working on the headset) believe it could take four generations before the device reaches its ideal form,” he writes. That’s “similar to the progression to the iPhone, iPad and Apple Watch”, he adds.
The last sentence is comforting for those who want to see a computer on every face, as if WALL-E taught us nothing, but none of the examples listed was a) anywhere near as expensive as Vision Pro or b) on such a slow release schedule. Most Vision Pro 2 rumours point to a release in 2025 at the earliest, and possibly 2026.
Unless things speed up considerably, that means the ideal fourth-generation form may not show up before the decade is out. That’s nice for those who want to enjoy their first-generation headsets without buyer’s remorse, but less nice for those who just want the benefits of spatial computing without the extreme dorkiness factor…