Apple’s Visions of the future: Four new products in the works


Vision Pro may be off to a slow start, but a new report claims Apple isn’t about to let it languish for too long. In fact, a more affordable model might be coming next year.

Bloomberg’s Mark Gurman reports in the latest issue of his Power On newsletter that Apple is working on a total of four new AR/VR products to complement or replace the Vision Pro, which has been only moderately successful to date.

As early as next year, Apple could offer a Vision headset that's much more affordable than the Vision Pro, which currently sells for $3,499. The cheaper model should cost around $2,000, roughly the price of a decently specced 14-inch MacBook Pro. To get there, Apple reportedly plans to cut costs on the displays, both inside and out: the cheaper Vision would do without EyeSight, the clever but imperfect feature that projects an image of the wearer's eyes onto an external screen, and it wouldn't match the original model's exceptionally high resolution on the interior displays.

Apple also wants to cut back on the chip: instead of the M2 found in the Vision Pro, it would use an iPhone processor such as the current A18. The company also wants to save on materials, which could have the pleasant side effect of making the headset lighter.

In a way, this is reminiscent of the iPod’s evolution 20 years ago: the first models were expensive niche products for Apple’s core clientele. It was only with the iPod mini in 2004 and even more so with the iPod nano in 2005 that things really took off. On the other hand, the iPod was also a self-explanatory product, and it’s not always clear why you need a Vision Pro.

Vision Pro 2 coming later, real glasses even later

It may not be selling well in the wider consumer market, but Vision Pro does seem to be popular among business users. And Apple seems keen to continue developing the premium model for this discerning target group rather than withdrawing it from the range when the cheaper version arrives (as happened, at least temporarily, to the HomePod when the HomePod mini launched). There will be a second-gen Vision Pro at some point.

Gurman admits he doesn't know much about this device yet, only that the second generation will be released in 2026 and use a faster chip. By then that should be at least an M4, and could even be an M5; either way, we should be looking at noticeable speed advantages over the M2. The new version will also presumably have at least 8GB of RAM to support Apple Intelligence.

Ultimately, however, the mass market does not appear especially interested in bulky headsets with high-performance chips. A more appealing prospect would be a simpler set of AR glasses that can show you films or games, step-by-step routes, or real-time translations of conversations. Meta recently presented just such a device with the Orion, while glasses such as the Imiki AR Glasses from Meizu could already be seen in action at IFA.

Apple is clearly lagging behind here, but the company is expected to have AR glasses ready for the market by 2027. These will reportedly be able to “scan the environment around a user and supply useful data,” not unlike the upcoming Visual Intelligence features for the iPhone 16’s Camera Control button. By that point, Apple should also have established its generative AI system, Apple Intelligence, and most iPhones in use three years from now should be able to work with the glasses to offer the functions that competitors are already promising today, at affordable prices.

Ears get eyes

Finally, there’s a fourth AR/VR product that Gurman first reported on back in the spring of this year: a new camera-equipped version of the AirPods. What sounds crazy at first glance could actually prove useful. Apple has developed a system for the Vision Pro that scans the wearer’s surroundings and displays useful information; in the AirPods’ case, that information might be announced aloud instead.

The camera-equipped AirPods are not expected to be ready for the market until 2027, but upcoming features on the iPhone 16 should offer a foretaste of this kind of optical scanning. AirPods with optical sensors could also help with acoustic awareness of the surroundings, as Ming-Chi Kuo explained this summer: such AirPods could recognize more gestures and amplify sound sources that the wearer points at, for example.
