(Image credit: Apple)

Apple's new imaging technology finally makes the Vision Pro make sense

This could be the thing which makes me buy an Apple Vision Pro

by T3

Quick Summary

Apple has a new technology which could revitalise the Vision Pro.

It can produce a 3D render from a single 2D image.

While high-profile tech releases are fairly common in this day and age, those that forge ahead into new frontiers are less so. For many, the slew of mixed reality and virtual reality headsets is the clearest indicator of a new era.

It has now been a few years since Apple launched its flagship in this area. The Vision Pro remains one of the most costly devices on the market, and some have suggested it doesn't quite live up to the price tag.

For me, a new piece of software from Apple might be the ticket to unlocking a real-world use case for the device: a new open-source model that can turn a 2D photo into a 3D image.

Dubbed SHARP, the model predicts what a 3D render of the scene would look like based on the single viewpoint captured in the image. Without getting too far into the nitty gritty, it estimates the depth of elements within the image and uses them as waypoints when building the 3D scene.

The big difference for Apple's technology is that it can produce a full 3D render from a single image. Other tools of this nature require hundreds of images of the same scene in a bid to produce a usable rendering.

Having a one-shot system means users can convert many images with ease, making the process simpler than ever.
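To give a rough sense of what "estimating depth and using it as waypoints" means in practice, here is a minimal, hypothetical Python sketch of the general principle: assign a depth value to each pixel, then lift those pixels into 3D points that can be viewed from new angles. This is not Apple's SHARP code, and the depth map below is a synthetic placeholder; the real model predicts the scene with a learned network.

```python
# Illustrative sketch only: back-projecting a single image's depth map into a
# 3D point cloud, the general idea behind one-shot 2D-to-3D conversion.
import numpy as np

def backproject_to_point_cloud(rgb, depth, focal_length_px):
    """Lift each pixel (u, v) with depth d into a 3D point (x, y, z)."""
    h, w = depth.shape
    cx, cy = w / 2.0, h / 2.0                       # assume principal point at image centre
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel grid
    z = depth
    x = (u - cx) * z / focal_length_px              # simple pinhole camera model
    y = (v - cy) * z / focal_length_px
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    colours = rgb.reshape(-1, 3)
    return points, colours

# Toy example: a 4x4 image with a made-up depth map standing in for a
# learned monocular depth prediction.
rgb = np.random.randint(0, 255, size=(4, 4, 3), dtype=np.uint8)
depth = np.linspace(1.0, 3.0, 16).reshape(4, 4)     # metres, purely synthetic
points, colours = backproject_to_point_cloud(rgb, depth, focal_length_px=50.0)
print(points.shape)  # (16, 3) -> one coloured 3D point per pixel, viewable from new angles
```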

When I first read about this technology, I was a little blasé. Sure, it sounds fun, but who would actually use it?

Then I saw footage of the models running on an Apple Vision Pro headset, and suddenly it made sense. Images captured as snapshots of a moment in time could suddenly be interacted with and moved through.

While there are definitely limitations in the design – it won't generate beyond the borders of the image, for example – the overall effect is solid, and adds a new layer of depth to old images.
