The Sony XYN Immersive Display Feels Like Apple Vision Pro Without Goggles

A computer monitor displays a snowy scene of a Japanese shrine with red torii gates and lanterns, while a TV screen above shows streaming app icons like Netflix, Prime Video, Disney+, and Hulu.

Earlier this month at the tail-end of the CP+ Show in Yokohama, Japan, I got some hands-on time with Sony XYN. Specifically, Sarah Teng and I stood in front of three Sony XYN immersive displays. What we saw felt like wearing the Apple Vision Pro, but without the headgear.

The best way to describe the technology is as a cross between what Looking Glass has created and the experience of using a Vision Pro. The XYN displays use a camera to track a viewer’s eyes and movement, then use a unique display technology to project a different layered view to each eye, creating a sense of depth. The result is an immersive, 3D view of a photo that doesn’t immediately make your eyes or head hurt.

With the original 3D TVs and the 3D effect of the Nintendo 3DS, my head would eventually hurt as my eyes struggled with what they were seeing. With the XYN, my head never hurt. The depth feels real.

When you first approach an XYN display, any image on the panel just looks like a garbled visual mess. But once you stand in front of it and the camera finds your eyes, the photo suddenly snaps into focus with immediate depth. From there, you can move around quite a bit, bending sideways, hunching down, or peering over, to reveal more perspectives of the scene. It’s surprisingly flexible in how much play you have, as long as your feet stay planted.

A computer monitor displays a vibrant nighttime cityscape with illuminated Japanese signs and buildings. Streaming app icons like Netflix, Prime Video, and Disney+ are visible on a TV screen above the monitor.

I wish it were possible for me to show you how this works, but I can only explain. In this image, we could see around and behind buildings, as if we were looking at a miniaturized version of the scene. The effect is pretty similar to how it feels to look at tilt-shifted photos, although without the forced defocus areas. It’s very neat.

Unfortunately, because the depth effect depends on where the camera senses a viewer’s eyes, it’s impossible to showcase in photos or even video. So, much like needing to wear a Vision Pro in order to understand what an immersive photo looks like, I can only tell you how cool Sony’s XYN technology is.

That might be the biggest hurdle to the growth of what Sony has created here, and it has been one of the biggest obstacles to mass appeal for the Vision Pro, too. Between the cost of the products and the showrooms required to demonstrate them, it’s prohibitively expensive when every potential customer has to try the product in person to “get” it.

More Than a Display

I should back up, because Sony XYN isn’t just a display technology. Unlike Looking Glass, XYN is a full ecosystem designed to facilitate the entire process, from capture to display. The Vision Pro and Apple’s iPhone cameras are the closest in concept to what Sony is doing here, but while Apple relies on two cameras that mimic both human eyes to create depth, Sony is doing it with a single camera and a guided workflow.

The immersive photos we viewed at CP+ were captured using Sony Alpha cameras, and the company is currently working on a cloud-based technology to allow anyone to generate high-quality 3D assets of any object, regardless of their experience with photography. Users are guided on how to capture assets via the XYN app.

Using a proprietary technology, Sony is able to create “photorealistic reproductions of real-world spaces and objects in XR environments. This breakthrough technology faithfully captures even reflective surfaces and intricate details that were previously difficult to reproduce,” the company says.

XYN also works with mocopi, Sony’s simple motion capture system whose sensors attach to just six points on the body. It is, therefore, possible to create fully immersive 3D renders of a human body in motion, or use that motion capture to create 3D avatars.

Immersive Still Photos

Looping back to the images we saw at CP+: those were still images created from a set of photos, not video, which again shows that Sony has multiple ways of using its technology.

One of the most interesting aspects of the Apple Vision Pro is the ability to see depth in photos you’ve taken. Apple does this by capturing or creating depth maps and then extrapolating that information with software, which is very cool and allows any photo to become immersive. But there are limits to this method, so immersive photos only give you a slight peek at what might be behind objects. With XYN, the effect feels about twice as deep and immersive.

When we looked at the example photos, we could really move our heads around to look behind and around a scene significantly more than is possible inside a Vision Pro. That makes sense, since these photos were captured for the sole purpose of generating these immersive viewpoints. The tradeoff is that creating immersive photos for Sony XYN is more complicated, and an existing photo can’t be made immersive in the same way it can inside a Vision Pro.

Vision Pro also does some things better than XYN will ever be able to, such as virtual meetings. Anything that really asks a user to be fully drawn into a virtual experience is going to be better with a headset on, but everything else that can just use a display will give users a lot more physical freedom. There is a place for both technologies.

A computer monitor displays an image of a cheerful man in colorful clothing and a hat, standing on a cobblestone street with a yellow tram and buildings in the background. Reflections of people can be seen above the monitor.
In this photo, we could see around and behind the person in the photo (who also happens to be the photographer of the image).

Right now, Sony is developing XYN only with its own products, but company representatives did tell me they are looking into ways to work with what Apple is creating, too. I am curious to see how that might develop, as Apple is clearly interested in working with third parties to expand the usability of its technology: it has already partnered with Blackmagic and Canon.

While we’re still early on in the immersive space and Sony XYN is still mostly a prototype, I can see where this is going. As an artist, it would be a fun challenge to create a gallery of immersive photos for viewers to enjoy, making these types of images simply to provide a sense of awe and wonder. From a business perspective, I am sure there are plenty of ways for designers, architects, and engineers to use immersive displays to make working with 3D assets feel more like seeing them in the real world.

Whatever the case, for now, Sony XYN is just a really cool idea that was exciting to behold in person.


Image credits: Photographs by Sarah Teng for PetaPixel
