Project Starline, a holographic chat booth

Google’s Project Starline is being tested in offices this year. I got to try it, finally.

Weirdly calming. Oddly immersive. Invisible? These weren’t the words I expected to come to my lips after sitting in a room for 30 minutes demoing Project Starline, Google’s prototype big-screen 3D video chat platform that’s now being tested in a few corporate offices outside of Google. Aside from a fist-bump that dissolved into pixels, it felt more real than I expected.

Google introduced Project Starline more than a year ago, during its 2021 Google I/O virtual developer conference, as a future vision of how virtual meetings might evolve. The idea feels perfectly conceived from pandemic times: a dream of how to get people feeling like they’re sitting next to each other even if they’re thousands of miles away.

Companies invested in VR and AR, like Meta, have imagined VR meetings with avatars as a way of bridging distances. VR’s tradeoff is a headset where you’re hiding your real face; video chats like Zoom let you see someone eye to eye, sort of, but you may feel glued to your laptop screen. Meta’s future avatars are designed to look nearly real, while being puppeted by face-scanning cameras. Project Starline achieves the effect in a different way, by just projecting a real-time 3D feed of your face.

Project Starline is a two-way video chat, but what stunned me was how it also felt, to some degree, like something AR, or holographic. And, eventually, sort of like nothing at all.

I couldn’t capture my own photos or videos of Starline, but this visual from Google is what it feels like.



Memories of a virtual experience

I waited a day before writing about my half-hour chat using Starline, because I wanted to see how I remembered it. According to one of the two team members I talked with in my little holographic booth, I might remember my experience as a real in-person conversation.

In fact, I did meet with Google’s Starline team — Andrew Nartker, Starline’s director of product management, and research scientist Jason Lawrence — in person at first, and then they left for a second room down the hall with a second Starline booth while I sat in my own. From a short distance, between two separate rooms, we chatted holographically. Or, via 3D light field displays.

It gave me the sort of strangely real presence I’m more used to in VR with avatars, but with the real-face benefits of a video chat.

The moment I entered the Starline booth, I felt intimidated. Google didn’t allow photos or videos of my experience, but picture a tall-backed bench facing a big screen, with a counter-like wooden bar/desk forming a bit of a low wall between the two. A wild array of cameras and what looked like depth sensors surrounded the screen and the bottom of the desk/bar in front of me; I counted at least a dozen. (Lawrence explained later that the various depth-sensing cameras work together, “a real-time depth camera technology that we developed as part of this.”)

Then, as I stared at the screen, Nartker walked in and sat down. The image appeared 3D, like he was sitting across from me. I’ve seen light field displays before that achieve similar glasses-free 3D tech, but this was particularly interesting for two reasons: It was a real-time video chat, and he appeared life-size in front of me.

The life-size part was jarring and then oddly comforting. As we sat, our eye contact was perfect. I found myself staring eye to eye so much that it felt weird, and I looked away. I talked while fidgeting a bit, and found my posture slumping. I started to relax. It felt…well, it felt like we were chatting at a coffee shop table.

“We could be anywhere in the world, looking through these magic windows and really experiencing each other in this rich way,” Nartker says to me. “You’re looking at me, making eye contact. We can’t really do that today in video conferencing.”

The video chat works using a real-time depth scan. 



An interface that melts away

The scale and positioning of Starline’s display and its cameras are what disarmed me, because I didn’t need to worry about looking at a single camera to line up my view. I didn’t even know what I looked like. 

“It’s almost like the space just kind of connects. And these rooms merge together. And you and I are just sitting here hanging out,” Nartker says.

Starline’s display resolution isn’t as crisp as real life, but it was good enough. The 3D image of Nartker looked solid, except for moments of slight pixelation or breakup at the edges, or when our hands reached too far forward. Starline scans roughly a 1-meter cube of space in which we can see each other’s actions. The low wooden wall formed a physical limit of sorts that I wanted to lean toward.

I felt like placing something on the counter, as if he would then reach across and take it like a magic trick. That isn’t possible, of course. Neither is putting a holographic object on the desk. 

“We put that little wall there because there’s a space where these two rooms kind of come together, and that is exactly where these things connect. But you and I can almost feel you’d reach past that point in the wall and give each other high fives,” Nartker explains.

We did fist-bump later, showing the limits of the space scanning. His hand started to pixelate and dissolve as my hand approached. I thought about him seeing me do the same thing on the other side, the two of us forming one volumetric space between us.

I didn’t realize until later that the wall behind my holographic chat partner isn’t real: it’s a virtual backdrop that seemed indistinguishable from the wall behind me, with virtual shadows added. It’s there because some of that area is technically beyond the limits of the depth-sensing camera.

Starline is designed to work over regular network bandwidth, and its display runs at 60Hz. The 3D feat does seem to come with a bit of a resolution drop from what a big 4K TV would look like, but with an added level of realistic presence.

I wonder how much I’d use it to look at things other than people. Things Nartker held up could be seen, although in slightly-less-crisp-than-regular-video form: an apple, for instance. I held up my wallet and car keys.

I really did feel more relaxed and — dare I say — normal after a few minutes of using Starline, although entering a special room with a booth and a camera-studded wall screen was an odd on-ramp to feeling casual. Starline’s large-scale install makes it nowhere near as instant or everyday as any normal video call you’d do on whatever device you’ve got on you already.

But as I felt the tech melting away, our conversation, filling most of my field of view, took center stage, and I felt a lot of focus and calm. It totally did seem like some wild magic trick, but it also started to feel like a face-to-face experience rather than some sort of Zoom call.

What comes next?

Real-time video chat on a light field 3D display is already a feat I’d never seen before, but Starline’s questions about the meaning of presence at a distance made me think about the VR and AR world at large. No VR or AR headset has successfully tackled making a real everyday conversation between two people feel normal, simply because you’ve got gear on your face that gets in the way. Google’s first wearable smart display, Google Glass, aimed to work while being socially unobtrusive, but it didn’t succeed.

“This is a prototype, an early proof of concept for where this technology can go,” says Nartker. “This was the first prototype that we really felt created a sense of co-presence where we both felt together. When we studied this with Googlers, we found that people actually acted real. They thought it was real, and described it as real.”

Clay Bavor, Google’s head of AR and VR who also launched Google’s Daydream VR platform, leads the Google Labs effort spearheading Project Starline. Assistive smart glasses, announced at this year’s I/O conference, are also in development. Google’s in a period of work on AR and VR that seems deep in research, exploring solutions that may end up appearing in other products over time. 

Project Starline is now being installed in a limited number of non-Google test offices, two at a time. The technology is big, but the hope is for the design to find its way into other, smaller forms. Still, the life-size aspect of Starline seems like a key feature to its success, which requires a larger screen to cast a full-size body.

I thought of applications right away: maybe a customer service booth that someone could staff 24 hours a day even if no one is physically there. A meet-a-celebrity kiosk, maybe? Would it work as a brainstorming/collaboration pod? I’m not sure whether any of it, over time, would be better than a regular Zoom/Meet/FaceTime, or a session in VR. But I’m extremely curious to see how Google distills whatever it’s learning with Project Starline into what comes next, headset or otherwise.
