Aside from a fist-bump that dissolved into pixels, it felt more real than I expected.

Meta’s future avatars are designed to look nearly real, while being puppeted by face-scanning cameras.

And, eventually, sort of like nothing at all.

From a short distance, between two separate rooms, we chatted holographically.

Or rather, via 3D light field displays.

The Starline booth was intimidating the moment I entered.

I counted at least a dozen sensors and cameras.

Then, as I stared at the screen, Nartker walked in and sat down.

The image appeared in 3D, as if he were sitting across from me.

The life-size scale was jarring at first, then oddly comforting.

As we sat, our eye contact was perfect.

I found myself staring eye to eye so much that it felt weird, and I looked away.

I talked while fidgeting a bit, and found my posture slumping.

I started to relax.

It felt…well, it felt like we were chatting at a coffee shop table.

“You’re looking at me, making eye contact. We can’t really do that today in video conferencing,” Nartker says.

The video chat works using a real-time depth scan.

I didn’t even know what I looked like on his end.

“It’s almost like the space just kind of connects. And these rooms merge together. And you and I are just sitting here hanging out,” Nartker says.

Starline’s display resolution wasn’t as crisp as real life, but it was good enough.

Starline scans roughly a 1-meter cube of space in which we can see each other’s actions.

The low wooden wall forms a physical boundary of sorts, one I kept wanting to lean across.

That isn’t possible, of course.

Neither is putting a holographic object on the desk.

We did fist-bump later, which showed the limits of the space scanning.

His hand started to pixelate and dissolve as my hand approached.

I thought about him seeing my hand do the same thing on his side, the two of us forming one volumetric space between us.

The dissolve happens because some of that area is technically beyond the limits of the depth-sensing cameras.

Starline is designed to work over regular network bandwidth, and its display runs at 60Hz.

I wonder how much I’d use it to look at things other than people.

Things he holds up can be seen, although slightly less crisp than on a regular video call.

An apple, for instance.

I hold up my wallet and car keys.

What comes next?

“When we studied this with Googlers, we found that people actually acted real. They thought it was real, and described it as real,” Nartker says.

Project Starline is now being installed in a limited number of non-Google test offices, two booths at a time, since they work in pairs.

A meet-a-celebrity kiosk, maybe?

Would it work as a brainstorming/collaboration pod?