The company calls it transparent computing. Chairman and chief creative officer Richie Etwaru calls it the “Z-axis of computing,” bringing depth and dimension to the height and width of our flatscreen computers. The technology from Mobeus, a New Jersey-based startup with $24 million in funding from Accenture Ventures, was intriguing enough and mature enough to cause a major aerospace company to cancel a massive 40,000-unit order of Meta’s Oculus Quest VR headsets.
Mobeus’ new Airglass2 offers many of the interactive, engagement, social, and tactile experiences of a mixed-reality or virtual reality headset in a hardware-free package. Essentially, it makes every standard camera-equipped computer, laptop, or monitor a potential shared 3D space, using no more hardware than it originally shipped with from the manufacturer.
I recently watched a live demo of Airglass2 at TechBeach Retreat, a conference in Jamaica, and had a chance to interview Etwaru, who’s also a co-founder of the company, live on stage immediately after.
“This works on any operating system, any computer, any type of media you have: movies, PowerPoint documents, PDF files,” Etwaru says. “All of tech has been X and Y … it’s always been X and Y. If you understand tech, this is the Z-index of tech. So now you can program for the Z-index.”
X and Y are the horizontal and vertical of our computer screens. Z is the space in front of and behind our laptops, which Etwaru says his tech maps and makes programmable.
That means that when you’re working with a colleague, you can zoom into a detail on your screen just like you would in the real world: by moving your head closer. It means you can look to your colleague’s side by moving your head, just as if your screen were a real-world window. It also means you can create and manipulate 3D objects — molecules, building plans, engine components, a new conference booth — in solo or shared space. One app Mobeus has built took a dancer’s movements and mapped a sitar to her shoulders, bongos to her hips, and other instruments to different parts of her body, enabling her to make music simply by moving.
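The lean-in-to-zoom idea can be sketched in a few lines. This is a hypothetical illustration, not Mobeus’ actual code: it assumes some upstream head tracker (e.g., face size from the webcam) supplies an estimated head distance and a normalized lateral offset, and maps those to a zoom factor and a horizontal pan. All function names and constants here are invented.

```python
# Hypothetical sketch of the "Z-axis" interaction described above.
# Assumes a head tracker already provides distance (cm) and a lateral
# offset in [-1, 1]; none of this reflects Mobeus' implementation.

def zoom_from_head_distance(distance_cm: float,
                            neutral_cm: float = 55.0,
                            min_zoom: float = 0.8,
                            max_zoom: float = 2.5) -> float:
    """Closer head -> larger zoom, clamped to a usable range."""
    zoom = neutral_cm / max(distance_cm, 1.0)
    return max(min_zoom, min(max_zoom, zoom))

def view_offset_from_head(head_x_norm: float, scene_width_px: int,
                          parallax_gain: float = 0.25) -> int:
    """Lateral head offset in [-1, 1] -> horizontal pan in pixels."""
    return int(head_x_norm * parallax_gain * scene_width_px)

# Leaning in to half the neutral distance doubles the zoom:
print(zoom_from_head_distance(27.5))       # 2.0
# Looking to the side pans the view, like peering through a window:
print(view_offset_from_head(0.5, 1920))    # 240
```

The key design point is that the on-screen change is driven directly by real body movement, which (as discussed later in the article) is also why this approach avoids the sensory mismatch behind VR sickness.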
Essentially, it’s what we’ve seen in demos from Meta, Microsoft’s HoloLens, and Magic Leap, with one key difference: no headset.
And that’s potentially a game changer, for a few reasons.
First, there’s cost.
At $500 for the commercial version, 40,000 Oculus Quests cost $20 million. Select a HoloLens, the new Meta Quest Pro, or another high-end virtual or mixed reality headset, and you can multiply that $20 million by a factor of three to five.
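The arithmetic behind those figures, for the skeptical reader:

```python
# Back-of-the-envelope cost comparison using the article's numbers.
quest_unit_price = 500            # commercial Quest price cited
units = 40_000
quest_total = quest_unit_price * units
print(f"${quest_total:,}")        # $20,000,000

# Higher-end headsets run roughly 3-5x that total:
low, high = quest_total * 3, quest_total * 5
print(f"${low:,} to ${high:,}")   # $60,000,000 to $100,000,000
```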
More importantly to my mind, there’s convenience.
I have the Oculus Quest 2, and the biggest challenge I have with it is making time to use it. As I told Etwaru, every time I feel like using it I have to charge it first, because it’s been so long since I’ve used it. Assuming it’s charged, you have to ensure you’re in a safe space with enough room, put it on, accept that you’re now blocking the world out, ensure your controllers register, start whichever app you’re using, and then begin actually doing something.
It’s a far cry from pulling out your phone with its instant-on. And it’s a far cry from simply engaging, in a work scenario, with 3D objects in a 3D space right on the existing screen on your desk.
Which brings up the canceled Oculus Quest mega-order.
“I can’t tell you the name of the person or the company, but it’s in the aerospace industry,” Etwaru says. “In the aerospace industry, a lot of the machining … and a lot of the data that you have to visualize before you hit go on a rocket or an airplane, it’s very complex.”
The company was using computer-aided design to visualize components and how they fit together, but wanted to get more immersive and interactive.
“We did a demonstration of the technology and they were like, okay, cancel the headset order,” Etwaru told me. “We’re going in this direction. If it takes these guys two or three years to get there, we’re still better than what we’re going to get from the headset.”
One other huge benefit?
Users don’t get sick.
VR sickness is literally a thing now, and those who suffer from it can experience nausea, vomiting, queasiness, cold sweats, dizziness, or headaches. Not everyone gets it, but some do. And while I’m not personally very susceptible to VR sickness, I have occasionally felt a little queasy after some VR fitness sessions.
This is not a problem with Mobeus’ technology, Etwaru says, because it operates in essentially real space.
“When we did some of the psychological studies on this technology, we were trying to figure out: why do people throw up in a headset?” Etwaru says. “Why does it make you feel nauseous? And it turns out that the body has a built-in accelerometer.”
When what you see is out of sync with your body’s accelerometer, you risk feeling VR sickness. But with Mobeus tech, when you lean in to zoom or lean back to expand your field of vision, you’re literally doing it with your body, and the changes you see on-screen accord with the physical changes in your body’s movement and posture.
Result: no VR sickness.
So how does Mobeus achieve this with no additional hardware? Essentially, by inserting software between your laptop’s built-in camera and its graphics card.
“This was built on bare metal,” Etwaru told me. “It doesn’t need any permissions on the computer. It doesn’t need any specific privileges. We’re sitting between the camera and the video. And we’re dealing with the frame buffer off the camera and then relighting the frame buffer right before it hits the video card to create that.”
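The pipeline Etwaru describes — grab the frame buffer off the camera, relight it in software, then pass it on toward the video card — can be sketched at a very high level. This is a toy illustration under invented assumptions: the “relighting” here is just a brightness scale on raw pixel data, and every function name is hypothetical; Mobeus has not published its actual processing.

```python
import numpy as np

def relight(frame: np.ndarray, gain: float = 1.2) -> np.ndarray:
    """Toy 'relighting': scale pixel intensities, clamped to 8-bit range."""
    out = frame.astype(np.float32) * gain
    return np.clip(out, 0, 255).astype(np.uint8)

def camera_to_video(raw_frame: np.ndarray) -> np.ndarray:
    # 1. The raw frame buffer arrives from the camera (passed in here).
    # 2. It is relit in software before reaching the video card --
    #    the stage where, conceptually, 3D effects would be composited.
    return relight(raw_frame)

# Stand-in for a captured camera frame: a 4x4 RGB buffer of value 100.
frame = np.full((4, 4, 3), 100, dtype=np.uint8)
processed = camera_to_video(frame)
print(processed[0, 0])   # [120 120 120]
```

The notable claim in the quote is that this sits in the ordinary camera-to-display path, needing no special OS permissions or privileges, which is what lets it run on stock hardware.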
The result, as you can see in the video at the beginning of this story, is impressive. And it might just be good enough for companies that want 3D interactivity for their employees to sign on.
Mobeus just emerged from stealth in November of this year and thinks it has a good shot at delivering the right amount of 3D immersion while keeping people rooted in the real world, which is probably a good bet for productivity.
“I’m not ready to plug in and I think there’s a lot of people that are not ready to plug in,” Etwaru says, referencing The Matrix movies and the persistent theme of plugging in full-time to augmented or mixed realities.
But he does want tactile manipulation of virtual objects and spaces.
“I think the next generation is saying, I want to engage with tech with my hands.”