If Movidius has its way, experiencing virtual reality (VR) the “right” way in the future won’t require owning an expensive desktop PC. Heck, it won’t even require a phone, but that reality is still a few years out.
This company you probably have never heard of is extremely bullish on the notion that the key to unlocking the next generation of VR is in your pocket. Well, it will be in your pocket soon enough. Not following us? We’re talking about phones.
Doubtful? I’m not surprised. After all, Google Cardboard and Samsung Gear VR can only muster a rather stripped-down experience compared to desktop-powered, head-mounted displays (HMDs) like the Oculus Rift and HTC Vive.
So, how exactly is mobile VR expected to match, and eventually surpass, this already-high bar of graphical fidelity and positional tracking prowess?
That’s a good question, one that Movidius CEO Remi El Ouazzane believes his company can answer with its own hardware, a small processor called the Myriad 2 vision processing unit, or VPU, and its ambitious plan called VR 2.0.
CPU, GPU … VPU?
Phones can be stocked with the latest, ludicrously speedy systems on a chip (SoC), but it all falls back on the battery. And, even when you’re plugged into a charger, there’s still only so much power these pocket-sized devices are capable of delivering.
That’s where the VPU comes in. The role of the vision processing unit is to offer a helping hand to the overworked CPU and GPU inside of today’s phones, which are responsible for every task, from processing the graphics to cueing up sensors. (It’s easy to see why even the latest phones become so hot to the touch after a few minutes inside of a headset.)
By inserting itself into the formula, the VPU makes the informational assembly line even more efficient, which in turn squeezes even more out of the hardware. But that’s not all: it also supports a more varied set of tools for developers to take advantage of.
Using science to push mobile VR to the next level
Instead of thinking outside of the box, the plan for VR 2.0 instead requires thinking about what’s already inside of the box and making it better.
The first challenge to overcome is tracking. Today’s mobile VR headsets support head tracking on only the most basic level – rotation, not position – by utilizing mainstream sensors, like the accelerometer and gyroscope.
On the other end of the spectrum, the Oculus Rift and HTC Vive can do that, but they also monitor your movements within a three-dimensional space, which means you can physically get closer to in-game objects, resulting in heightened immersion.
The latter accomplish this with what’s called the “outside-in” method of tracking. In other words, sensors placed outside of the headset and controllers are used to pinpoint their location within a designated space.
According to Ouazzane, today’s solution isn’t “super accurate, but it’s the best we have currently.”
That is, until VR 2.0 arrives.
Obviously, the future of mobile VR won’t entail lugging a bunch of sensors around with you, or at least, I hope it won’t. So, they must be integrated into the headset or phone itself. Inside-out tracking, as Ouazzane calls it.
But, inside-out tracking, for as cutting-edge as it sounds, needs to feel natural if it’s going to stick around. It can’t be the least bit jittery or A. people will barf, and B. those who have already tried good, desktop-based VR will never come back.
To ensure this doesn’t happen, Ouazzane calls upon a piece of aircraft-grade tech called the inertial measurement unit (IMU). It’s a device that utilizes modern sensors, like accelerometers and gyroscopes, to accurately track three-dimensional positional data. The minute ups and downs, dips, dodges and twists of your head can all be detected with this sensor array.
The Microsoft HoloLens already contains this sensor unit, and soon, so could tomorrow’s mobile VR headsets.
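To get a feel for how an IMU turns raw sensor readings into smooth head tracking, here’s a minimal sketch of a complementary filter – one common way to fuse a fast-but-drifting gyroscope with a noisy-but-stable accelerometer. This is purely illustrative; it is not Movidius’ actual tracking pipeline, and the function names and filter constant are invented for the example.

```python
import math

def complementary_filter(pitch, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """Estimate head pitch (radians) by blending two sensor sources:
    the gyroscope (responsive, but drifts over time) and the
    accelerometer's reading of gravity (stable, but jittery)."""
    gyro_pitch = pitch + gyro_rate * dt          # integrate angular velocity
    accel_pitch = math.atan2(accel_y, accel_z)   # pitch implied by gravity
    # Trust the gyro for quick motion, the accelerometer for the long term
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# With the head perfectly still (no rotation, gravity straight down on Z),
# the estimate should hold steady at zero rather than drifting.
pitch = 0.0
for _ in range(100):
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel_y=0.0, accel_z=9.81, dt=0.01)
print(round(pitch, 6))  # → 0.0
```

Real headsets run far more sophisticated fusion at much higher rates, but the principle – correcting one sensor’s weakness with another’s strength – is the same.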
Watching your every move
OK, so tracking in tomorrow’s mobile VR headsets will likely be a lot better than it is today. But, that still doesn’t solve the power issue.
“Mobile doesn’t have the luxury of having a beefy graphics card,” Ouazzane states, but the devices of tomorrow might not need one to push high-quality graphics.
“By reallocating precious resources [to the VPU], something surprisingly close can be accomplished,” Ouazzane says.
One of those VPU tasks includes eye-tracking, which isn’t a new concept by any means, but the manner in which Ouazzane sees the eye’s positional data being used very much is.
First seen in the Fove virtual reality headset, fovea tracking is what Ouazzane wagers will be the breakthrough needed to squeeze even more performance out of mobile handsets.
A bit of background first. The fovea is the location on the retina that sees the greatest amount of visual detail. It’s the part of your eye that explains why everything in focus is clear, while what sits outside in the periphery is blurry. You probably see where we’re going with this already.
Fovea tracking will be tackled by the VPU, giving the embedded GPU a break by rendering only your focal point with high-resolution textures and anti-aliasing. Everything in the periphery of the environment will lack detail and appear blurrier, just as it does to your eyes in the real world.
This trick could, theoretically, allow phones to operate more efficiently, which only means good things for mobile pain points like battery life and high operating temperatures.
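The core idea behind this kind of foveated rendering can be sketched in a few lines: spend full resolution only near the gaze point and progressively less in the periphery. The tile coordinates, thresholds and resolution tiers below are made up for illustration – they’re not drawn from any actual headset implementation.

```python
def shading_rate(tile_center, gaze, inner=0.1, outer=0.3):
    """Return the fraction of full resolution at which to render a
    screen tile, based on its distance from where the eye is looking.
    Coordinates are normalized screen positions in [0, 1]."""
    dx = tile_center[0] - gaze[0]
    dy = tile_center[1] - gaze[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist < inner:      # foveal region: full detail
        return 1.0
    if dist < outer:      # mid-periphery: half resolution
        return 0.5
    return 0.25           # far periphery: quarter resolution

print(shading_rate((0.5, 0.5), gaze=(0.5, 0.5)))  # looking right at it → 1.0
print(shading_rate((0.9, 0.9), gaze=(0.1, 0.1)))  # far periphery → 0.25
```

Since most of the screen sits outside the fovea at any given moment, most tiles get rendered cheaply – which is exactly where the power and heat savings come from.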
Paired with the aforementioned IMU, the inside-out tracking methods Ouazzane proposes for upcoming VR 2.0 headsets seem sound. And in a post he authored on VentureBeat, he even goes as far as to state that they “… are all technologies that are arguably ready to be implemented in next-generation devices.”
VR 2.0 or bust, but are we there yet?
Later this year, Google and its partners will bring Daydream-ready headsets to market. It’s a big move that will give mobile VR an even bigger push. Samsung’s latest iteration of Gear VR will take Google Daydream head-on by expanding its accessory support beyond standard controllers, meaning that motion controllers are probably coming down the line.
But to Ouazzane, this isn’t VR 2.0. The controller tracking capability built into each is still “entry-level” and lacks the sort of cutting-edge tracking methods proposed by his grand vision. Despite what it sounds like, he’s not dismissive of these products.
If anything, Ouazzane thinks they are “totally necessary” in the scheme of things, and each illustrates that we are slowly getting closer to bringing greater levels of immersion to mobile VR without sacrificing the one aspect that makes it so unique: mobility.
Movidius is currently developing a suite of VR-centric products in partnership with Lenovo. Though none of them have been announced as of yet, each will utilize the Myriad 2 VPU in ways that look to raise the bar for mobile VR even higher.
In addition, Movidius told TechRadar that several other companies will be releasing VPU-packed devices in the near future. VR headsets? Livestreaming cameras? Phones? We’ll have to wait and see.
But, we were left with some tantalizing clues. Ouazzane predicts that all-in-one devices will become the norm in the VR 2.0 era. Think Microsoft HoloLens in the sense that it isn’t reliant on a smartphone, but something that’s capable of high-performance VR experiences.
It’s clear that virtual reality and mobile are a fruitful match. They each serve to push the other to new heights. But will Movidius’ grand VR 2.0 plan ever come together? The CEO shared some bold parting words.
“If VR 2.0 doesn’t take off, then VR as a whole won’t take off.”
No pressure, Myriad 2.
This article is part of TechRadar’s Silicon Week. The world inside of our machines is changing more rapidly than ever, so we’re looking to explore everything about CPUs, GPUs and all the other forms of the most precious metal in computing.