2017-01-17



At CES 2017, Intel was showing off more of its Project Alloy VR headset, including a roomscale multiplayer demo featuring ‘merged reality’, where the game environment adapts to your physical surroundings.

Project Alloy

Project Alloy is Intel’s VR headset reference design. The company expects device makers to use it as a starting point to create their own VR headsets (based on Intel chips, of course). Alloy is an all-in-one headset which does everything—game rendering, computer vision, processing, and more—directly on the headset. It’s literally an x86 computer running Windows 10 that you wear on your head.




That means the headset is completely untethered, and with inside-out positional tracking, it can track your movement without the use of external sensors.

Merged Reality Multiplayer

At CES 2017, Intel was showing off Project Alloy with a roomscale ‘merged reality’ multiplayer demo. The idea behind merged reality (AKA virtual augmented reality) is to make the virtual world account for the physical environment around you. In the demo, that meant turning the physical objects in the room into virtual objects in the game world that could be used for cover. You can see the experience in action in the video at the top of this article.

After putting on a pair of the fairly bulky prototype headsets, a colleague and I saw a virtual version of the same chair, desk, bookshelf, couch, and coffee table that were in the room we had been standing in. The room was scanned ahead of time using the sensors on the Alloy headset. We were able to physically navigate around the virtual version of the room thanks to the headset’s inside-out positional tracking.

After a few minutes of walking around and inspecting the virtual version of the room, the walls faded away to reveal a vast skybox of distant mountains and clouds. It really did feel like the walls had opened up before us and we had been transported to another realm. Before we knew it, the objects in the room had transformed into geometry that thematically matched the game world; the couch became a big rectangular metal storage bin, the desk and chair became a futuristic metal chair and computer terminal, the bookshelf turned into a door frame, and the circular coffee table turned into a futuristic metal… circular thing. The digital versions were not inch-for-inch facsimiles of the real furniture, but the assets were at least as big as the footprint of the real furniture. There was more virtual geometry added too which didn’t exist at all in the real world, like a computer monitor on a tall mount and a crane-like mechanism perched overhead.

Using 3DOF controllers which were parented to the location of our head, we were able to aim and fire a rifle. The shooting mechanics worked fine, but the lack of positional tracking on the controllers meant it was a simple point-and-shoot affair with no ability to actually raise the weapon to your shoulder and look down the scope to aim properly (as we’ve seen on other VR systems with more advanced VR controllers).
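To illustrate why this feels like point-and-shoot, here’s a minimal sketch of how a head-parented 3DOF weapon might be aimed: the controller supplies orientation only, so the weapon’s position has to be pinned to an assumed offset from the tracked headset. All names, the offset, and the axis convention are illustrative assumptions, not Intel’s actual implementation.

```python
import numpy as np

def aim_ray(head_pos, head_rot, controller_rot,
            shoulder_offset=np.array([0.2, -0.3, 0.1])):
    """Return the origin and direction of the weapon ray for a 3DOF controller.

    head_pos / head_rot (3-vector, 3x3 rotation) come from the headset's
    inside-out tracking; controller_rot is the only data a 3DOF controller
    provides, so the weapon sits at a fixed offset relative to the head.
    """
    # Place the virtual weapon at an assumed offset from the head,
    # rotated into world space by the head's orientation.
    origin = head_pos + head_rot @ shoulder_offset
    # The ray direction comes purely from the controller's orientation;
    # forward is taken to be -Z in this (assumed) convention.
    direction = controller_rot @ np.array([0.0, 0.0, -1.0])
    return origin, direction
```

Because the weapon can never be moved independently of the head, raising it to your shoulder or sighting down a scope simply isn’t possible, which is exactly the limitation we felt in the demo.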

Waves of flying drones came at us and were easily dispatched with one shot each. After clearing a swarm we would advance to the next wave which had a few more enemies. Thanks to the headset’s positional tracking, we could walk around the entire space as we played and duck behind the virtual/real cover. But it wasn’t exactly a high-action experience as the drones weren’t quite competent enough to make us really feel pressured into cover.

After running out of ammo, we needed to find an ammo pickup to replenish the weapon’s clip. I remember inching my way toward the pickups because I just didn’t feel quite confident in the mapping and tracking. Impressive as it was, I wasn’t able to achieve a sense of totally forgetting about the physical objects around me. Inside the demo, it felt as if the scale of the virtual environment was slightly off; when I took a step forward, it didn’t quite feel like I’d traveled the same distance in the virtual world as all my bodily sensors said I had traveled in the real world. That meant taking careful, deliberate steps with each movement.

Just a Demo, but Promising

As a demo and a concept, it was pretty cool to see this working. But there’s still a lot of work to be done to bring this sort of experience to everyone’s home. For one, the objects in the environment were not automatically identified and replaced with virtual objects. The demo appeared to be built for this specific room and this particular arrangement of furniture. Adapting a VR game to any room and any furniture automatically will require some smart game design thinking, especially for anything beyond a wave-shooter where your couch turns from a couch into a virtual metal box for cover.
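One piece of that automation might look something like the sketch below: given the bounding box of a scanned object, pick the smallest themed asset whose footprint fully covers it, so the virtual stand-in is always at least as big as the real furniture (as it was in the demo) and players never clip a real edge the game doesn’t show. The asset library, dimensions, and function names here are hypothetical.

```python
# Hypothetical asset library: (name, width, depth, height) in metres.
THEMED_ASSETS = [
    ("metal_crate_small", 0.6, 0.6, 0.5),
    ("metal_crate_large", 1.2, 0.8, 0.7),
    ("storage_bin",       2.0, 0.9, 0.8),
    ("computer_terminal", 1.4, 0.7, 1.2),
]

def pick_asset(obj_w, obj_d, obj_h):
    """Choose the smallest themed asset that fully covers a scanned object's
    bounding box, so real furniture is never larger than its virtual cover."""
    candidates = [
        (w * d * h, name)
        for name, w, d, h in THEMED_ASSETS
        if w >= obj_w and d >= obj_d and h >= obj_h
    ]
    if not candidates:
        # No themed asset is big enough; fall back to a generic blocker
        # that would be scaled to the scanned bounds elsewhere.
        return "generic_cover_scaled"
    return min(candidates)[1]
```

In practice the harder problem is the one the article hints at: making those substitutions feel like deliberate level design rather than random boxes, across rooms the developer has never seen.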

There’s also work to be done in building confidence in users so they can trust they aren’t going to bump into the real environment. The limited field of view makes knee-high objects like coffee tables and chairs a notable threat, because you’re much less likely to catch them out of the corner of your eye. This is compounded not only by the scale issue I described above, but also because it’s hard to tell exactly where your legs are when you can’t see them in VR (like most VR experiences, I didn’t have an accompanying virtual body beyond my head and gun). With a VR headset like the HTC Vive, the chaperone system is so competent that I can almost completely forget about the real world around me, because I know the system will alert me every time I’m in danger of running into the physical world. That sort of “freedom to forget” is essential for immersive virtual reality.
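The core of a chaperone-style safety layer is simple in principle; a loose approximation is sketched below, fading in a warning as the headset approaches the nearest scanned obstacle. The threshold, data layout, and names are assumptions for illustration, not how the Vive or Alloy actually implements it.

```python
import math

WARN_DISTANCE = 0.5  # metres; an assumed threshold, tuned per experience

def chaperone_alpha(player_xz, obstacle_footprints, warn_distance=WARN_DISTANCE):
    """Return how strongly to fade in a warning overlay (0.0 hidden, 1.0 fully
    visible) based on distance from the headset to the nearest scanned obstacle.

    player_xz is the headset's horizontal position; obstacle_footprints is a
    list of axis-aligned (min_x, min_z, max_x, max_z) rectangles from the scan.
    """
    px, pz = player_xz
    nearest = float("inf")
    for min_x, min_z, max_x, max_z in obstacle_footprints:
        # Distance from a point to an axis-aligned rectangle in the floor plane.
        dx = max(min_x - px, 0.0, px - max_x)
        dz = max(min_z - pz, 0.0, pz - max_z)
        nearest = min(nearest, math.hypot(dx, dz))
    # Fade the warning in linearly as the player closes inside the threshold.
    return max(0.0, min(1.0, 1.0 - nearest / warn_distance))
```

The catch with merged reality is that the warning is only as good as the scan: if the furniture moves after the room is captured, the system warns you about the wrong geometry, which is exactly what happened with the coffee table described below.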

There’s also the added complication that a virtual asset may not perfectly align with the real one. This was demonstrated quite clearly by the coffee table in the middle of the room which—to the dismay of my shins—I bumped into more than once, even when I felt like I was well clear of the virtual counterpart. One of the developers running us through the experience also gave his knee a good whack on the table, at which point he figured out that the table had probably been moved after the scan. This is the sort of thing that needs to be solved if this tech is going to take off in people’s homes.

But of course that’s why all of this is just a demo, and a pretty exciting one at that. In fact, it was the first time I’d done VR multiplayer where both players inhabited the same physical space. There are kinks to work out before this sort of merged reality experience can work well in a wide range of environments, but the vision is promising and could be very compelling with the right execution.
