
Hands On: ‘The Assembly’ Offers New ‘Comfort Mode’, a Multi-Pronged Attack on Sim-Sickness


nDreams, a UK-based games studio producing the upcoming VR title The Assembly, are declaring war on the discomfort commonly associated with first-person VR games. We sat down with Senior Designer Jackie Tetley and Communications Manager George Kelion at Gamescom 2015 to hear more about the game's new 'VR comfort mode'.

Without a HUD, or a cockpit to help obscure some of the vection-inducing motion that can truly turn the stomachs of the best of us, first-person gameplay in VR is still pretty dicey for many. Traditional controls, although right for some who have already developed what’s commonly called “VR legs,” aren’t for everyone, and can bring on the dreaded ‘hot sweats’ that precede sim-sickness.

“I’m happy to say we’re on the forefront of trying to tackle these problems head on. In three or four years time we’ll be able to look back and all of this will be a memory,” Kelion said.


nDreams, developers of the upcoming multi-headset game The Assembly, are keenly aware that the quickly approaching consumer-level VR headsets will bring with them VR veterans and first-timers alike. In response, they’re integrating a number of different control schemes that they hope will let anyone play The Assembly comfortably.

The Assembly and VR Comfort Mode 

The game begins as I’m being rolled through the heart of the Great Basin Desert in Nevada. I’m upright and strapped to a gurney Hannibal Lecter-style, a mechanic Kelion tells me will “cajole people into using their head as the camera.” Because locomotion is intentionally on rails for the first chapter, my only choice is to observe the scene around me.

We come to the mouth of The Assembly's bunker, the underground home to a clandestine group of amoral scientists and researchers, the sort of people that ran Japan's Unit 731 back in WW2. A man and woman talk openly about who I am, and how long they think I'll last in the group. We roll closer and closer, eventually passing a dead crow lying face down in the sand. I'm scanned by two security cameras and let into a service elevator past heavy blast doors, never once glimpsing my captors, who, judging from the dialogue, were fans of my medical work. Aha, so I'm a doctor. We go down several levels, passing open windows showing patients, each successively worse off than the last. Coming to a halt, we roll out of the elevator shaft and into a vast complex. They've already said too much, and they drug me once more.


For the next chapter, Senior Designer Jackie Tetley gave me the choice. Did I want to play traditional style, where the gamepad's right stick smoothly controls yaw (a scheme John Carmack calls "VR poison"), or did I want to give the new VR comfort mode a spin?

Choosing comfort mode, I was dropped back into the bunker, this time in a closed-off lab complete with emergency eye-wash stations and medical storage of all types. Here I could move around and test it out.

See Also: Stanford Unveils ‘Light Field Stereoscope’ a VR Headset to Reduce Fatigue and Nausea

In place of the traditional FPS gamepad control scheme, the right stick on the PS4 gamepad now acted as a 'snap-to' camera, rotating me instantly 90 degrees to my left or right. Although not ideal for maintaining a contiguous feel in the world, my stomach thanked me for the option to forgo the sickening smooth turns that I know first-hand can put you off VR for the rest of the day.

In comfort mode, head tracking also gently pulls my character left or right according to where I'm looking, much as you would turn in real life. This is far more subtle, and it eventually faded into the background as I explored the space for any clue as to how to get out, and who I was.
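
To make the design concrete, here's a minimal sketch of how snap turning and gaze-follow rotation could be combined. The angle, deadzone, and follow rate are illustrative guesses, since nDreams haven't published their implementation:

```python
import math

SNAP_ANGLE = math.radians(90)   # instant 90-degree turns, as in the demo
STICK_DEADZONE = 0.7            # require a deliberate stick flick
GAZE_FOLLOW_RATE = 0.5          # radians/sec of gentle body re-alignment

class ComfortTurning:
    """Hypothetical snap-turn plus gaze-follow comfort scheme."""

    def __init__(self):
        self.body_yaw = 0.0
        self.stick_released = True  # debounce: one snap per stick flick

    def update(self, stick_x, head_yaw, dt):
        # Snap turn: rotate instantly rather than smoothly, so the eyes
        # never see sustained rotational motion (the vection that turns
        # stomachs).
        if abs(stick_x) > STICK_DEADZONE and self.stick_released:
            self.body_yaw += math.copysign(SNAP_ANGLE, stick_x)
            self.stick_released = False
        elif abs(stick_x) < STICK_DEADZONE / 2:
            self.stick_released = True

        # Gaze follow: drift the body toward the head's look direction,
        # slowly enough that the correction fades into the background.
        error = (head_yaw - self.body_yaw + math.pi) % (2 * math.pi) - math.pi
        max_step = GAZE_FOLLOW_RATE * dt
        self.body_yaw += max(-max_step, min(max_step, error))
        return self.body_yaw
```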


The last comfort mode mechanic was a teleport option that lets you pre-select the way you want to face, so that you can jump anywhere and immediately look in the direction you need to. Tetley tells me that it still isn't perfected, as the mode can break the game if you teleport on top of boxes or other items, since the game doesn't incorporate jumping. I can't say I liked using it personally, because it brings up (necessarily so) an icon, which in the current build is a floating metal ball studded with arrows. Select an arrow with your right stick, and change its distance with your left. The floating icon was easy to use, and teleporting was by no means jarring, but the icon itself was physically intrusive in the space. Thankfully though, all of these control schemes are optional, and can be ignored or utilized according to individual preference.
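
Mechanically, that kind of teleport reduces to computing a destination and applying a new yaw in a single frame. A minimal sketch, assuming a simple polar aiming scheme (the names and conventions are mine, not The Assembly's code):

```python
import math

def teleport_with_facing(pos, aim_yaw, distance, facing_yaw):
    """Compute a teleport destination plus a pre-selected orientation.

    pos is (x, y, z); aim_yaw and distance place the destination marker,
    and facing_yaw is the direction picked on the marker's arrows.
    """
    dest = (pos[0] + distance * math.sin(aim_yaw),
            pos[1],
            pos[2] + distance * math.cos(aim_yaw))
    # The move is applied in one frame: an instantaneous cut produces no
    # perceived self-motion, which is what keeps it comfortable.
    return dest, facing_yaw

# Example: blink 3 m ahead and arrive facing 90 degrees to the right.
new_pos, new_yaw = teleport_with_facing((0.0, 1.7, 0.0), 0.0, 3.0, math.pi / 2)
```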

Kelion and Tetley assured me that none of this is final, however; they want to include a wider range of movement controls in the game at launch to satisfy more gameplay styles.


But what about large tracking volumes that let you actually walk in the virtual space, a la HTC Vive and the new Oculus Rift? Isn’t that a sort of ‘VR comfort mode’ too? Kelion responded:

“The truth is when it comes to the HTC Vive, this is something we’ve been implementing for control pads for Oculus, and for Playstation Morpheus. We’ve had the hardware for longer. We also know the Oculus is shipping with the pad and not Touch, and we don’t even know what the Morpheus is shipping with. We have to assume in terms of utility that we need to make it work well on the gamepad.”

Although such support isn't in the current build, nDreams doesn't put HTC's Lighthouse-tracked controllers or Oculus' Touch out of the realm of possibility.

nDreams has targeted all major VR headsets for The Assembly‘s release, including HTC Vive, Oculus Rift, and Playstation Morpheus.

For an updated gameplay close-up with commentary by Senior Designer Jackie Tetley, take a look at the video below.



Cloudhead Games' 'Blink' to Bring Nausea-free VR to HTC Vive this Holiday


Cloudhead Games, the developers behind one of the first successfully Kickstarted dedicated VR games, have announced 'Blink', a locomotion system they claim eliminates nausea entirely from VR gaming.

VR hardware is improving, and display techniques such as low persistence, pioneered by Oculus and Valve over just the last couple of years, mean that viewing imagery inside VR headsets is a whole lot more comfortable these days.


But if nausea and VR sickness brought on by lacklustre hardware are quickly becoming a thing of the past, why is nausea still an issue for today's VR game designers? It turns out that once you've removed the hardware barriers to perceiving virtual worlds, you're left with much more fundamental issues to deal with at the software design level, and chief among them is locomotion inside VR.

The Oculus Rift DK2 brought us optical positional tracking, freeing our heads and bodies from a purely rotationally tracked virtual world. The system allowed free movement within the confines of the tracking camera’s FOV, which when positioned correctly was relatively generous. The HTC Vive and its Lighthouse laser tracking system expanded upon that freedom, introducing what they termed ‘room scale’ tracking – pitching their VR system as one you could explore both physical and virtual spaces with.

The Oculus Rift DK2 and Positional Tracking Camera

But no matter how good your tracking is, you'll always be constrained by the physical world your body occupies, especially when it comes to moving around a virtual space with boundaries beyond your physical environment. This means designers must find unnatural ways to move players around their virtual worlds. Using a gamepad's analogue sticks to move around is one traditional gaming technique of course, but one that causes jarring feelings of disconnection in virtual reality. Essentially, your inner ear disagrees with your eyes – you're moving in virtual space but not in physical space.

HTC Vive and Room-scale tracking

Cloudhead Games, who have been grappling with the difficulties and intricacies of VR input since their appearance on the scene back in 2013, now claim they have a solution to VR nausea, physical playspace restrictions and locomotion with a system they’re calling ‘Blink’.

Subtitled 'Elastic VR Playspace', the solution comprises a number of techniques (outlined in the video above, presented by Cloudhead Games' Denny Unger) which, when combined, tackle all of the above issues: Cinematic Blink, Precision Blink and Volume Blink – a "locomotion mechanic which ensures player safety, deep traversal and complete spacial awareness in both the virtual and real world."

Essentially, Blink employs a point, click and teleport system which removes the uncomfortable elements of virtual travel by instantly snapping you to a position and orientation you define, sidestepping the issue altogether. The technique works hand in hand with Cloudhead's elastic playspace, a system which subtly reminds the player of their custom-sized playspace with visual cues within the game.

The techniques described in the video will debut in the first episode of the studio's forthcoming saga The Gallery. This episode, newly revealed as 'Call of the Starseed', is scheduled for release alongside the HTC Vive, currently pegged for delivery towards the end of 2015.

You can find out more about The Gallery saga over at the game’s website here.


Manus Machina Begins to Ship Early Wireless VR Gloves to Developers


Manus Machina, a company dedicated to virtual reality input devices, has begun shipping their first sets of wireless VR gloves to developers.

Roaming the halls of Birmingham’s NEC covering EGX 2015 today, I bumped into the Manus Machina team, who gave me an impromptu demo of their prototype wireless input gloves, and an update on their development progress.

Manus’ PR Officer Bob Vlemmix happily informed me that the team had begun shipping their early developer kits to a selection of developers, meaning the work on content creation for the devices can now begin.

Manus' latest developer wireless input gloves being packed for shipping

Scott Hayden went hands-in with the new gloves back at Gamescom in August; you can check out his impressions here. The devices use a combination of per-finger sensor strips and IMUs to track your digits and hands in virtual space.
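
To illustrate how a per-finger sensor strip might drive a virtual hand, here's a hedged sketch of a typical bend-sensor calibration step (the function and values are hypothetical, not the Manus SDK):

```python
def finger_bend(raw, raw_open, raw_closed):
    """Normalise a flex-strip reading into a 0..1 curl value.

    raw_open and raw_closed come from a quick per-user calibration pose
    (hand flat, then fist). Illustrative values only.
    """
    span = raw_closed - raw_open
    if span == 0:
        return 0.0
    return min(1.0, max(0.0, (raw - raw_open) / span))

# A curl value can then drive the finger joints of a hand model, while
# the IMU supplies the hand's overall orientation.
print(finger_bend(512, 300, 900))  # ~0.35: partially curled
```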

According to the team, developer interest in the product has picked up significantly, with some big names potentially interested in working with the startup.

There are hardware improvements too, in the form of a new IMU which reduces yaw drift, an effect that can throw off your virtual hand’s orientation until recalibrated.
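
Yaw is the one axis an IMU can't correct from gravity alone, which is why it drifts over time. A common remedy is to fuse the gyro with an absolute heading reference such as a magnetometer; here's a generic complementary-filter sketch of the idea (illustrative only, not Manus' actual fusion code):

```python
import math

def fuse_yaw(prev_yaw, gyro_rate, mag_yaw, dt, gain=0.02):
    """Integrate the gyro for responsiveness, then nudge the result
    toward the magnetometer heading to cancel slow drift."""
    yaw = prev_yaw + gyro_rate * dt                        # fast, but drifts
    error = (mag_yaw - yaw + math.pi) % (2 * math.pi) - math.pi
    return yaw + gain * error                              # slow correction
```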

All in all, it was a positive update, and despite some known issues with positional accuracy, the team seem to be making good progress.


New Oculus Touch ‘Toybox’ Videos Show Gestures, Sock Puppets, Shrink Rays, and More


Oculus describes Toybox as an internal test bed for the company's 'Touch' motion input controllers. The company is polishing up the experience to give players a sandbox environment that shows off both multiplayer VR and the capabilities of the Touch controllers. A new video from Oculus gives us a glimpse inside Toybox.

Oculus first showed off Toybox with their Touch controllers at E3 2015 where the combination of multiplayer social interaction and intuitive motion input wowed those lucky enough to try it. At the time the company wasn’t releasing any official footage of the experience.

See Also: Hands-on – Oculus Touch is an Elegant Extension of your Hand for Touching Virtual Worlds

At the company's second annual Connect developer conference, Oculus ran Touch demos through the night, allowing a much larger swath of developers to see Toybox and other Touch experiences.


Inside Toybox, players see one another as Rift-wearing, blue, body-less avatars hovering over virtual table tops piled high with myriad objects, all there to demonstrate to you, the user, the joy of accurate, naturalistic virtual reality input.

However, Toybox is much more than a mere technical showcase for Oculus' excellent proprietary input devices; it goes a long way toward countering the widely held opinion that virtual reality is a solitary, isolating experience – one in which the player shuts themselves off from the real world and its inhabitants. In Toybox, with voice comms in place, the enjoyment of this virtual playground is enhanced immeasurably by the presence of your Oculus Touch-wielding companion.

Up to now, Oculus have been extremely shy about journalists capturing footage of the ‘programmer art’ filled demonstration. But now, two videos direct from Oculus not only demonstrate what Toybox looks like, they allow you to see how real world actions are interpreted by Oculus Touch.

Two-handed interactions are fluid and intuitive, and Touch's unique gesture sensing features are on show here, allowing in-game hands to represent articulation in both thumb and forefinger. As trivial as that may sound, as you can see from the footage, seeing those gestures in VR not only enhances communication, it also elevates the sense of shared presence in the virtual world, providing subtle, humanistic cues that pull you into the world.

Oculus Touch will ship separately from the Rift, the latter shipping in Q1 2016 with the former following later in Q2. Emphasis on Oculus Touch and its capabilities was, predictably, very heavy at Oculus’ recent developer conference Connect, with new games from Oculus Studios and Epic demonstrating the unique experience dedicated VR input devices can provide.


Candid Reactions to the Oculus Toybox Demo and Social Presence


The Oculus Touch Toybox demo was shown to more people in a single day at Oculus Connect 2 on September 23rd than ever before. This was the first time that a lot of developers were able to get their hands on the 'Half Moon' prototype VR input controllers. But more importantly, it was a watershed moment for so many developers to be able to experience social and emotional presence with another person within virtual reality. It became less about the technology and tech specs, and more about the experience of playing, having fun, and connecting to another human in ways that were never possible before. The Toybox demo felt like a real turning point and "aha!" moment for a lot of VR developers, showing how compelling social experiences in VR are going to be. I had a chance to capture some candid reactions from Ken Nichols and Ela Darling moments after they experienced Toybox for the first time.

Listen:

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.


Overlaid Videos Show Just how Accurate Oculus Touch Controls Are


Oculus Touch tech demonstration 'Toybox' made its public video debut a couple of days ago, highlighting the sheer fun that can be had in VR with intuitive motion controllers. The original videos used a side-by-side in-game and real-world POV technique to illustrate how Touch can be used; in this latest video, those views are overlaid on top of each other. It illustrates quite elegantly the accuracy with which Oculus' Touch controllers respond to real-world input.

Thanks to Scott McGregor for creating the video.

See Also: Hands-on: Oculus Touch is an Elegant Extension of your Hand for Touching Virtual Worlds


Manus VR Experiment with Valve’s Lighthouse to Track VR Gloves


The Manus VR team demonstrate their latest experiment, utilising Valve’s laser-based Lighthouse system to track their in-development VR glove.

Manus VR (previously Manus Machina), the company from Eindhoven, Netherlands dedicated to building VR input devices, seem to have gained momentum in 2015. They secured their first round of seed funding and shipped early units to developers, and now their R&D efforts have extended to Valve's laser-based tracking solution Lighthouse, as used in the forthcoming HTC Vive headset and SteamVR controllers.

See Also: 10 Things You Didn't Know About Steam VR's Lighthouse Tracking System

The Manus VR team seem to have cannibalised a set of SteamVR controllers, leveraging the positional tracking of the wrist-mounted units to augment Manus VR's existing glove-mounted IMUs. Last time I tried the system, the finger joint detection was pretty good, but the Samsung Gear VR camera-based positional tracking understandably struggled with latency and accuracy. The experience on show here seems immeasurably better, perhaps unsurprisingly.


It'll be interesting to see where Manus take this experimentation from here. Valve's Alan Yates, architect of the Lighthouse tracking system, has said the intention is that anyone who wants to license Lighthouse should be able to, as long as it isn't implemented in a way that "violates the standard". Manus are one of the first companies to work towards doing that, and we'll be watching to see how easy they find the process.


Samsung to Demo Gear VR Motion Controller ‘rink’ at CES 2016


The Samsung and Oculus engineered Gear VR mobile headset is now available at retail in many countries, but unlike its desktop brethren from Oculus and Valve/HTC, it currently relies on basic, standard input for its applications. Samsung look to be changing that in 2016, as they prepare to demo a new motion controller designed for more intuitive control in virtual reality. It's called Rink, and it'll be shown at CES next week.

With the first generation of consumer VR headsets flooding into the market in the first half of 2016, the question of input inside virtual spaces will loom large next year. Valve have their Lighthouse-tracked SteamVR controllers, due to ship with HTC's Vive in April 2016, and Oculus aren't far behind with their optically tracked Touch controllers in Q2 2016. Up until now though, those looking to interact with Gear VR content have had to make do with traditional wireless gamepads, gaze control using the head's orientation, or the unit's integrated touchpad.


It seems Samsung are keenly aware of this shortfall in delivering a compelling mobile VR experience, as they're due to demonstrate a prototype motion controller at CES 2016 designed to bridge the gap. It's called 'rink', and we don't know much about it at this stage, other than that it's being developed by Samsung's C-Labs R&D division.

What we can ascertain or guess (as you can see from the image above) is that it looks to be a wireless, single-handed peripheral into which your hand slips, as if holding a very large cup or mug. Samsung's news release doesn't offer many details, simply describing the device thus:

rink is an advanced hand-motion controller for mobile VR devices which offers a more intuitive and nuanced way to interact with the virtual world. The ability to intuitively control the game or content just by using their hands provides consumers with a much deeper level of mobile VR immersion.

The released image invites further speculation though, as the wearer seems to have an additional box mounted atop the Gear VR headset, which might be an optical device used for tracking the peripheral, or perhaps a wireless hub that processes input and passes it on to the phone powering the Gear VR experience. All wild speculation at this point of course, and we'd love to hear your thoughts and theories in the comments section.

Either way, we’ll be at CES 2016 next week to try and find as many details as possible and hopefully get our hands on the new device.



Close Up With ‘Rink’, Gear VR’s Prototype Motion Controller


We caught up with the developers of Rink, a new wireless motion controller that uses a combination of magnetic field enabled positional tracking and infra-red finger tracking.

Here's a closer look at the new motion controllers which we wrote about recently, in development as part of the Samsung C-Labs incubator program, an initiative to encourage exploration and development of new technologies. It's the brainchild of Yongjin Cho, a Senior Engineer at Samsung's Creativity Lab.

We have a more detailed hands-on coming soon, but we thought you might like to take a high-resolution look at the controllers, which are surprisingly well finished for prototype hardware.

The rink system comprises two handheld 'money clip' style controllers and one basestation. The two hand units connect to the Gear VR via Bluetooth LE; the basestation is, in essence, a dumb magnetic field generator. The handheld controllers do most of the work, using the relative strength of the magnetic field generated by the basestation to gauge their position in space.
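
That design implies a simple ranging principle: a dipole-like field falls off roughly with the cube of distance, so measured field strength maps to range. A back-of-the-envelope sketch, assuming an ideal inverse-cube falloff (Samsung haven't published rink's actual model):

```python
def range_from_field(b_measured, b_ref, r_ref=1.0):
    """Estimate distance to the basestation from field magnitude.

    If b_ref is the field strength measured at a known distance r_ref
    (metres), an inverse-cube falloff gives the current range. Real
    systems also use per-axis readings to recover direction; this only
    recovers distance, and all numbers are illustrative.
    """
    return r_ref * (b_ref / b_measured) ** (1.0 / 3.0)

print(range_from_field(b_measured=12.5, b_ref=100.0))  # -> 2.0 metres
```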


In addition, the rink controllers have IR LEDs and sensors that fire towards your fingers and detect the flexing of your digits.

We'll be back with our impressions later, but in the meantime we'll leave you with these high-res shots to pore over.


Using Your Body as an Input Controller with OBE Immersive


Linda Lobato is the CEO and co-founder of OBE Immersive, a wearable tech start-up that is part of the current Rothenberg River Program. She previously raised $77,000 to kickstart a MIDI Controller Jacket, and after seeing an early Oculus prototype in Korea she decided that VR was the next frontier for wearable technology. I caught up with Linda at a Rothenberg demo day, where she talks about their progress in creating a jacket that turns your body into an immersive input controller within a first-person shooter. It's still in the early stages of development, but they hope to launch a Kickstarter later this year.

LISTEN TO THE VOICES OF VR PODCAST

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.


Multiplayer ‘Siege VR’ Prototype Highlights Solid STEM Tracking Performance


At GDC 2016 we got to take a look at the latest STEM VR controllers from Sixense. With units purportedly on track to begin shipping next month, the company is also committing resources to create a promising cross-platform multiplayer experience called Siege VR which could turn into something much bigger than a mere demo.

Throughout STEM's unfortunately troubled development timeline, one thing has surprised us: Sixense has always had highly functional and fairly well polished demo experiences for their motion controllers. That probably comes with the territory; after all, the company designed the tech behind the Razer Hydra, which hit the market years before motion input controllers like Oculus Touch and the Vive controllers had even been announced. They also created the Portal 2: In Motion DLC, which brought 20 new levels specifically built for motion controllers to the game.

Sixense has been melding VR and motion controls since the DK1 days

So I suppose it shouldn’t be surprising after all that the many little tech demos they’ve made over the years to show off motion input mechanics have felt ahead of their time (see this one from 2014).

With that in mind, it was great news to hear at GDC 2016 last week that the company not only plans to finally ship the first STEM units to backers of their 2013 Kickstarter campaign, but they’re also developing a new game called Siege VR that’s going to have cross-platform support among headsets and motion controllers.

I Don’t Care What Platform It’s on, I Just Want to Play Siege VR

Unlike many of the short STEM experiences we’ve seen from Sixense over the years, Siege VR is more than a demo. What we saw at GDC was a prototype of what the company says will become a full game, which will be free to all backers of the STEM Kickstarter and will also be made available for the Oculus Rift, HTC Vive, and PlayStation VR (whether that’s using each platform’s own VR controllers or STEM).

See Also: Sixense Releases 5 STEM Demos and SDK Compatible with Razer Hydra

Siege VR is a first-person multiplayer castle defense game which (in prototype form) had me and an ally wielding bows side-by-side, trying to stop hordes of enemies from reaching the castle gates. In addition to regular arrows, there are two special arrow types: an explosive arrow (from the quiver seen on the left wall), which explodes on impact when ignited by a nearby torch; and an artillery arrow, which fires a smoke round that designates a target for your friendly catapult to fire upon. The former is great for taking out dangerous groups (including enemy archers who will aim for you directly) and the latter works against enemy catapults. The special arrow types regenerate over time, but you'll want to use them sparingly, especially as the artillery arrow stockpile is shared between both players.

The game is still very early, but the creators say they’re considering having many more than two players all working together to defend the castle. Perhaps—they conjectured—there would be teammates at more forward towers who could aim back at the castle to prevent enemies from scaling the walls, while the players on the wall itself would be focused on enemies in the field. Maybe—it was suggested—it could be a multi-stage experience where, if the enemies break through the main gate, you and your team fall back to using melee weapons.

Some earlier prototypes included the ability to pour buckets of boiling oil onto would-be castle crashers, though that and some other features were cut for the time being to add a bit of simplicity and polish for the GDC demo.

Between the lot of us excitedly chattering about ‘what about [insert super cool gameplay]? Or how about [more super cool gameplay]?’ it was clear that Siege VR could have legs well beyond a simple demo, and that’s where Sixense says they plan to take it.

Forgetting It’s There is a Good Thing

As I played Siege VR using STEM with a Rift DK2, I got that wonderful feeling of forgetting about the technology and simply having fun playing the game. That means that everything was working together to make a fun and intuitive experience which kept me immersed. When I came out, the top of my mind was filled not with questions about STEM, but about the scope and potential of Siege VR.

STEM motion input controller with three tracking modules

STEM itself was integral to getting us to the stage of talking not about limitations, but about possibilities for Siege VR. I’ve used the system at many points along its oft-delayed development, and while it’s always felt good, this time around it felt better than at any point in the past; even after using Touch and Vive controllers all week throughout the rest of GDC.

For me the thing that moved the needle most significantly was the headtracking performance. STEM has additional tracking modules which can be affixed to head and feet (or elsewhere, up to 10 tracked points). For their demos Sixense often eschews the Rift's own headtracking in favor of a STEM tracking module. Having used the Rift plenty, it always felt like there was something a little 'off' about the STEM-based headtracking—whether it was latency or positional accuracy, I'm not quite sure. But this time around I actually had to ask whether they were using the Rift's tracking camera or STEM: it was 100% STEM.

I point to headtracking because it's easier to tell when something isn't right with your head than with your hands; slight drift of a few millimeters or inaccuracy on your hands can be very hard to spot. When the placement of your virtual eyes depends entirely on the tracking though, it's easy to feel when things aren't working right. So what I'm saying is that because the headtracking was solid, the rest of STEM's tracking is solid too (as there's no difference between tracking a module on your head and one on your foot).

Particularly in an electromagnetically dense setting—like, say, the middle of the GDC expo floor—which can mess with the magnetically-based tracking, it was impressive that the headtracking felt that good. In fact, Sixense’s booth had a number of STEM basestations scattered about; there was one just a few feet away from the one I was using, and I didn’t spot any interference-based tracking issues despite competing magnetic fields.

Sixense isn't trying to quarantine itself from the competition either. The company had both the Oculus Rift and HTC Vive (with Vive controllers) in action at their booth, and say their SixenseVR SDK will allow developers to create games that are ready for any motion controllers, not just STEM. The SDK allows for "code-free setup" for humanoid characters in Unity and Unreal Engine, and provides a full body skeletal pose based on sensor data.


Manus VR’s Gloves in Action Using Valve’s Lighthouse Tracking for Hands and Arms


We’ve been tracking Manus VR, the start-up dedicated to producing an intuitive VR glove input device, for a while now. The team were present at GDC in March, showing their latest prototype glove along with their in-house developed game Pillow’s Willow.

One of the privileges of writing for Road to VR is watching small, independent start-ups with great ideas blossom rapidly inside the VR space. Manus VR is one such company, having moved from crude-looking, Gear VR-based early designs through to slick, functional iterations, as shown at March's GDC running on desktop PC hardware alongside the HTC Vive.

Early Manus Glove Prototypes

Manus VR brought along their latest gloves to the show, along with an early version of their showcase game with integrated support for the gloves – footage you can see in the embedded video at the top of this page.

Manus is using Lighthouse integration, albeit somewhat crudely at this stage, by literally strapping a SteamVR controller to each of the player's arms. Positional tracking from those controllers is then fused with input from each finger to provide a fairly intuitive-looking way to interact. If you check the video embedded above, you'll see the player grab objects from the world, rotate them, and pass them from hand to hand, all pretty seamlessly.

Manus have also recently released a sneak preview of their research into getting those disembodied hands tied to arms, for a less disconcerting projection of your digital self in VR, as seen in the below video.

Manus VR have already announced that their first developer kit gloves will ship in Q3 2016 and will set you back $250. Developers can put themselves down for a pre-order reservation now; for your money you'll receive a pair of Manus gloves, plugins for Unity and Unreal Engine, and access to the SDK for Android, Windows 7+, Linux, iOS 9, and Mac OSX.


Why CES 2017 Will Set the Stage for the Next Year of VR


The annual Consumer Electronics Show (CES) has served as a good way-point for VR’s progress over the years. With CES 2017 kicking off next week, here we take a look back at the highlights (and low-lights) from 4 years of VR at CES to gauge how far the industry has come and look for clues as to where it goes from here.

Wedged somewhat inconsiderately at the very start of every year (it's OK CES organisers, no one in the tech industry has a family they want to spend time with), the annual Consumer Electronics Show held in Las Vegas is still the biggest hardware event in the world: a swirling mass of corporate marketing excess, and the single platform showcasing the best (and worst) new gear from around the world that will vie for our attention in 2017 and beyond. Virtual reality has of course figured prominently at the event in recent years, quickly rising to become one of the show's hottest technologies. With that in mind, and with CES 2017 imminent, we thought we'd take a look back at the notable VR events from past shows, charting VR's progress to the present day.

CES 2013 / 2014: The Early Years

Following the advent of the Oculus Rift in 2012, Oculus attended the show for the first time in 2013 to show off their pre-production Rift headset prototype ahead of the DK1 launch, on the heels of their wildly successful Kickstarter campaign. Press response to the closed-doors meetings was almost universally positive. Road to VR was still in its infancy at the time, but Tested.com went hands-on with an interim Rift prototype at the show, along with giving us a glimpse at the near-complete Rift DK1 design that would ship to Kickstarter backers later that year. The demonstration included the now familiar Unreal Engine 3 powered citadel scene, which would become the setting for one of the most famous early VR applications of all time, Rift Coaster. The Rift had of course been covered by media before, most notably when id co-founder John Carmack showed an early, modified Oculus Rift prototype at E3 2012, sent to him by the device's inventor (and future Oculus VR founder) Palmer Luckey. CES 2013, however, gave us the first glimpse of Oculus VR operating as a company.

Oculus' Pre-DK1 Prototype, shown at CES 2013

The following year at CES 2014, Oculus had to share the immersive technology limelight with a slew of new startups who had appeared in the wake of Oculus’ success. The unique (and formerly Valve developed) retro-reflective-powered CastAR system gave us a glimpse at one of augmented reality’s possible futures; Avegant turned up with their bizarre yet technically impressive personal media player the Glyph; PrioVR had their new entry-level motion tracking / VR input system to try.

But the star of the show remained Oculus who, having grappled with the DK1's biggest technical flaws, showed their latest prototype, which resolved two of them in one fell swoop. The Crystal Cove headset featured a cluster of IR LEDs and a tracking camera to provide positional tracking, and also introduced low persistence displays. Both advancements provided a vast improvement in user experience, and provided the baseline technical platform for the consumer Rift when it appeared in 2016. A few months after CES 2014, Facebook would acquire the company for $2Bn.

The Oculus Rift Crystal Cove prototype VR headset and tracking camera as shown at CES 2014

CES 2015: Consumer VR Takes Shape

CES 2015 brought yet more impressive advancements from the VR and AR fields, and some notable setbacks. The huge uptick in interest surrounding VR technology inevitably drew opportunistic businesses looking to join the bandwagon with minimal work.

The most infamous example is, of course, the legendary 3DHead system. This was a product which purported to offer a full-fledged VR experience with no lenses and mostly off-the-shelf technology. Backed by the eccentric billionaire Alki David, the product's aggressive (and, as it turned out, misleading) marketing had already drawn ire from the VR community, adopting as it did taglines like "Oculus killer" in its promotional material and then booking a booth directly next door to Oculus themselves sporting those same slogans. The headset itself was enormous – somewhat akin to the head of H. R. Giger's Alien, although somehow less attractive – and the advertising was painfully bad, but we nevertheless did our best to keep an open mind. Inevitably however, after Ben Lang tried 3DHead for himself, and subsequently interviewed the seemingly sincere James Jacobs (at that time COO of the operation), it was clear 3DHead was at best a terrible product and at worst a complete sham. Watch the interview for yourselves below (along with Ben's write-up of his experience), but needless to say Ben's original and succinct summary of his experience was right on the money; it was indeed, "beyond bad".

Elsewhere however, things were looking much more promising. Oculus had once again brought along its latest prototype, the Oculus Rift Crescent Bay. Unveiled originally at the company's first developer conference, Connect, in September, Crescent Bay gave us what we now know to be a pretty good sneak peek at the device that would eventually ship in March the following year. It had integrated, high-quality headphones (supported by a custom inline DAC and amp), lightweight construction, and rear-mounted 'Constellation' infra-red LEDs for 360 degrees of positional tracking with a single camera sensor. We would later learn that the device (like the consumer version) sported dual OLED panels and Fresnel lenses, quite a departure from all the Rift devices that preceded Crescent Bay. For the first time, Oculus had shown a device that looked like a consumer product.

The Oculus Rift Crescent Bay Prototype

CES 2015 was also the first for Samsung's 'Oculus powered' mobile VR headset Gear VR, which had been unveiled, to impressed reactions, a few months earlier at IFA Berlin. The then Galaxy Note 4 powered device featured heavily at Oculus' booth, both in front of and behind the scenes. It was clear that Oculus, thanks in no small part to its CTO John Carmack, was serious about the future of untethered, mobile virtual reality.

A new VR headset challenger also entered the ring at 2015's CES, one which promised to eschew the proprietary, walled-garden approach Oculus had adopted and open up both the hardware and software for developers to tinker with. The Razer-fronted Open Source Virtual Reality (OSVR) platform was announced along with its very first flagship hardware, the Hacker Development Kit (HDK for short). This was a headset designed to be pulled apart, redesigned, put back together, and shared with the community. Built atop an open source set of APIs, the platform was a refreshing take on how to deliver immersive technology. Although it left a little to be desired in the overall experience compared with the Rift, it was encouraging to see such a fresh approach.

The OSVR HDK Headset

Unbeknownst to most (with nary a whisper uttered at 2015's CES), Valve and HTC were working in secret on a virtual reality system that would shake up the fledgling VR industry and present Oculus with their first serious competitor in the PC VR space. At MWC in Barcelona in March that year, HTC unveiled the Valve/SteamVR-driven Vive headset, and arguably went on to dominate the Game Developers Conference (GDC) which followed. The Vive was powered by Valve's new laser-based room-scale tracking technology 'Lighthouse', and gave many people their first taste of presence thanks to the system's then-prototype motion controllers, which demonstrated an unprecedented level of input fidelity for the time. Vive's entrance would help shape the conversation around what we should expect from consumer virtual reality throughout 2015 and beyond.

The HTC Vive (DK1), SteamVR Controllers and Laser Basestations

Sony however, having debuted its PlayStation 4 powered 'Morpheus' (later re-christened PlayStation VR) headset at GDC in March of 2014, was largely absent from 2015's CES, with the company focusing more heavily on its more traditional consumer electronics lines. Sony would nevertheless go on to push Morpheus hard at gaming shows throughout 2015 such that, by the close of that year, PlayStation VR had become one of VR's best hopes of reaching a mass market audience.

Ben Lang trying out the Sixense STEM at CES 2015

Sixense also showed off the latest iteration of their STEM motion controller. The company had run an extremely successful Kickstarter campaign in 2013, riding the wave of interest in VR and aiming to plug the gap for VR-centric controllers. At CES 2015 the company demonstrated a new version which integrated IMUs to tackle the tracking drift and distortion inherent in the device's electromagnetic tracking system. It was impressive stuff at the time, and Sixense was confidently contemplating shipping finalised devices to Kickstarter backers later that year. Alas, at the time of writing this piece, and thanks to a series of frustrating delays, the company is yet to fulfill that promise.

CES 2016: Consumer VR Wars

All of that brings us to 2016, and a CES which marked the beginning of what would turn out to be VR's most important 12 months so far. Most industry observers (including us) had expected the first generation of consumer virtual reality to arrive in 2015. The hardware felt ready, and there had even been indications to that effect from the company largely responsible for VR's resurgence, Oculus. As it turned out, we had to wait until CES 2016 to learn when we could pre-order the world's first consumer desktop VR headset. Oculus announced that pre-order sales would go live during CES itself (which posed some logistical issues for those of us covering the event and wanting to get our hands on a Rift, let me tell you). On January 6th 2016, Rift pre-orders went live, with headsets expected to ship a couple of months later in March. Also, as a nod to the company's roots, and as a (largely unprecedented) "thank you" to the original Rift Kickstarter backers that launched the company, Oculus announced that every supporter would receive a free consumer Rift. Sadly, the Rift's launch would become mired in familiar shipping difficulties, blamed in part on a component shortage and exacerbated by some seemingly poor logistics management.

The final Oculus Rift consumer edition

Throughout 2015, the Rift's biggest competitor, the HTC Vive, had made phenomenal gains in public awareness and word-of-mouth PR. Its room-scale credentials and those precisely tracked SteamVR motion controllers had been demo'd around the world, and its particular flavour of immersive interactive entertainment was a big hit. Oculus' handicap – its resolutely seated/standing experience focus and (most importantly) its lack of dedicated out-of-the-box tracked motion controllers – helped Valve and HTC present the Vive as the first 'complete' VR solution, and people were buying into the idea that room-scale VR might be the future, albeit one which many may not have the room for. In any case, Vive's launch in April 2016 kickstarted the formation of Oculus and Vive factions, ushering in the dawn of the VR format wars with partisan arguments strongly reminiscent of every console generation past.

The Vive Pre (right) versus the Vive DK1 (left)

To highlight the rapidity with which the Vive was approaching consumer reality, HTC took the opportunity to demonstrate what would turn out to be the HTC Vive's final form. We'd already seen various iterations of the Vive developer kits – in fact Valve showcased the hardware's evolution as part of its unveiling – but at the show HTC debuted the Vive Pre, sporting some significant hardware enhancements over its predecessors. The Pre packed in a new front-facing camera which allowed users to glimpse the real world from within VR. It also came with Mura correction, a process to help minimise artefacts and disparity between the unit's OLED display panels, and it was notably smaller than what had come before. The Pre was, to all intents and purposes, what the retail Vive would turn out to be when it launched just a few months later in April. It was an encouraging show of readiness from HTC then, although perhaps a far cry from the "very, very big breakthrough" teased by the company's CEO just a couple of weeks prior.

The HTC Vive Pre at CES 2016

On the VR input side of things, Virtuix looked in bullish form at the event, with a generously sized stand and a dedicated multiplayer event featuring four Omni treadmills and a new in-house developed FPS for people to compete in. We got our feet on the new and improved omni-directional treadmill Infinadeck and, despite its gargantuan size and weight, came away intrigued by its unique take on VR locomotion. Equally quirky, the intriguing 3DRudder foot controller made an appearance in its final form at the show and announced it would go on sale in 2016. We also got hyped for an experimental input device which promised to bring electromagnetic field powered positional tracking to Samsung's Gear VR. Alas, the Rink controllers were early prototypes, and once we found them we were disappointed by the performance. We've not heard anything of them since.

Eye tracking finally began dovetailing with virtual reality at CES 2016 too, thanks to SensoMotoric Instruments and their impressive demonstration of both gaze-based input and an implementation of foveated rendering, all on a neatly hacked-up Oculus Rift DK2. Eye tracking, the main USP of the recently released FOVE headset, seems one of the most likely additional technologies to make its way into future generations of consumer virtual reality, given its obvious experiential and performance benefits.

Finally, Oculus brought their tracked motion controllers Touch to CES for the first time and we caught a glimpse of the latest design iteration which would prove to be near identical to the retail editions once they arrived. We’d have to wait almost a full year however to get our hands on the consumer version.

So that's it: a massively condensed history of VR at CES over the years. CES is by nature a hardware focused event, so the huge leaps and bounds made by developers and industry leaders in VR content are beyond the scope of this piece. But in truth, with all major VR platforms now in the hands of consumers, and with Oculus, Sony and Valve/HTC concentrating heavily on content production for their new babies, the focus for CES this year will likely be less about hardware revisions and more about a glimpse of technologies that may form part of the next generation of yet-to-be-announced VR hardware. We'll see the strides made by companies in the field of wireless VR, hopefully progress in the eye-tracking arena, and perhaps a handful of VR-centric input devices. That said, the joy of CES is that you never can tell what might happen. In either case, Road to VR will be there to find out, as it has been since the beginning.


Road to VR will of course be at CES 2017, and if you have something VR related you’d like to show or talk to us about, drop us an email at tips@roadtovr.com.


Hands-on: Massless Wants to Bring High-precision Stylus Input to VR


Massless is developing a stylus designed specifically for high-precision VR input. We got to check out a prototype version of the device this week at GDC 2018.

While game-oriented VR controllers are the norm as far as VR input is concerned today, Massless hopes to bring another option to the market for use cases which benefit from greater precision, like CAD. Controllers like Oculus' Touch and HTC's Vive wands are quite precise, but they are articulated primarily by our wrists, and miss out on the fine-grained control that comes from our fingers—when you write on a piece of paper, notice how much more your fingers are in control of the movements than your wrist. This precision is amplified by the fact that the tabletop surface acts as an anchor for your finger movements. Massless has created a tracked stylus with the goal of bringing the precision of writing implements into virtual reality, with a focus on enterprise use-cases.

Image courtesy Massless

At GDC I saw a working, 3D printed prototype of the Massless Pen working in conjunction with the Oculus Rift headset. The system uses a separate camera, aligned with the Rift’s sensor, for tracking the tip of the stylus. With the stylus held in my left hand, and a Touch controller in my right, a simple demo application placed me into an empty room where I could see the tip of the pen moving around in front of me. I could draw in the air by holding a button on the Touch controller and waving the stylus through the air. I could also use the controller’s stick to adjust the size of the stroke.

Photo by Road to VR

Using the Massless Pen felt a lot like drawing in the air with an app like Tilt Brush, but I was also able to write tiny letters quite easily; without a specific task comparison, or objective means of measurement between controller and stylus though, it’s tough to assess the precision of the pen by just playing with it, other than to say that it feels at least as precise as Touch and Vive controllers.

SEE ALSO
Oculus Research Devises High-accuracy Low-cost Stylus for Writing & Drawing in VR

Since the ‘action’ of writing in real life is initiated ‘automatically’ when your writing implement touches the writing medium, it felt a little awkward to have to press a button (especially on my other hand) in order to initiate strokes. Of course, the Massless Pen itself could have a button on it (so at least it might feel a little more natural since the stroke initiation would happen in the same hand as the writing action), but the company says they’ve steered away from that because the action of pressing a button on the pen itself would cause it to move slightly, working against the precision they are attempting to maintain.

Photo by Road to VR

If you've ever used one of the million trigger-activated laser-pointer interfaces in VR, you'll know this is actually a fair point: pointing with a laser and then using the controller's trigger to initiate an action causes the laser to move significantly (especially as it's amplified by leverage). It felt weird using my other hand to initiate strokes at first, but I feel fairly confident this would begin to feel natural over time, especially considering that many professional digital artists use drawing tablets where they draw on one surface (the tablet) and see the result appear on another (the monitor).
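
The leverage effect is easy to put numbers on: the endpoint shift grows linearly with distance to the target, so even a tiny trigger-induced tilt lands well off the mark. A quick back-of-the-envelope calculation:

```python
import math

# How far a laser pointer's endpoint shifts when a trigger pull tilts
# the controller by just one degree, at a few target distances.
tilt = math.radians(1.0)
for r in (1.0, 3.0, 5.0):  # metres from controller to UI panel
    print(f"at {r:.0f} m: ~{100 * r * math.tan(tilt):.1f} cm of endpoint shift")
# at 1 m: ~1.7 cm; at 3 m: ~5.2 cm; at 5 m: ~8.7 cm
```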

Inside the demo I could see the white outline of a frustum projected from a virtual representation of the Rift sensor in front of me. The outline was a visual representation of the trackable area of the Massless Pen's own sensor, and it was relatively narrow compared to the Rift's own tracking. If I moved the stylus outside the edge of the outline, it would stop tracking until I brought it back into view. As Massless continues to refine the product, I hope the company prioritizes growing the trackable area to be more comparable to that of the headset and controller it's used with.

While the Massless Pen prototype I used has full positional tracking, it lacks rotational tracking at the moment, meaning it can only create strokes from a singular point, and can’t yet support strokes that would benefit from tilt information, though the company plans to support rotation eventually.

Photo by Road to VR

More so than drawing in the air, I'm interested in VR stylus input because of what it could mean for text input handwritten on an actual surface (rather than arbitrary strokes in the air); history bred the stylus for this use-case, and it could become a key tool for productivity in VR. Drawing broad strokes in the air is nice, but writing benefits greatly from using the writing surface as an anchor for your hand, allowing your dexterous fingers to do the precision work; for anything but coarse annotations, if you're planning to write in VR, it should be done against a real surface.

To see what that might be like with the Massless Pen, I tried my hand at writing ‘on’ the surface of the table I was sitting at. After sketching a few lines (as if trying to color in a shape) I leaned down to see how consistently the lines aligned with the flat surface of the table. I was surprised at the flatness of the overall sketched area (which suggests fairly precise, well calibrated tracking), but did note that the shape of the individual lines showed regular bits of tiny jumpiness (suggesting local jitter). Granted, this is to be expected—Massless says they haven’t yet added ‘surface sensing’ to the pen (though they plan to), which could reasonably be used to eliminate jitter during real surface writing entirely, since they could have a binary understanding of whether or not the pen is touching a real surface, and use that information to ‘lock’ the stroke to one plane.
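
Such a lock could be as simple as projecting each sample onto the detected surface plane whenever the touch signal is active. A sketch of that idea under my own assumptions, not Massless' implementation:

```python
def lock_to_surface(point, plane_point, plane_normal):
    """Project a pen sample onto the writing surface.

    plane_normal must be unit length. Snapping samples to the plane while
    the pen reports surface contact removes jitter along the normal, so
    strokes stay flat on the page.
    """
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))
    return tuple(p - d * n for p, n in zip(point, plane_normal))

# Example: a sample hovering 2 mm above a horizontal tabletop at y = 0.75 m.
print(lock_to_surface((0.1, 0.752, 0.3), (0.0, 0.75, 0.0), (0.0, 1.0, 0.0)))
# -> (0.1, 0.75, 0.3)
```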

The Massless Pen is interesting for in-air input, but since the stylus was born for writing on real surfaces, I hope the company increases its focus in that area, and allows 3D drawing and data manipulation to evolve as a natural, secondary extension of handwritten VR input.


Valve Psychologist to Explore Brain-Computer Interface Research at GDC


At GDC 2019 later this month, Valve’s Principal Experimental Psychologist, Mike Ambinder will present the latest research pertaining to brain-computer interfaces—using signals from the brain as computer input. Ambinder says that BCI is still “speculative technology,” but could play an important role in the way players interact with the games of the future.

As time moves forward, the means by which users interact with computers have become increasingly natural. First was the punch card, then the command line, then the mouse… and now we've got touchscreens, voice assistants, and VR/AR headsets which read the precise position of our head and hands for natural interactions with the virtual world.

More natural computer interfaces make it easier for us to communicate our intent to a computer, making computers more accessible and useful with less time spent learning the abstract input systems.

Perhaps the final frontier of computer input is the brain-computer interface (BCI). Like the virtual reality system envisioned in The Matrix (1999), the ultimate form of BCI would be some sort of direct neural input/output interface where the brain can directly ‘talk’ to a computer and the computer can directly ‘talk’ back, with no abstract I/O needed.

While we're far, far away from anything like direct brain I/O, some headway has been made in recent years, at least on the input side—'brain reading', if you will. And while it's early, there's exciting potential for the technology to transform the way we interact with computers, and how computers interact with (and react to) us.

At GDC 2019 later this month in San Francisco, Valve’s Principal Experimental Psychologist, Mike Ambinder, will present an overview of recent BCI research with an eye toward its applicability to gaming. The session, titled Brain-Computer Interfaces: One Possible Future for How We Play, will take place on Friday, March 22nd. The official description reads:

While a speculative technology at the present time, advances in Brain-Computer Interface (BCI) research are beginning to shed light on how players may interact with games in the future. While current interaction patterns are restricted to interpretations of mouse, keyboard, gamepad, and gestural controls, future generations of interfaces may include the ability to interpret neurological signals in ways that promise quicker and more sensitive actions, much wider arrays of possible inputs, real-time adaptation of game state to a player’s internal state, and qualitatively different kinds of gameplay experiences. This talk covers both the near-term and long-term outlook of BCI research for the game industry but with an emphasis on how technologies stemming from this research can benefit developers in the present day.

Ambinder holds a B.A. in Computer Science and Psychology from Yale, and a PhD in Psychology from the University of Illinois; according to his LinkedIn profile, he’s been working at Valve for nearly 11 years.

The session details say that the presentation’s goal is to equip developers with an “understanding of the pros and cons of various lines of BCI research as well as an appreciation of the potential ways this work could change the way players interact with games in the future.”

SEE ALSO
Facebook is Researching Brain-Computer Interfaces, "Just the Kind of Interface AR Needs"

While the description of the upcoming GDC presentation doesn’t specifically mention AR/VR, the implications of combining BCI and AR/VR are clear: by better understanding the user, the virtual world can be made even more immersive. Like eye-tracking technology, BCI signals could be used, to some extent, to read the state and intent of the user, and use that information as useful input for an application or game. Considering Valve’s work in the VR space, we’d be surprised if Ambinder doesn’t touch on the intersection of VR and BCI during the presentation.

Road to VR will be at GDC later this month to bring you the most important news. Showing something awesome in AR or VR? Get in touch!






