SVVR Con: User Input and Locomotion in VR – LiveStream @ 3pm PDT

Our SVVR Conference and Expo 2014 coverage continues with a LiveStream of “User Input and Locomotion in VR”, a panel discussion exploring the challenges and possible solutions in allowing users of virtual reality to interact with these new and exciting spaces.

The Panel consists of:

Richard Marks, Senior Director, Sony Magic Lab
Jan Goetgeluk, CEO, Virtuix
Nathan Burba, CEO, Survios
David Holz, CTO and Co-founder, Leap Motion
Danny Woodall, Creative Director, Sixense

Moderator: Jason Jerald, Founder, NextGen Interactions

Brian Hart will be LiveStreaming the event which kicks off at 3pm PDT. The video should appear below.



Oculus on Unconfirmed VR Input Device: “You’ll Probably Hear More About It [at Connect]…”

While there was lots of great info to come out of our Gamescom 2014 interview with Oculus VR, perhaps the most telling bit was the way Oculus founder Palmer Luckey answered a question about an unconfirmed VR input device that the company is reportedly working on.

The exchange itself, at 0:32 in the video, doesn't read as exciting in transcript alone. The telling part came from Nate Mitchell (right), Oculus VR's VP of Product. His reaction (below), which appeared to be along the lines of “I didn't know we were saying that yet,” spoke volumes more than the words alone.

Paul James: "How likely is it we'll hear more about your unconfirmed work with VR input devices at Oculus Connect?"

Palmer Luckey: “You’ll probably hear more about it, we’re going to be talking about everything we’re doing at Connect.”

For some time now Oculus VR has said that they've been researching VR input devices, but whether or not they'd make their own has been a long-dodged question. The company has only ever shown their VR headsets, the Rift DK1 and DK2, played with a keyboard and mouse or a wireless Xbox 360 controller. Once Sony announced Morpheus with a heavy emphasis on the PlayStation Move controller as a natural input device, the pressure began to build for how Oculus would respond.

And for good reason. Using a natural input controller is far more immersive than an abstract controller like a keyboard and mouse. Even though many of us are right at home with such devices, they aren't actually intuitive; interacting inside of a 3D virtual world will always be easier and more natural when it can be done with an input method that is tracked with 6 degrees of freedom (6DOF). In my experience with the Oculus Rift, many of the most immersive experiences make use of the Razer Hydra controller to put your hands in VR. Many of my favorites—from the original Sixense Tuscany VR demo to the HL2VR mod with Hydra support to Crashland—use it very effectively.

I’ve had the opportunity to see many people experience Sony’s Morpheus for the first time in the Castle demo where they use the Move controller to attack an armored mannequin with fists, swords, and crossbows. Every time—experienced gamers or not—they jump right in, immediately throwing a flurry of punches, often with a big smile on their face. There’s no learning to be done, no ‘pull the trigger to hit,’ they jump right into it because they can rely on their natural human motion for input.

See Also: Killing Zombies with the Survios Prime 3 Prototype is a Blast

Oculus is well aware of the need for a proper natural input solution; they're also well aware of the existing solutions out there—STEM, PrioVR, and Control VR, to name a few. At the SVVR 2014 Conference & Expo, Luckey sat in on the Rev VR 'Ubercast' episode and answered some questions on the input topic:

The CyberGrasp system from CyberGlove offers per-finger force feedback.

At Oculus we try not to do things unless we can do them the best and I do think that 6 DOF input is very critical to virtual reality but I think that inputs need to be more than just input devices, they need to be input and output devices and this is true of the headset. The Rift wouldn't be useful if it was just an output device, it wouldn't be useful as only an input device, it's because they work together that you're able to have this incredible system. I think the same goes for input… having something that allows you to reach into a virtual world is pretty cool but you need to be able to… at least to a certain degree, feel things in that virtual world. Shaking hands is not feeling things. Shaking hands is not next-gen… I think that there's a long way to go for VR input before it's able to hold up to that level of quality where people try it and they say 'this is blowing my mind, it's like I'm actually reaching out into a virtual world', not just 'I'm moving a ghostly apparition of a hand'… we've done a lot of R&D around virtual reality input and our conclusion has been that it's very very hard to do, it's not something you can just say 'have a wand moving through space, call it a day'.

For pure input, systems like STEM appear to be well on their way to providing a great low latency option for virtual reality. But Oculus seems quite concerned about haptics—the ability to feel the virtual world. It sounds like they want something more than rumble.

See Also: Three New STEM VR Controller Videos Show Incredible Lightsaber and Gun Interactions

The problem with haptics is that existing solutions are mechanically complex, often contributing bulk, increased maintenance requirements, and cost. A high-end system like the CyberGlove CyberForce might do everything you'd ever dream of for haptic VR input, but its size and cost aren't viable for the consumer market.

Tactical Haptics was an early player in the consumer VR space that sought to bring an innovative form of haptic feedback to VR controllers, but hasn’t been able to catch on thus far. Team GauntL33t is working on a per-finger haptic feedback device, but there’s clearly a lot of refinement that needs to be done.

What will Oculus consider the minimum viable 'haptic-ness' for consumer VR? Rumble? Finger resistance? Hand resistance? Palm forces? The key, with any consumer product, is to balance the experience with practicality and cost. Whatever the company ends up doing, it sounds quite certain that we'll be hearing about it next month at Oculus' Connect developer conference.


‘SightLine’ Dev Creates Incredible ‘Holographic’ User Interface Powered by Oculus Rift and Leap Motion

Tomáš “Frooxius” Mariančík, creator of the stunning VR experience SightLine: The Chair, is back! This time he's demoing an educational prototype that aims to provide an intuitive virtual reality user interface, allowing you to reach into VR and control the environment with your hands.

Reaching into Virtual Spaces

One of the scenes from Sightline: The Chair

Not content with creating one of the best virtual reality concepts at last year's VR Jam event with Sightline, then blowing mine and everyone else's socks off with his follow-up tech demo SightLine: The Chair, developer Tomáš “Frooxius” Mariančík has come up with what looks to be one of the most intuitive and responsive VR user experiences I've yet seen.

See Also: Why ‘Sightline: The Chair’ on the DK2 is My New VR Reference Demo

His new project, which is still a prototype, uses the combination of an Oculus Rift DK2 and its positional head tracking fused with a Rift-mounted Leap Motion controller. The Leap detects your real life hands and fingers and allows Tomáš to translate any gestures or movements into equivalent actions in the virtual world. And in order for your virtual hands to have something to do in virtual space, he’s also conceptualised and built a series of menus and gesture commands that allow the user to navigate and control the world around them.

The current prototype uses a human skeleton, complete with organs, to show how natural interaction with our hands can be used to manipulate and inspect objects in 3D space. The Oculus Rift DK2 brings its positional tracking to the party, allowing the user to grab, hold, and then glance around the virtual object in a natural way. The demo video above shows an extraordinary amount of precision and grace in play here, something that was difficult to achieve with earlier versions of the Leap SDK. But with Leap Motion's skeletal tracking advancements, recently made available to developers in the V2 Beta of the SDK, it's clear some incredible things can now be achieved.
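
Grab-and-hold interactions like these are commonly built atop skeletal tracking data with something as simple as a pinch test with hysteresis. Here's a minimal sketch of the general idea in Python (our own illustration, not code from Mariančík or the Leap SDK):

```python
# Generic pinch-grab detection over skeletal hand-tracking data. Two
# thresholds (hysteresis) stop the grab from flickering when the thumb-to-
# fingertip distance hovers near a single cutoff.

import math

PINCH_START = 0.025  # metres: closer than this begins a grab
PINCH_END = 0.045    # metres: further than this releases it

class GrabDetector:
    def __init__(self):
        self.grabbing = False

    def update(self, thumb_tip, index_tip) -> bool:
        d = math.dist(thumb_tip, index_tip)  # Euclidean distance, 3D points
        if not self.grabbing and d < PINCH_START:
            self.grabbing = True
        elif self.grabbing and d > PINCH_END:
            self.grabbing = False
        return self.grabbing

detector = GrabDetector()
print(detector.update((0.0, 0.0, 0.0), (0.1, 0.0, 0.0)))   # False: hand open
print(detector.update((0.0, 0.0, 0.0), (0.02, 0.0, 0.0)))  # True: pinched
print(detector.update((0.0, 0.0, 0.0), (0.03, 0.0, 0.0)))  # True: still held
```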

See Also: Leap Motion’s Next-gen ‘Dragonfly’ Sensor is Designed for VR Headsets

The Leap Motion sensor attached using the dedicated mount

The project looks impressive on multiple counts, but it’s the precision of control that’s enabled here that blew me away. Tomáš (featured in the video) taps delicately at menu sliders and scrolls through a test textbox deftly and with apparent ease. I also particularly liked the gesture of a clenched fist, which allows the user to drag the world’s position around them (or vice versa depending on how you look at it).

Mount Your Leap

Leap Motion used in conjunction with the DK2's positional tracking seems to be a potent mix, and the company has done everything it can to evangelise this. They now offer a special mount for the DK2 which allows you to slot your Leap Motion sensor onto the front of your headset (cleverly missing most of the IR LEDs covering the DK2) and presto, you have a sensor capable of spotting your wavy arms within its claimed 135 degree horizontal FOV. It certainly makes a compelling case as an answer for naturalistic VR input, something that even Oculus hasn't yet publicly addressed.

VR input and ways to allow humans to interface with these new digital worlds lags behind rapidly advancing VR headset solutions. Tomáš’ prototype gives us a glimpse at how we could be interacting with our digital worlds very soon indeed. What’s more, this seems to mark a new lease of life for the Leap Motion device, which had appeared to be searching for a fitting application.

We'll be digging deeper into this project with some hands-on impressions and thoughts from Tomáš himself. In the meantime, you can find more on SightLine here and on the project's Facebook page.


‘Dexmo’ The VR Exoskeleton Glove with Force Feedback Launches Kickstarter Campaign

Dexta Robotics today launched a Kickstarter campaign for their VR input device Dexmo, a mechanical exoskeleton glove designed for VR input which also offers users force feedback.

VR Input and Output in One Device

The Dexmo ‘Classic’ Exoskeleton

With the debut of Oculus' latest feature prototype, dubbed 'Crescent Bay', at the recent Oculus Connect conference, the general feeling is that the VR company is close to nailing the hard problems of displaying virtual worlds and fusing low latency head tracking to deliver an experience that can induce presence in its user. The big question at that event was actually the ever-growing issue of how you interact with these convincingly realised virtual worlds. And beyond VR input devices, how do you then give a person in VR tactile information about the world they're in?

Dexta Robotics think they have part of the answer. It's called the Dexmo, and it's an intriguing device for VR enthusiasts. The company has announced the launch of the product's Kickstarter campaign, which will give enthusiasts early access to a device that not so long ago would have been considered a work of science fiction.

The Dexmo ‘F2’ Force Feedback Exoskeleton

Dexta Robotics, a Chinese startup born in March of this year, was founded to bring this product to life. After iterating through some 17 prototypes, the company have decided to release the Dexmo in two versions via a new Kickstarter campaign with a goal of $200k.

Dexmo Kickstarter

The Dexmo comes in two flavours: the Dexmo Classic and the F2. The Classic is purely an input device, using IMUs attached to the articulated links on each finger and thumb to detect the movement of each digit. It can detect both the splitting (when you separate your fingers) and bending of your fingers, plus 3 degrees of rotational movement of the thumb. The F2 adds digital force feedback for both your index finger and thumb, meaning that onboard servos 'brake' your movement according to information pulled from a VR experience. The upshot is that you can virtually 'feel' an object when your hand comes into contact with it.

Neither version of the device offers any form of spatial or rotational tracking; that is, it has no idea where your hand is in 3D space. The team instead plan to encourage the use of third-party tracking systems such as the Sixense STEM to provide these extra degrees of input. It's important to note that not only is force feedback limited to your thumb and index finger, but there is also no analogue feedback supplied: the finger 'braking' is purely binary, on or off. This does limit the glove's uses for interactive experiences where nuanced feedback would be required. It's still a pretty impressive feat of engineering though, and a step in the right direction.
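
To make the binary nature of that feedback concrete, here's a minimal sketch of how such on/off braking logic might be driven from collision state. The servo interface is hypothetical; this illustrates the concept, not Dexta's actual SDK:

```python
# Hypothetical illustration of binary force feedback: the servo brake on each
# digit is either fully engaged or fully released, with no in-between force.

class DigitBrake:
    """One force-feedback channel (e.g. index finger or thumb)."""

    def __init__(self, name: str):
        self.name = name
        self.engaged = False

    def set_engaged(self, engaged: bool) -> None:
        # On real hardware this would command the servo; here we just log it.
        if engaged != self.engaged:
            self.engaged = engaged
            print(f"{self.name} brake {'ON' if engaged else 'OFF'}")

def update_feedback(brakes, touching):
    """Engage a digit's brake whenever the VR app reports contact.

    `touching` maps digit name -> bool, e.g. from a physics engine's
    collision queries. Note the absence of any force magnitude: contact with
    a pillow and contact with a brick wall would feel identical.
    """
    for digit, brake in brakes.items():
        brake.set_engaged(touching.get(digit, False))

brakes = {"index": DigitBrake("index"), "thumb": DigitBrake("thumb")}
update_feedback(brakes, {"index": True, "thumb": False})   # grip begins
update_feedback(brakes, {"index": False, "thumb": False})  # object released
```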

What do I Get For My Money?

The entry-level pledge which gets you an actual device starts at $75 for a single Classic Dexmo exoskeleton, with a pair requiring a pledge of $150. A single F2 with force feedback costs $179, or $319 for a pair. All hardware pledges also come with access to the company's SDK and Arduino library; Arduino is an open source electronics prototyping platform commonly used to interface with robotic devices such as the Dexmo.

It's all incredibly interesting, and another example of a Far Eastern company managing to spin up designs and prototypes for a futuristic device in an amazingly short period. It's also an important step toward providing VR users with the sensory feedback needed to seal the sense of presence when interacting with virtual spaces. We're very much looking forward to seeing what comes of the Dexmo and wish the team the best of luck with their campaign.

You can back the Dexmo Kickstarter right here and you can find more information on Dexta at their website here.


Nimble Sense Kickstarter Aims to Bring Time-of-Flight Depth-sensing Tech to VR on the Cheap

Today Nimble VR launches a Kickstarter campaign for Nimble Sense, a natural input controller that the company says was designed specifically for virtual reality. And while Nimble Sense doesn't at first appear much different from Leap Motion, the company says they're using 'time-of-flight' depth sensing technology, like that used in the Kinect 2, which they say has unique benefits.

Nimble Sense Kickstarter

Nimble Sense uses an infrared laser to create a pulse of invisible light which illuminates the environment surrounding the user. An IR camera with a 110 degree field of view senses the light as it bounces back and uses the known speed of light to work out the distance to discrete points in the scene. This allows the camera to capture a 3D model of the environment.
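
The arithmetic behind this is straightforward: the round-trip time of each pulse, multiplied by the speed of light and halved, gives the distance to a point. A toy illustration (not Nimble VR's code; real sensors typically measure phase shift rather than timing pulses directly):

```python
# Toy time-of-flight calculation: distance from the round-trip time of a
# light pulse. The pulse travels out and back, so the round-trip distance
# is halved to get the range.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A hand 40 cm from the sensor returns the pulse in ~2.67 nanoseconds:
print(f"{tof_distance(2.67e-9) * 100:.1f} cm")  # ≈ 40.0 cm
```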

“Our camera captures a dense 3D point cloud every 20 milliseconds that can be used to bring not only hands, but arms, legs, and even your desk into VR,” Rob Wang, co-founder of Nimble VR, told me. “The 3D point cloud of the real world is rendered exactly at the right scale and location in VR—regardless of your IPD. The point cloud can even be shared and visualized by other users for multiplayer / social VR experiences.”

Wang notes that a stereo camera approach is necessarily locked to one IPD when it comes to viewing the outside environment.

“Competing cameras that use stereo imaging may only provide two infrared pictures, rather than dense 3D geometry. It’s harder to visualize a stereo image pair from a different user’s perspective, and stereo also assumes a specific IPD based on the spacing of the cameras.”

Time of Flight at the Right Price

Nimble VR says that they’ve “achieved a breakthrough in the accuracy, cost, and power consumption” of time-of-flight sensors, and they’re aiming to bring the tech to the world of VR at an affordable price point. For the first 500 backers, Nimble Sense starts at $99, which includes the camera and the DK2 mount.

The Nimble Sense Kickstarter campaign has a funding goal of $62,500 and the company expects to ship their product to backers in June of 2015. Nimble VR of course will also make available their SDK to allow developers to integrate the camera into their games and applications. The company is also planning to release four open source Oculus Rift demos to give developers a jump start on working with Nimble Sense.

VR Ready with DK1 and DK2 Mounts


Nimble Sense will ship with a mount for the Oculus Rift DK2 which smartly takes the place of the unit’s cable cover, meaning there’s no permanent modification needed to mount the sensor. The mount appears to cover two of the Rift’s IR LEDs, which are used for positional tracking, but like the Leap Motion VR Mount, it may not significantly detract from tracking performance.

Nimble VR also have a DK1 mount and desk mount available depending upon the Kickstarter backer level.

Nimble VR say they’ve been hard at work developing their skeletal hand-tracking technology to bring a user’s hands into the virtual environment. The unit has been designed to work best with objects that are 10cm to 70cm in front of the user, which the company says is the natural zone of interaction for the hands. They say that their hand-tracking tech is “the best in the world.”

Blair Renaud of Iris VR, developer of Technolust, sounds impressed by Nimble Sense. He’s quoted on the Kickstarter page as saying, “I’ve tried just about every VR control scheme out there, and I have to say that the Nimble Sense is by far the most promising I’ve tried so far.”

Nimble VR reached out early to a few VR developers, including Renaud, to demonstrate that the system can be integrated easily into existing projects.

The company doesn't mention latency figures, but from a brief video on the Kickstarter page it appears passable on cursory inspection.

Want to share your impression of Nimble Sense? Drop us a line in the comments!

Full Disclosure: Nimble VR is running an ad on Road to VR.


Inside Look at 3DRudder, Feet-Controlled Navigation Device Headed to CES 2015

A new PC peripheral company based in France is launching a feet-controlled 3D navigation and motion control device that wants to get the lower half of your body in the game, freeing up your hands for the more important things in virtual life, like hand tracking and eating nachos without getting the keyboard dirty (not necessarily in that order).

3DRudder will launch its Indiegogo campaign on December 9th, and will also be demoing the device at the upcoming CES 2015 in Las Vegas this January, a showing meant to demonstrate its capability to serve not only the unique purposes of VR enthusiasts, but also traditional PC gamers and design professionals. And although it's intended to be used while seated, the fairly compact USB device introduces a novel way of navigating 3D environments that could help bridge the divide between passive gaming and the ultimate ideal of full-body tracking in VR.

The 3DRudder is meant to entirely replace the mouse/joystick/WASD paradigm, and utilizes its flat top and rounded 'weeble wobble' base so the user can shift their weight in any direction, allowing for 3D navigation through a virtual environment. Walking is as simple as lightly tilting the weight of both feet in the intended direction of travel: forward for forward, right for right, and so on; to run, you just shift your weight a little more. Turning (yawing) is equally simple, requiring the user to pivot the device, which we hope doesn't lead to any uncomfortable positions considering many of us sit in wheely office chairs that could negatively impact its usability.

The device also boasts several other degrees of freedom, including up/down elevation and pitch (nosing up or down), which is probably where the 'rudder' part of the name comes in.
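
In software terms, the mapping described above amounts to reading the board's tilt and yaw and converting them into a locomotion vector. A rough sketch of that idea follows; the dead zone, gain, and axis conventions are illustrative guesses, not values from 3DRudder's SDK:

```python
import math

# Rough sketch of tilt-to-locomotion mapping. All constants are invented
# for illustration; a real driver would expose calibrated values.

DEAD_ZONE = 0.05   # ignore tiny weight shifts so standing still is stable
SPEED_GAIN = 4.0   # scales shaped tilt into metres per second

def locomotion_from_tilt(pitch: float, roll: float, yaw: float):
    """Convert board tilt (normalised -1..1) into a movement vector.

    pitch: lean forward/back -> walk forward/back
    roll:  lean left/right   -> strafe
    yaw:   pivot the board   -> turn
    Leaning further increases speed (walk -> run).
    """
    def shaped(axis: float) -> float:
        if abs(axis) < DEAD_ZONE:
            return 0.0
        # Quadratic response: gentle near the centre, fast at the extremes.
        return math.copysign((abs(axis) - DEAD_ZONE) ** 2, axis) * SPEED_GAIN

    return shaped(pitch), shaped(roll), yaw  # (forward, strafe, turn rate)

print(locomotion_from_tilt(pitch=0.5, roll=0.0, yaw=0.1))
```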

The device is actually made up of two independent pedal halves that give it a helicopter-like control mechanism, and after reading through a few pages of technical data, let's just say we're hoping we don't have to head to flight school to figure it out. That said, the multitude of input possibilities could make this a highly flexible device for anything from walking in an FPS to controlling the ailerons in a flight sim.

The Specs

The device houses an internal sensor array of accelerometers, gyros, a magnetometer, and a collection of embedded force sensors that, according to 3DRudder CEO Stanislas Chesnais, will make the 3DRudder "extremely reactive with very low latency between the user's movements and corresponding software reaction."

The motion control device will also be shipping with its own SDK for the benefit of any potential developers.

Power is handled by a single USB cable, which eliminates the need to fiddle about with batteries and Bluetooth dongles. Unfortunately this also rules the device out for any future iteration of mobile VR like Samsung's soon-to-launch Gear VR, making the 3DRudder exclusive to the 'tethered' experience. Of course, we're not saying we want to plop down a foot-based controller on the subway and blast some baddies in Anshar Wars, but it would still be nice to see some cross-platform integration.

Although we haven’t had a chance to test the device and verify its quality, we’ll be keeping our eye out for 3DRudder at CES 2015 to see how it fares under the stinky feet of 160,000 attendees.


E3 2015 Will Mark Three Years of Incredible VR Progress

Arguably the games industry’s most important annual event, the Electronic Entertainment Expo 2015 begins in LA next week. As the final E3 before the retail release of three major virtual reality headsets from Oculus, Sony, and Valve/HTC, we take the opportunity to look back at the last three years of consumer VR.

The Los Angeles Convention Center, Home of E3 2015

E3 2015 is here again and begins officially on the 16th of June at the Los Angeles Convention Center. As ever, Road to VR will be there to bring you the latest news direct from the show. Do drop us an email at info@roadtovr.com if you have something VR related you’d like to share with us at the show and we’ll try to line up a meeting.

This year’s event marks something of a milestone in virtual reality’s recent explosive renaissance—specifically the final E3 before the arrival of the first major consumer virtual reality systems in this new era.

So, before we’re swept away by the inevitable tsunami of E3 (and pre-E3) revelations, here’s a brief up-to-date summary of this most recent virtual reality renaissance period, using E3 itself as a convenient way-point.

E3 2012

John Carmack, co-founder of id Software, co-creator of Doom, and all-round programming demi-god, turns up at E3 with a gaffer-taped ski mask and a hacked-up version of Doom 3 and promptly steals the show, all the while proclaiming that virtual reality is back! The ski mask was of course an early version of the Oculus Rift, sent to Carmack by one Palmer Luckey before the headset ever took to Kickstarter. The showing catapulted VR back into the gaming industry's consciousness whilst awakening the VR dreams of a generation of industry professionals and enthusiasts all at once.

John Carmack at E3 2012, now Oculus VR CTO

For the first time since the 90s, the gaming and tech media begin talking and writing about VR in positive, non-derisory terms, and at QuakeCon that same year we see one of the first discussions of VR's future. Featuring Oculus' Palmer Luckey, Valve's Michael Abrash and of course John Carmack, the event foreshadowed how important these three figures would become in later years, with Carmack and Abrash eventually leaving id Software and Valve respectively to join Oculus.

Palmer Luckey (left), Michael Abrash (center), John Carmack (right)

E3 2013

An early pre-production Oculus Rift DK1 Prototype

Oculus as a company has now been formed on the back of a wildly successful Kickstarter campaign to bring the so-called Rift DK1—the first widely available developer kit for a consumer virtual reality headset—to life. By now the DK1, despite production delays, is already in the hands of Kickstarter backers across the world. The VR community is swelling with an influx of enthusiasts feeding from an explosion of early developer tech demos for the device. Oculus don't rest on their laurels however; during E3 2013 the company reveal their latest VR headset, simply called the 'HD Prototype': the guts of a DK1 with a new HD LCD panel, upgrading the DK1's resolution from 800p to 1080p.

A set of Oculus Rift HD Prototypes

Demos of the new headset included the latest Unreal Engine previews from E3 2013, and the combination of increased display fidelity and cutting-edge rendering techniques left quite an impression. At the same time, the first glimpse of triple-A software using VR arrived in the form of EVE Online developer CCP Games' EVE-VR, a space combat shooter which used head tracking as a core gameplay mechanic.



Ximmerse Adds Motion Tracking to PC and Mobile VR Using Stereo Camera Tech

Ximmerse, a tech company based in China, are working on a suite of optically tracked input systems based around a proprietary stereo camera. The system is claimed to work with both PC and mobile based virtual reality.

The three big players in the PC virtual reality market have now demonstrated their solutions for naturalistic motion input in virtual reality applications. Valve has Lighthouse, Oculus has Touch, and Sony has Move.

Mobile virtual reality, however, is sorely lacking promising motion input technologies, with Gear VR's peripherals currently limited to wireless gamepads.

Ximmerse’s stereo camera with a claimed 160 degree FOV

Ximmerse claim to have developed a cross-platform optical tracking system based around a high-FOV stereo camera. The solution comprises a suite of controller peripherals, all with embedded IMUs and glowing orbs. The latter recalls Sony's PlayStation Move, which adopts a similar method to provide 6DoF (six degrees of freedom) tracking via the PS Eye camera.

The X-Cobra input controllers from Ximmerse

Ximmerse claim that, with their system, they're able to track up to 240 individual orbs, or 'blobs' as the team calls them, with no interference, all from a single stereo camera. An impressive claim.
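
Tracking a glowing orb with a calibrated stereo pair boils down to triangulation: the same blob is located in both images, and its horizontal disparity yields depth. A simplified sketch for a rectified camera pair, with invented calibration numbers rather than Ximmerse's:

```python
# Simplified stereo triangulation for a tracked 'blob'. With a rectified
# stereo pair, depth follows from disparity: z = f * baseline / disparity.
# Calibration values below are invented for illustration.

FOCAL_PX = 700.0        # focal length in pixels
BASELINE_M = 0.06       # distance between the two cameras, metres
CX, CY = 320.0, 240.0   # principal point of a 640x480 image

def triangulate(left_px, right_px):
    """Recover the 3D position of a blob seen in both images.

    left_px / right_px: (x, y) pixel coordinates of the blob's centre in the
    left and right images of a rectified stereo pair.
    """
    disparity = left_px[0] - right_px[0]
    if disparity <= 0:
        raise ValueError("blob must appear further left in the right image")
    z = FOCAL_PX * BASELINE_M / disparity
    x = (left_px[0] - CX) * z / FOCAL_PX
    y = (left_px[1] - CY) * z / FOCAL_PX
    return (x, y, z)

# A blob with 60 px of disparity sits 0.7 m from the camera:
print(triangulate((400.0, 240.0), (340.0, 240.0)))
```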

The company have developed a suite of controllers based around that stereo camera. The X-Cobra, a pair of handheld motion controllers similar in design to the PlayStation Move or Razer Hydra systems, uses optically tracked 'blobs' for 6DoF positional tracking as well as IMUs for rotational movement. The controllers are wireless and offer a neat way to bring your hands into VR, although it's hard to judge just how effective they are from the video the team have released demonstrating the tech.

Additionally, Ximmerse have a wireless, body motion tracking IMU. Demonstrated in the above video, the team suggest it could be used as an alternative input device for mobile VR platforms, with the user moving their body to influence the experience.

The company also have a haptic glove, although details on the mechanics of its tracking solution are a little unclear. Ximmerse's website seems to suggest the glove utilises IMUs from Perception Neuron, yet also claims the glove is tracked via Ximmerse's stereo camera system.

Ximmerse’s ‘standalone’ IMU, used for body motion capture

All of these devices, it's suggested, can be used with mobile virtual reality platforms such as Gear VR, though details on how this works are thin on the ground right now.

Ximmerse will be making the trip to Los Angeles this week to exhibit at SIGGRAPH 2015 at the LA Convention Center. If you're attending the show, you can find the company's booth at PD11. The team will also be attending VRLA later in the month.



Hands On: ‘The Assembly’ Offers New ‘Comfort Mode’, a Multi-Pronged Attack on Sim-Sickness

nDreams, a UK based games studio producing upcoming VR title The Assembly, are declaring war on the discomfort commonly associated with first-person VR games. We sat down with Senior Designer Jackie Tetley and Communications Manager George Kelion at Gamescom 2015 to hear more about the game’s new ‘VR comfort mode’.

Without a HUD, or a cockpit to help obscure some of the vection-inducing motion that can truly turn the stomachs of the best of us, first-person gameplay in VR is still pretty dicey for many. Traditional controls, although right for some who have already developed what’s commonly called “VR legs,” aren’t for everyone, and can bring on the dreaded ‘hot sweats’ that precede sim-sickness.

“I’m happy to say we’re on the forefront of trying to tackle these problems head on. In three or four years time we’ll be able to look back and all of this will be a memory,” Kelion said.


nDreams, developers of the upcoming multi-headset game The Assembly, are keenly aware that the quickly approaching consumer-level VR headsets will bring with them VR veterans and first-timers alike. In response, they’re integrating a number of different control schemes that they hope will let anyone play The Assembly comfortably.

The Assembly and VR Comfort Mode 

The game begins as I’m being rolled through the heart of the Great Basin Desert in Nevada. I’m upright and strapped to a gurney Hannibal Lecter-style, a mechanic Kelion tells me will “cajole people into using their head as the camera.” Because locomotion is intentionally on rails for the first chapter, my only choice is to observe the scene around me.

We come to the mouth of The Assembly's bunker, the underground home to a clandestine group of amoral scientists and researchers, the sort of people that ran the Japanese Unit 731 back in WW2. A man and woman talk openly about who I am, and how long they think I'll last in the group. We roll closer and closer, eventually passing a dead crow laying face down in the sand. I'm scanned by two security cameras and let into a service elevator past heavy blast doors, of course never glimpsing my captors, who from the dialogue sound like they were fans of my medical work. Aha, so I'm a doctor. We go down several levels, passing open windows showing patients, each successively worse off than the last. Coming to a halt, we roll out of the elevator shaft and into a vast complex. They've already said too much, and they drug me once more.


For the next chapter, Senior Designer Jackie Tetley gave me the choice. Did I want to play traditional style, where the gamepad's right stick controls yaw (a choice John Carmack calls "VR poison"), or did I want to give the new VR comfort mode a spin?

Choosing comfort mode, I was dropped back into the Bunker, this time in a closed-off lab complete with emergency eye washers, and medical storage of all types. Here I can move around and test it out.

See Also: Stanford Unveils 'Light Field Stereoscope', a VR Headset to Reduce Fatigue and Nausea

As opposed to traditional FPS gamepad control schemes, the right stick on the PS4 gamepad now acted as a 'snap-to' camera, rotating me instantly 90 degrees to my left or right. Although not ideal for maintaining a contiguous feel in the world, my stomach thanked me for the option to forgo the sickening turns that I know first-hand can turn you off VR for the rest of the day.
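
The snap-turn idea is simple to express in code: rather than integrating stick input into a smooth rotation (which produces vection), yaw jumps in fixed increments, with a debounce so one flick of the stick equals one turn. A minimal sketch of the general technique, not nDreams' implementation:

```python
# Minimal snap-turn logic: discrete 90-degree yaw steps instead of smooth
# rotation. The re-arm flag ensures one flick of the stick equals one turn.

SNAP_DEGREES = 90.0
STICK_THRESHOLD = 0.7  # how far the stick must be pushed to register

class SnapTurner:
    def __init__(self):
        self.yaw = 0.0
        self._stick_was_centered = True

    def update(self, stick_x: float) -> float:
        if abs(stick_x) < STICK_THRESHOLD:
            self._stick_was_centered = True  # re-arm for the next flick
        elif self._stick_was_centered:
            self._stick_was_centered = False
            # Instantaneous jump: no intermediate frames, so no vection.
            self.yaw = (self.yaw + SNAP_DEGREES * (1 if stick_x > 0 else -1)) % 360.0
        return self.yaw

turner = SnapTurner()
print(turner.update(1.0))   # 90.0 (flick right)
print(turner.update(1.0))   # 90.0 (still held: no repeat)
print(turner.update(0.0))   # 90.0 (centered: re-armed)
print(turner.update(-1.0))  # 0.0  (flick left)
```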

In comfort mode, head tracking also gently pulls my character left or right according to where I'm looking, much as you would turn in real life. This is much more subtle, and eventually faded into the background as I explored the space for any clue of how to get out, and of who I was.


The last comfort mode mechanic was a teleport option that lets you pre-select the way you want to face, so that you can jump to anywhere and immediately look the direction you need to. Tetley tells me that it still isn’t perfected, as the mode can still break the game if you teleport on top of boxes or other items, since the game doesn’t incorporate jumping. I can’t say I liked using it personally, because it brings up (necessarily so) an icon, which in the current build is a floating metal ball studded with arrows. Select the arrow with your right stick, and change the distance of it with your left. The floating icon was easy to use, and teleporting was by no means jarring, but the icon itself was physically intrusive in the space. Thankfully though, all of these control schemes are optional, and can be ignored or utilized according to individual preferences.

Kelion and Tetley assured me that none of this is final however, as they want to include a wider range of movement controls in the game at launch to satisfy more gameplay styles.


But what about large tracking volumes that let you actually walk in the virtual space, a la HTC Vive and the new Oculus Rift? Isn’t that a sort of ‘VR comfort mode’ too? Kelion responded:

“The truth is when it comes to the HTC Vive, this is something we’ve been implementing for control pads for Oculus, and for Playstation Morpheus. We’ve had the hardware for longer. We also know the Oculus is shipping with the pad and not Touch, and we don’t even know what the Morpheus is shipping with. We have to assume in terms of utility that we need to make it work well on the gamepad.”

Although not in the current build, nDreams doesn’t put HTC’s Lighthouse tracked controllers or Oculus’ Touch out of the realm of possibility.

nDreams has targeted all major VR headsets for The Assembly‘s release, including HTC Vive, Oculus Rift, and Playstation Morpheus.

For an updated gameplay close-up with commentary by Senior Designer Jackie Tetley, take a look at the video below.


Cloudhead Games’ ‘Blink’ to Bring Nausea-free VR to HTC Vive this Holiday

Cloudhead Games, the developers behind one of the first successfully Kickstarted dedicated VR games, have announced a technique called 'Blink', a locomotion system they claim eliminates nausea entirely from VR gaming.

VR hardware is improving, and display techniques such as low persistence, pioneered by Oculus and Valve over just the last couple of years, mean that viewing imagery inside VR headsets is a whole lot more comfortable these days.


But now that nausea and VR sickness brought on by lacklustre hardware are quickly becoming a thing of the past, why is nausea still an issue for today's VR game designers? It turns out that once you've removed all hardware barriers to perceiving virtual worlds, you're left with much more fundamental issues to deal with at the software design level, and chief among them is locomotion inside VR.

The Oculus Rift DK2 brought us optical positional tracking, freeing our heads and bodies from a purely rotationally tracked virtual world. The system allowed free movement within the confines of the tracking camera’s FOV, which when positioned correctly was relatively generous. The HTC Vive and its Lighthouse laser tracking system expanded upon that freedom, introducing what they termed ‘room scale’ tracking – pitching their VR system as one you could explore both physical and virtual spaces with.

The Oculus Rift DK2 and Positional Tracking Camera

But no matter how good your tracking is, you'll always be constrained by the physical space your body occupies, especially when it comes to moving around a virtual space with boundaries beyond your physical environment. This means designers must find non-natural ways to move players around their virtual worlds. Using a gamepad's analogue sticks is one traditional gaming technique of course, but one that causes jarring feelings of disconnection in virtual reality. Essentially, your inner ear disagrees with your eyes: you're moving in virtual space but not in physical space.

HTC Vive and Room-scale tracking

Cloudhead Games, who have been grappling with the difficulties and intricacies of VR input since their appearance on the scene back in 2013, now claim they have a solution to VR nausea, physical playspace restrictions and locomotion with a system they’re calling ‘Blink’.

Subtitled 'Elastic VR Playspace', the solution comprises a number of techniques (as outlined in the video above, presented by Cloudhead Games' Denny Unger) which, when combined, tackle all of the above issues: Cinematic Blink, Precision Blink and Volume Blink, a "locomotion mechanic which ensures player safety, deep traversal and complete spatial awareness in both the virtual and real world."

Essentially, Blink employs a point, click and teleport system which removes the uncomfortable elements of virtual travel by instantly snapping you to a position and orientation you define, sidestepping the issue entirely. The technique works hand in hand with Cloudhead's elastic playspace, a system which subtly reminds the player of their custom-sized playspace through visual cues within the game.
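
At its core, a point-click-teleport mechanic overwrites the player's transform in a single frame rather than interpolating toward the target; it's the absence of intermediate motion frames that avoids the vestibular conflict. A bare-bones sketch of the idea (our own illustration, not Cloudhead's code):

```python
# Bare-bones teleport locomotion: the camera rig's position (and optionally
# its facing) is overwritten in a single frame. Because no intermediate
# positions are ever rendered, the eyes never report motion the inner ear
# can't feel.

class PlayerRig:
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)  # x, y, z in metres
        self.yaw = 0.0                   # facing, in degrees

    def blink_to(self, target, facing=None):
        self.position = target           # one-frame jump: no vection
        if facing is not None:
            self.yaw = facing            # optionally snap orientation too

rig = PlayerRig()
rig.blink_to((4.0, 0.0, -2.5), facing=180.0)  # point, click, arrive
print(rig.position, rig.yaw)
```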

The techniques described in the video will debut in the first episode of Cloudhead's forthcoming 'The Gallery' saga. This episode, newly revealed as 'Call of the Starseed', is scheduled for release alongside the HTC Vive, currently pegged for delivery towards the end of 2015.

You can find out more about The Gallery saga over at the game’s website here.


Manus Machina Begins to Ship Early Wireless VR Gloves to Developers

Manus Machina, a company dedicated to virtual reality input devices, has begun shipping their first sets of wireless VR gloves to developers.

Roaming the halls of Birmingham’s NEC covering EGX 2015 today, I bumped into the Manus Machina team, who gave me an impromptu demo of their prototype wireless input gloves, and an update on their development progress.

Manus’ PR Officer Bob Vlemmix happily informed me that the team had begun shipping their early developer kits to a selection of developers, meaning the work on content creation for the devices can now begin.

Manus’ latest developer wireless input gloves being packed for shipping

Scott Hayden went hands-in with the new gloves back at Gamescom in August; you can check out his impressions here. The devices use a combination of per-finger sensor strips and IMUs to track your digits and hands in virtual space.

According to the team, developer interest in the product has picked up significantly, with some big names potentially interested in working with the startup.

There are hardware improvements too, in the form of a new IMU which reduces yaw drift, an effect that can throw off your virtual hand’s orientation until recalibrated.
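
Yaw drift is the classic weakness of IMU-only orientation: gyro bias integrates into a slowly growing heading error that gravity can't correct, since an accelerometer fixes pitch and roll but not rotation about the vertical axis. One standard mitigation, sketched below, is to blend the integrated gyro heading with an absolute reference such as a magnetometer; whether Manus' new IMU works exactly this way is our assumption:

```python
# Illustrative complementary filter for yaw: integrate the gyro for
# responsiveness, then nudge the result toward an absolute heading reference
# (e.g. a magnetometer) to cancel accumulating drift.

ALPHA = 0.98  # trust the gyro 98%, the reference 2%, each update

def wrap_degrees(angle: float) -> float:
    return (angle + 180.0) % 360.0 - 180.0

def update_yaw(yaw: float, gyro_rate: float, dt: float, mag_yaw: float) -> float:
    """One filter step.

    yaw:       current estimate (degrees)
    gyro_rate: angular velocity about the vertical axis (degrees/second)
    dt:        time since last update (seconds)
    mag_yaw:   absolute heading from the magnetometer (degrees)
    """
    integrated = yaw + gyro_rate * dt
    # Blend along the shortest angular path so 179 and -179 stay neighbours.
    error = wrap_degrees(mag_yaw - integrated)
    return wrap_degrees(integrated + (1.0 - ALPHA) * error)

yaw = 0.0
for _ in range(100):  # stationary hand, but the gyro reports a 1 deg/s bias
    yaw = update_yaw(yaw, gyro_rate=1.0, dt=0.01, mag_yaw=0.0)
print(f"{yaw:.2f}")  # stays bounded (~0.4 deg); uncorrected it grows forever
```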

All in all, it was a positive update, and despite some known issues with positional accuracy the team seem to be making good progress.


New Oculus Touch ‘Toybox’ Videos Show Gestures, Sock Puppets, Shrink Rays, and More

Oculus have described Toybox as their internal test bed for the company's 'Touch' motion input controllers. The company is polishing up the experience to give players a sandbox environment that shows off both multiplayer VR and the capability of the Touch controllers. New videos from Oculus give us a glimpse inside Toybox.

Oculus first showed off Toybox with their Touch controllers at E3 2015 where the combination of multiplayer social interaction and intuitive motion input wowed those lucky enough to try it. At the time the company wasn’t releasing any official footage of the experience.

See Also: Hands-on – Oculus Touch is an Elegant Extension of your Hand for Touching Virtual Worlds

At the company's second annual Connect developer conference, Oculus ran Touch demos through the night, allowing a much larger swath of developers to try Toybox and other Touch experiences.


Inside Toybox, players see one another as blue, body-less, Rift-wearing avatars that hover over virtual table tops piled high with a myriad of objects, all there to demonstrate to you, the user, the joy of accurate, naturalistic virtual reality input.

However, Toybox is much more than a mere technical showcase for Oculus' excellent proprietary input devices; it goes a long way to counter the widely held opinion that virtual reality is a solitary, isolating experience, one in which the player shuts themselves off from the real world and its inhabitants. In Toybox, with voice comms in place, the enjoyment of this virtual playground is enhanced immeasurably by the presence of your Oculus Touch-wielding companion.

Up to now, Oculus have been extremely shy about journalists capturing footage of the ‘programmer art’ filled demonstration. But now, two videos direct from Oculus not only demonstrate what Toybox looks like, they allow you to see how real world actions are interpreted by Oculus Touch.

Two-handed interactions are fluid and intuitive, and Touch's unique gesture sensing features are on show here, allowing in-game hands to represent articulation in both thumb and forefinger. As trivial as that may sound, you can see from the footage that these gestures not only enhance communication in VR, they also elevate the sense of shared presence in the virtual world, providing subtle, humanistic cues that pull you into the world.

Oculus Touch will ship separately from the Rift, the latter arriving in Q1 2016 with the former following in Q2. Emphasis on Oculus Touch and its capabilities was, predictably, very heavy at Oculus' recent Connect developer conference, with new games from Oculus Studios and Epic demonstrating the unique experiences dedicated VR input devices can provide.


Candid Reactions to the Oculus Toybox Demo and Social Presence

The Oculus Touch Toybox demo was shown to more people in a single day at Oculus Connect 2 on September 23rd than ever before. This was the first time that a lot of developers were able to get their hands on the 'Half Moon' prototype VR input controllers. More importantly, it was a watershed moment for the many developers able to experience social and emotional presence with another person within virtual reality. It became less about the technology and tech specs, and more about the experience of playing, having fun, and connecting with another human in ways that were never possible before. The Toybox demo felt like a real turning point and "aha!" moment for a lot of VR developers, showing how compelling social experiences in VR are going to be. I had a chance to capture candid reactions from Ken Nichols and Ela Darling moments after they experienced Toybox for the first time.

Listen:

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.


Overlaid Videos Show Just How Accurate Oculus Touch Controls Are

The Oculus Touch tech demonstration 'Toybox' made its public video debut a couple of days ago, highlighting the sheer fun that can be had in VR with intuitive motion controllers. Those videos used a side-by-side in-game and real-world POV technique to illustrate how Touch can be used; in this latest video, the two views are overlaid on top of each other. It illustrates quite elegantly the accuracy with which Oculus' Touch controllers respond to real-world input.

Thanks to Scott McGregor for creating the video.

See Also: Hands-on: Oculus Touch is an Elegant Extension of your Hand for Touching Virtual Worlds


Manus VR Experiment with Valve’s Lighthouse to Track VR Gloves

The Manus VR team demonstrate their latest experiment, utilising Valve’s laser-based Lighthouse system to track their in-development VR glove.

Manus VR (previously Manus Machina), the company from Eindhoven, Netherlands dedicated to building VR input devices, seem to have gained momentum in 2015. They secured their first round of seed funding and have shipped early units to developers, and now their R&D efforts have extended to Valve's laser-based tracking solution Lighthouse, as used in the forthcoming HTC Vive headset and SteamVR controllers.

See Also: 10 Things You Didn't Know About Steam VR's Lighthouse Tracking System

The Manus VR team seem to have cannibalised a set of SteamVR controllers, leveraging the positional tracking of the wrist-mounted units to augment Manus VR's existing glove-mounted IMUs. Last time I tried the system, the finger joint detection was pretty good, but the Gear VR camera-based positional tracking understandably struggled with latency and accuracy. The experience on show here seems immeasurably better, perhaps unsurprisingly.


It'll be interesting to see where Manus take this experimentation from here. The architect of the Lighthouse tracking system, Valve's Alan Yates, has said the intention is that anyone who wants to license Lighthouse should be able to, as long as said use isn't implemented in a way that "violates the standard". Manus are one of the first companies working towards doing just that; we'll be watching to see how easy they find the process.



Samsung to Demo Gear VR Motion Controller ‘rink’ at CES 2016

The Samsung and Oculus engineered Gear VR mobile headset is now available at retail in many countries, but unlike its desktop brethren from Oculus and Valve/HTC, it currently relies on basic, standard input for its applications. Samsung look to be changing that in 2016, as they prepare to demo a new motion controller designed for more intuitive control in virtual reality. It's called Rink, and it'll be shown at CES next week.

With the first generation of consumer VR headsets flooding into the market in the first half of 2016, the question of input inside virtual spaces will loom large next year. Valve have their Lighthouse-tracked SteamVR controllers, due to ship with HTC's Vive in April 2016, and Oculus aren't far behind with their optically tracked Touch controllers in Q2 2016. Up until now though, those looking to interact with Gear VR content have had to be content with traditional wireless gamepads, gaze control mechanisms driven by the head's orientation, or the unit's integrated touchpad.


It seems that Samsung are keenly aware of this shortfall in delivering a compelling mobile VR experience, as they're due to demonstrate a prototype motion controller at CES 2016 designed to bridge the gap. It's called 'Rink', and we don't know too much about it at this stage, other than that it's developed by Samsung's C-Labs R&D division.

What we can ascertain or guess (as you can see from the image above) is that it looks to be a wireless, single-handed peripheral into which your hand slips, as if holding a very large cup or mug. Samsung's news release doesn't offer many details, simply describing the device thus:

rink is an advanced hand-motion controller for mobile VR devices which offers a more intuitive and nuanced way to interact with the virtual world. The ability to intuitively control the game or content just by using their hands provides consumers with a much deeper level of mobile VR immersion.

The released image invites further speculation though, as the wearer seems to have an additional box mounted atop the Gear VR headset, which might be an optical device used for tracking the peripheral, or perhaps a wireless hub that processes input and passes it on to the phone powering the Gear VR experience. All wild speculation at this point of course, and we'd love to hear your thoughts and theories in the comments section.

Either way, we’ll be at CES 2016 next week to try and find as many details as possible and hopefully get our hands on the new device.


Close Up With ‘Rink’, Gear VR’s Prototype Motion Controller

We caught up with the developers of Rink, a new wireless motion controller that uses a combination of magnetic field enabled positional tracking and infra-red finger tracking.

Here's a closer look at the new motion controllers, which we wrote about recently, in development as part of the Samsung C-Labs incubator program, an initiative to encourage exploration and development of new technologies. It's the brainchild of Yongjin Cho, a Senior Engineer at Samsung's Creativity Lab.

We have a more detailed hands-on coming soon, but we thought you might like to take a high resolution look at the controllers, which are surprisingly well finished for prototype hardware.

The Rink system comprises two handheld 'money clip' style controllers and one basestation. The two hand units connect to the Gear VR via Bluetooth LE, while the basestation is, in essence, a dumb magnetic field generator. The handheld controllers do most of the work, using the relative strength of the magnetic field generated by the basestation to gauge their position in space.
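
The physics being exploited here: a magnetic dipole's field strength falls off with the cube of distance, so a measured field magnitude can be inverted into a range estimate from the basestation. A simplified single-axis illustration with invented constants (real systems use multi-axis coils to recover full 3D position, not just range):

```python
# Simplified range-from-field-strength estimate. A magnetic dipole's field
# magnitude falls off as 1/r^3, so distance can be recovered from a measured
# field reading given a one-off calibration constant. Constants are invented
# for illustration and are not Samsung's.

CALIBRATION = 8.0e-5  # field reading (arbitrary units) measured at 1 metre

def distance_from_field(reading: float) -> float:
    """Invert the 1/r^3 dipole falloff: reading = CALIBRATION / r^3."""
    return (CALIBRATION / reading) ** (1.0 / 3.0)

print(f"{distance_from_field(8.0e-5):.2f} m")  # 1.00 m
print(f"{distance_from_field(6.4e-4):.2f} m")  # 0.50 m (8x stronger field)
```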


In addition, the Rink controllers have IR LEDs and sensors that fire towards your fingers and are able to detect the flexing of your digits.

We'll be back with our impressions later, but in the meantime we'll leave you with these high-res shots to pore over.


Using Your Body as an Input Controller with OBE Immersive

Linda Lobato is the CEO and co-founder of OBE Immersive, a wearable tech start-up that is part of the current Rothenberg River Program. She previously raised $77,000 to kickstart a MIDI controller jacket, and after seeing an early Oculus prototype in Korea she decided that VR was the next frontier for wearable technology. I caught up with Linda at a Rothenberg demo day, where she talked about their progress in creating a jacket that turns your body into an immersive input controller within a first-person shooter. It's still in the early stages of development, but they hope to launch a Kickstarter later this year.

LISTEN TO THE VOICES OF VR PODCAST

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.


Multiplayer ‘Siege VR’ Prototype Highlights Solid STEM Tracking Performance

At GDC 2016 we got to take a look at the latest STEM VR controllers from Sixense. With units purportedly on track to begin shipping next month, the company is also committing resources to create a promising cross-platform multiplayer experience called Siege VR which could turn into something much bigger than a mere demo.

Throughout STEM's unfortunately troubled development timeline, one thing has surprised us: Sixense has always had highly functional and fairly well polished demo experiences for their motion controllers. That probably comes with the territory; after all, the company designed the tech behind the Razer Hydra, which hit the market years before motion input controllers like Oculus Touch and the Vive controllers had even been announced. They also created the Portal 2: In Motion DLC, which brought 20 new levels specifically built for motion controllers to the game.

Sixense has been melding VR and motion controls since the DK1 days

So I suppose it shouldn’t be surprising after all that the many little tech demos they’ve made over the years to show off motion input mechanics have felt ahead of their time (see this one from 2014).

With that in mind, it was great news to hear at GDC 2016 last week that the company not only plans to finally ship the first STEM units to backers of their 2013 Kickstarter campaign, but they’re also developing a new game called Siege VR that’s going to have cross-platform support among headsets and motion controllers.

I Don’t Care What Platform It’s on, I Just Want to Play Siege VR

Unlike many of the short STEM experiences we’ve seen from Sixense over the years, Siege VR is more than a demo. What we saw at GDC was a prototype of what the company says will become a full game, which will be free to all backers of the STEM Kickstarter and will also be made available for the Oculus Rift, HTC Vive, and PlayStation VR (whether that’s using each platform’s own VR controllers or STEM).

See Also: Sixense Releases 5 STEM Demos and SDK Compatible with Razer Hydra

Siege VR is a first-person multiplayer castle defense game which (in prototype form) had me and an ally wielding bows side-by-side, trying to stop hordes of enemies from reaching the castle gates. In addition to shooting regular arrows, there are two special arrow types: an explosive arrow (from the quiver seen on the left wall), which when ignited by a nearby torch explodes on impact; and an artillery arrow, which fires a smoke round that designates a target for your friendly catapult to fire upon. The former is great for taking out dangerous groups (including enemy archers who will aim for you directly) and the latter works against enemy catapults. The special arrow types regenerate over time, but you'll want to use them sparingly, especially as the artillery arrow stockpile is shared between both players.

The game is still very early, but the creators say they’re considering having many more than two players all working together to defend the castle. Perhaps—they conjectured—there would be teammates at more forward towers who could aim back at the castle to prevent enemies from scaling the walls, while the players on the wall itself would be focused on enemies in the field. Maybe—it was suggested—it could be a multi-stage experience where, if the enemies break through the main gate, you and your team fall back to using melee weapons.

Some earlier prototypes included the ability to pour buckets of boiling oil onto would-be castle crashers, though that and some other features were cut for the time being to add a bit of simplicity and polish for the GDC demo.

Between the lot of us excitedly chattering about ‘what about [insert super cool gameplay]? Or how about [more super cool gameplay]?’ it was clear that Siege VR could have legs well beyond a simple demo, and that’s where Sixense says they plan to take it.

Forgetting It’s There is a Good Thing

As I played Siege VR using STEM with a Rift DK2, I got that wonderful feeling of forgetting about the technology and simply having fun playing the game. That means that everything was working together to make a fun and intuitive experience which kept me immersed. When I came out, the top of my mind was filled not with questions about STEM, but about the scope and potential of Siege VR.

STEM motion input controller with three tracking modules

STEM itself was integral to getting us to the stage of talking not about limitations, but about possibilities for Siege VR. I've used the system at many points along its oft-delayed development, and while it's always felt good, this time around it felt better than at any point in the past, even after using Touch and Vive controllers all week throughout the rest of GDC.

For me the thing that moved the needle most significantly was the headtracking performance. STEM has additional tracking modules which can be affixed to the head and feet (or elsewhere, for up to 10 tracked points). For their demos Sixense often eschews the Rift's own headtracking in favor of a STEM tracking module. Having used the Rift plenty, it always felt like there was something a little 'off' about the STEM-based headtracking, whether in latency or positional accuracy I'm not quite sure. But this time around I actually had to ask whether they were using the Rift's tracking camera or STEM: it was 100% STEM.

I point to headtracking because it's easier to tell when something isn't right with your head than with your hands; slight drift of a few millimeters or minor inaccuracy in your hands can be very hard to spot. When the placement of your virtual eyes depends entirely on the tracking though, it's easy to feel when things aren't working right. So what I'm saying is that because the headtracking was solid, the rest of STEM's tracking is solid too (as there's no difference between tracking a module on your head and a module on your foot).

Particularly in an electromagnetically dense setting—like, say, the middle of the GDC expo floor—which can mess with the magnetically-based tracking, it was impressive that the headtracking felt that good. In fact, Sixense’s booth had a number of STEM basestations scattered about; there was one just a few feet away from the one I was using, and I didn’t spot any interference-based tracking issues despite competing magnetic fields.

Sixense isn't trying to quarantine itself from the competition either. The company had both the Oculus Rift and HTC Vive (with Vive controllers) in action at their booth, and says their SixenseVR SDK will allow developers to create games that are ready for any motion controllers, not just STEM. The SDK allows for "code-free setup" of humanoid characters in Unity and Unreal Engine, and provides a full body skeletal pose based on sensor data.


Manus VR’s Gloves in Action Using Valve’s Lighthouse Tracking for Hands and Arms

We’ve been tracking Manus VR, the start-up dedicated to producing an intuitive VR glove input device, for a while now. The team were present at GDC in March, showing their latest prototype glove along with their in-house developed game Pillow’s Willow.

One of the privileges of writing for Road to VR is watching small, independent start-ups with great ideas blossom rapidly inside the VR space. Manus VR is one such company, having moved from crude-looking, Gear VR based early designs through to slick, functional iterations, as shown at March's GDC running on desktop PC hardware alongside the HTC Vive.

Early Manus Glove Prototypes

Manus VR brought their latest gloves along to the show, together with an early version of their showcase game featuring integrated support for the gloves; you can see footage in the embedded video at the top of this page.

Manus is using Lighthouse integration, albeit somewhat crudely at this stage, by literally strapping a SteamVR controller to each of the player's arms. Positional tracking from those controllers is then fused with input from each finger to provide a fairly intuitive-looking way to interact. If you check the video embedded above, you'll see the player grab objects from the world, rotate them, and pass them from hand to hand, all pretty seamlessly.
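
Conceptually, the fusion is straightforward: the SteamVR controller supplies an absolute wrist pose, and the glove's per-finger bend readings are layered on top in the wrist's local frame. A toy 2D sketch of that composition (our own illustration, not Manus' SDK):

```python
import math

# Toy composition of Lighthouse wrist tracking with glove finger data: the
# wrist pose is absolute (from the strapped-on SteamVR controller), and each
# fingertip is placed relative to it using the glove's bend reading. A 2D
# single-finger model keeps the arithmetic readable.

FINGER_LENGTH = 0.08  # metres, illustrative

def fingertip_world(wrist_xy, wrist_yaw_deg, finger_bend):
    """Place a fingertip in world space.

    wrist_xy:      absolute wrist position from Lighthouse tracking
    wrist_yaw_deg: absolute wrist heading from Lighthouse tracking
    finger_bend:   glove sensor reading, 0.0 (straight) .. 1.0 (fully curled)
    """
    # Bend curls the finger up to 90 degrees away from the wrist heading.
    angle = math.radians(wrist_yaw_deg - 90.0 * finger_bend)
    return (wrist_xy[0] + FINGER_LENGTH * math.cos(angle),
            wrist_xy[1] + FINGER_LENGTH * math.sin(angle))

print(fingertip_world((0.0, 0.0), 0.0, 0.0))  # straight: 8 cm along heading
print(fingertip_world((0.0, 0.0), 0.0, 1.0))  # curled: rotated 90 degrees
```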

Manus have also recently released a sneak preview of their research into tying those disembodied hands to arms, for a less disconcerting projection of your digital self in VR, as seen in the video below.

Manus VR have already announced that their first developer kit edition gloves will ship in Q3 2016 and will set you back $250. Developers can put themselves down for a pre-order reservation now, and for your money you'll receive a pair of Manus gloves, plugins for Unity and Unreal Engine, and access to the SDK for Android, Windows 7+, Linux, iOS 9, and Mac OSX.

