“Arms” in VR and eye convergence


Earlier this year, in January, my friends and I were watching a live stream organised by Nintendo where they announced their new console – the Switch. They revealed one title that really caught my eye: Arms.

 

When we saw the trailer, the first thing we thought was that it has to feel great in VR! Having these Inspector-Gadget-like extendable arms, trying to hit something at a distance using your Joy-Cons. The part I was most interested in was the extendable arm itself. I did something similar about 2 years ago when I was working on Dr. Panda Space. The main scene of the Space game served as a hub for several locations. The player used a spaceship to travel between different planets that represent mini-games. There is one particular part of the hub that was heavily inspired by this scene from a Star Wars movie:

 


We added a planet with a worm. When you fly close enough, the worm comes out of its cave and starts to follow your ship by extending its body. When the worm hits your spaceship with its head, it pushes you away. Since we were targeting small children with this game, our style was much cuter and the hit didn't punish the player; it just added more fun!


Recently I bought a Nintendo Switch, and after completing Zelda (almost 100%) I can't wait to play the new Mario Kart and, of course, Arms! Unfortunately, I haven't had a chance to try either of them yet. Nintendo somehow avoids marketing their products in the part of the world I live in right now…
I took some time on Saturday evening to see how the Arms mechanic would work in VR. But again, I haven't tried the real game yet!

For this small project I used Unity 5.5 and an HTC Vive as my VR headset.
Here is my result!

Extendable Arms

Design

In real life, when you try to punch a bag (or a person – hahaha) you probably clench your fists. That's why I mapped the extending action to the grip button on the Vive controller.

In front of the player's camera I placed a dummy. The rules are very simple. When you press the grip button on either controller, the punching glove starts to move forward. The current forward direction is taken from the rotation of the controller. When you release the grip button, when the arm reaches its maximum length, or when the glove hits the dummy, the glove comes back following the same path.
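As a minimal sketch of how the grip input can drive this, assuming the classic SteamVR Unity plugin (1.x) used around Unity 5.5 – the component name ArmInput and the StartPunch/Retract methods are illustrative, not the actual names from my repo:

```csharp
using UnityEngine;

// Illustrative grip-to-punch input, assuming the classic SteamVR plugin (1.x).
// ArmInput, arm, StartPunch and Retract are made-up names for this sketch.
[RequireComponent(typeof(SteamVR_TrackedObject))]
public class ArmInput : MonoBehaviour
{
    public Arm arm;                              // the Arm this controller drives
    private SteamVR_TrackedObject trackedObject;

    void Awake()
    {
        trackedObject = GetComponent<SteamVR_TrackedObject>();
    }

    void Update()
    {
        var device = SteamVR_Controller.Input((int)trackedObject.index);

        // Start extending when the grip is pressed, retract once it is released.
        if (device.GetPressDown(SteamVR_Controller.ButtonMask.Grip))
            arm.StartPunch();
        if (device.GetPressUp(SteamVR_Controller.ButtonMask.Grip))
            arm.Retract();
    }
}
```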

Visuals

Because my time was very limited, I tried to use the easiest approach possible. Unity has this awesome Line Renderer component (the name is quite self-explanatory). It renders a line with a specified material that goes through a number of points, either in local or world space. This component was a perfect candidate for a Saturday one-man game jam. I decided to use a few assets from the Unity Asset Store: boxing gloves, a dummy and a skybox.
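Just to illustrate the idea (this is my own sketch, not code from the repo), feeding the arm's points into a Line Renderer looks roughly like this. Note that newer Unity versions expose positionCount, while Unity 5.5 used the numPositions property instead:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative helper that pushes a list of world-space points into a LineRenderer.
// On Unity 5.5, replace positionCount with numPositions.
[RequireComponent(typeof(LineRenderer))]
public class ArmLine : MonoBehaviour
{
    private LineRenderer lineRenderer;

    void Awake()
    {
        lineRenderer = GetComponent<LineRenderer>();
        lineRenderer.useWorldSpace = true;   // the arm's points are in world space
    }

    public void UpdateLine(List<Vector3> points)
    {
        lineRenderer.positionCount = points.Count;
        for (int i = 0; i < points.Count; i++)
            lineRenderer.SetPosition(i, points[i]);
    }
}
```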

Code

The coding part of this project is minimal. There are just three classes:

  1. Glove.cs
    The only purpose of this class is to check whether the glove hit the dummy. If it did, an OnHit event is triggered.
  2. Arm.cs
    This is where I put all the logic. The Arm has 3 states: Idle, Following and CommingBack. It contains a list of points that the line goes through. When the Arm enters the Following state, it moves forward along the forward vector of the controller's orientation. During the CommingBack state it uses this list as a lookup for where it should go. I decided to add a new point to the line only if the angle formed at the last point (between the previous segment and the current direction) is bigger than minAngleToAddLinePoint (a rough sketch of this logic follows after the list).
  3. Dummy.cs
    When the glove hits it, it plays a wiggle animation.
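Here is a rough, self-contained reconstruction of the Arm state machine as described above. It is my own sketch under those assumptions, not the code from the repo; names like speed, maxLength, StartPunch and Retract are illustrative:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Rough reconstruction of the Arm logic described above; not the repo's actual code.
public class Arm : MonoBehaviour
{
    public enum State { Idle, Following, CommingBack }

    public Transform controller;              // drives the forward direction
    public Transform glove;                   // boxing glove at the tip of the line
    public float speed = 6f;                  // extension/retraction speed (m/s)
    public float maxLength = 10f;             // retract after reaching this length
    public float minAngleToAddLinePoint = 5f; // degrees

    private State state = State.Idle;
    private readonly List<Vector3> points = new List<Vector3>();

    public void StartPunch()
    {
        points.Clear();
        points.Add(controller.position);
        points.Add(controller.position);      // the last point is the moving tip
        state = State.Following;
    }

    public void Retract()
    {
        if (state == State.Following) state = State.CommingBack;
    }

    void Update()
    {
        if (state == State.Following)
        {
            // Move the tip along the controller's current forward vector.
            Vector3 tip = points[points.Count - 1] + controller.forward * speed * Time.deltaTime;

            // Add a new point only when the direction changed enough;
            // otherwise just slide the last point forward.
            Vector3 lastSegment = points[points.Count - 1] - points[points.Count - 2];
            if (Vector3.Angle(lastSegment, controller.forward) > minAngleToAddLinePoint)
                points.Add(tip);
            else
                points[points.Count - 1] = tip;

            glove.position = tip;
            if (TotalLength() > maxLength) state = State.CommingBack;
        }
        else if (state == State.CommingBack)
        {
            // Walk back through the recorded points so the glove retraces its path.
            Vector3 target = points[points.Count - 2];
            Vector3 tip = Vector3.MoveTowards(points[points.Count - 1], target, speed * Time.deltaTime);
            points[points.Count - 1] = tip;
            glove.position = tip;

            if (tip == target)
            {
                points.RemoveAt(points.Count - 1);
                if (points.Count < 2) state = State.Idle;
            }
        }
    }

    float TotalLength()
    {
        float length = 0f;
        for (int i = 1; i < points.Count; i++)
            length += Vector3.Distance(points[i - 1], points[i]);
        return length;
    }
}
```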

 

Distance to Dummy

During a short test I discovered that it's really hard to judge how far away the dummy actually is. The scene is very empty, so there are no visual cues that can help the player. Bringing the dummy a little bit closer increased my hitting accuracy a lot! It's quite obvious, isn't it? Something is closer to you, so it's easier to hit, right? I don't like to leave my questions without answers, so I will try to explain what's going on here.
There are two binocular cues that contribute to it: stereopsis and convergence. We (and other animals) experience them because we have two eyes placed a few centimeters apart. When the image projected onto each retina is seen from a slightly different angle, we call it stereopsis or retinal (binocular) disparity. Convergence is even more interesting. The extraocular muscles (the ones that rotate your eyeballs) tense as the eyes turn inward to fixate on a near target, and this muscular feedback gives the brain a cue about how far away the target is.


To illustrate the topic and understand it better, I created a simple demo scene that is included in the project (name: Convergenece). It lets you see the angle of an eyeball depending on the distance to a target:


 

 

I checked some values of this angle for an interpupillary distance of 6.4 cm (the distance between the centres of the pupils). You can run it yourself and experiment with different distances. I did tests for some common values that I believe are plausible in real life. You can see them on the chart below. Please note that a logarithmic scale was applied:

 

A target distance of only 5 cm resulted in an angle of around 57.38°. For 10 cm it's already ~72.26°, for 50 cm ~86.34°, 1 m ~88.12°, 5 m ~89.63° and 10 m ~89.82°.
It means that whether the target we are looking at is 10 meters away from us or at infinity, the difference is only about 0.18°. In other words, the effect of convergence is negligible for objects further than about 10 meters away and is strongest in your close proximity.
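These numbers follow from simple trigonometry: each eye sits half of the interpupillary distance from the midline, so the angle between the line connecting the eyes and the line of sight to a centred target at distance d is atan(d / (ipd / 2)). A quick sketch of my own (not part of the project) that reproduces the chart values, up to small rounding differences:

```csharp
using System;

// Quick check of the convergence angles quoted above (not part of the project).
class ConvergenceAngles
{
    static void Main()
    {
        const double halfIpd = 0.064 / 2.0;                          // 6.4 cm IPD
        double[] distances = { 0.05, 0.10, 0.50, 1.0, 5.0, 10.0 };   // meters

        foreach (double d in distances)
        {
            // Angle between the interocular axis and the line of sight
            // to a target centred in front of the viewer at distance d.
            double angle = Math.Atan(d / halfIpd) * 180.0 / Math.PI;
            Console.WriteLine($"{d,5} m -> {angle:F2} deg");
        }
        // Output comes out close to the values quoted in the text,
        // from ~57.38 deg at 5 cm up to ~89.82 deg at 10 m.
    }
}
```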

 

Project Files

Please feel free to check out or fork my GitHub repo. Because the art assets came from the Asset Store, they have been replaced with placeholders in the repo.

Thank you very much!!! 🙂

 

 

 
