10th of January Update Post



Sorry for the radio silence over the past month. A few weeks went by where I felt like I didn't make enough progress to warrant a blog post, but now, after writing a very long update, I realize I probably should have been posting at least my failed attempts at fixing the problems I mentioned in the last post.

That being said, prepare for a bit of a novel this time round, and I'll try to be more on to it for future blogs.

System as it currently stands:



Only now, after reading back over my previous blogs, have I realized that I haven't actually written about the new rigged hands (I started a draft and then stopped) that I implemented from the Hands Module.

The way that I implemented this seems like a bit of a roundabout way of doing it, but it is how Leap Motion recommends doing it.

So basically each controller has a hand pool, which is essentially an array of hand representations mapped to a single hand. The main purpose behind this, I believe, is so that you can have a graphics hand model, a physics hand model and even an attachment hand all as part of one hand shown on the screen. So it's kind of like adding and removing features of the hand.

What I did was create, for each controller (mirrored and normal), matching hand pools containing 6 elements, where each element consists of a left hand/right hand pairing, as you can see on the left. Then for each element I created the combinations I needed (Rigged Right Hand, Debug Left, etc.) until I had the six combinations I wanted.

From there I created a script that cycles between the different hand models in the pool, so that I could turn hands on and off to create what was shown before.
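The cycling script could look something like the minimal sketch below. This is not my actual script; it assumes the Core Assets `HandPool` API (`EnableGroup`/`DisableGroup`), and the group names and key binding are placeholders:

```csharp
using UnityEngine;
using Leap.Unity;

// Sketch: cycle through named hand-model groups in a HandPool, enabling one
// pairing at a time. Group names here are made up; they would match whatever
// the pool's elements are actually called in the scene.
public class HandModelCycler : MonoBehaviour
{
    public HandPool pool;                  // the controller's hand pool
    public string[] groupNames;            // e.g. { "RiggedHands", "DebugHands", ... }
    private int current = 0;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
        {
            pool.DisableGroup(groupNames[current]);       // hide the current pairing
            current = (current + 1) % groupNames.Length;  // wrap around the pool
            pool.EnableGroup(groupNames[current]);        // show the next pairing
        }
    }
}
```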

I saw this technique in one of the example scenes that Leap Motion provides, and I imagine it's how I'll swap between models later in the game.


Drifting Problem:

The hands on the left are where the actual hands are in the real world, and the hands on the right show how the hands are positioned once the head is turned to the right.

So I have been looking into the mysterious drifting hands for a while now and have made next to no progress (which, looking back, is quite funny considering how naively optimistic my last blog post sounded about solving it). As a reminder, the issue is that when you move your head, the mirrored hand in Unity space follows it rather than accurately reflecting where the hand is in real-world space. Even with a combination of some of the greatest minds available to me in the HCI lab (Jonny, Jonathon and Tobias), we couldn't quite work out how to fix it.

We can assume what is occurring with the normal warping is that:

  1. Start with hands directly in front of the head mounted leap
  2. Move head to the right but keep actual hands exactly where they are. 
  3. The display must be updated according to the head transform, so calculate how much the head has moved and apply the opposite transformation to the visualization of the hands, creating the illusion that the hands have stayed in the same place in space.
The drifting problem then comes directly from mirroring. The mirroring is done by taking the frames that the Leap controller reads in (the raw data) and mirroring that data before the representations are created. It's done this way because any attempt to mirror later seemed to be overridden somewhere within the Leap system and wouldn't affect the hand representations. (So basically we had to change the data that the hand representations were based on; we couldn't change the hand representations themselves.)

The drifting occurs because when we move the head to the right, instead of applying an equal and opposite transform to the mirrored hand to recalculate its location within Unity space, the system seems to apply the same transform to the hand that the head takes (i.e. the opposite of what it should be doing).
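To make the geometry concrete, here is a speculative sketch of the two transforms involved. None of this is the actual Leap pipeline; it just illustrates the behaviour described above and one plausible explanation for the drift:

```csharp
using UnityEngine;

// Speculative illustration, not the real Leap code.
public static class MirrorMath
{
    // Mirroring the raw tracking data: reflect across the sensor's local
    // YZ plane by negating the local x component, before any hand
    // representations are built from the data.
    public static Vector3 MirrorSensorLocalPoint(Vector3 sensorLocalPoint)
    {
        return new Vector3(-sensorLocalPoint.x, sensorLocalPoint.y, sensorLocalPoint.z);
    }

    // Head compensation: a head-mounted sensor reports points in head-local
    // coordinates, so mapping them into world space through the head's
    // transform is what keeps an unmirrored hand fixed in space as the
    // head turns.
    public static Vector3 SensorToWorld(Vector3 sensorLocalPoint, Transform head)
    {
        return head.TransformPoint(sensorLocalPoint);
    }
}
```

If the mirroring really does happen in sensor-local coordinates, then the mirror plane is attached to the head and rotates with it, so the mirrored hand would get dragged along by head motion even though the original hand stays put. That would be consistent with the drift we observed, though I haven't been able to confirm this is what's happening in the code.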

Knowing this, you would think that solving it would be a relatively easy task; however, even locating where in the code these transformations are applied has proven unnecessarily difficult. Everything is pretty highly coupled: disabling certain methods, or even single lines of code, affects the drifting along with a pile of other things, or conversely affects neither when you would expect it to. On top of this, even understanding what the different parts do and why they are there is its own challenge.

Basically I've learnt the hard way why Karen, Hamza and Sandy drilled detailed commenting into us in COMP160, because Leap has opted for obscure and minimal comments throughout their code, which has turned out to be a massive pain.

Where to from here:

Basically I'm out of ideas on how to fix it with the system as it currently stands. After talking to Holger, we are left with a few temporary fixes:

  1. Rewrite the system so that the Leap Motion is table mounted rather than head mounted, eliminating the problem completely. This is less than ideal, however, because it is very inflexible for movement (the hands have limited space to interact with, and the whole VR world would have to be limited to a small interactive space).
  2. Ignore it for now and come back to it. We may change how the mirroring occurs, and that may fix the problem for us. This is what we decided to do for the time being; we can revisit it later.

Interaction Hands:

The other issue from my last update, which I only briefly summarized, has to do with the physics of the mirrored hands (which obviously aren't working). I can only seem to get one set of hands interacting with buttons and objects at a time. I believe this is largely due to the new module that Leap has released, the Interaction Engine, where they have streamlined the way you interact with objects to make it more natural.

When first using the Interaction Engine module I followed the basic steps in the documentation, but didn't really understand what was going on. The way they explain it, you need the following:
  1. An Interaction Manager which is a prefab that is part of the Interaction Engine package. The purpose of which is to be the overall manager of the hands (and other controllers if you choose).
    "The Interaction Manager receives FixedUpdate from Unity and handles all the internal logic that makes interactions possible, including updating hand/controller data and interaction object data" 
  2. The objects that you will be manipulating need to have the Interaction Behavior script attached to them which allows them to be interacted with.
    "InteractionBehaviours are components that enable GameObjects to interact with interaction controllers (InteractionControllerBase) in a physically intuitive way. By default, they represent objects that can be poked, prodded, smacked, grasped, and thrown around by Interaction controllers, including Leap hands."
  3. Interaction controllers, which I will get to in a bit. 
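Steps 1 and 2 above can be sketched roughly as follows. This is a minimal illustration, assuming the Interaction Engine's `InteractionBehaviour` component (from `Leap.Unity.Interaction`) can be added at runtime; in practice you would normally just add it in the inspector:

```csharp
using UnityEngine;
using Leap.Unity.Interaction;

// Sketch: make an object interactable, given an InteractionManager prefab
// already in the scene handling FixedUpdate and the hand/controller data.
public class MakeGrabbable : MonoBehaviour
{
    void Start()
    {
        // InteractionBehaviour needs a Rigidbody (and a Collider) to be
        // poked, grasped and thrown by the Interaction controllers.
        if (GetComponent<Rigidbody>() == null) gameObject.AddComponent<Rigidbody>();
        if (GetComponent<Collider>() == null) gameObject.AddComponent<BoxCollider>();
        gameObject.AddComponent<InteractionBehaviour>();
    }
}
```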
My first few tries consisted mainly of restructuring how I'd laid out my scene, adding the mirrored hands under the same parent as the originals, duplicating the Interaction Manager, etc., but nothing worked. Mainly due to a poor understanding on my behalf and average documentation on Leap's behalf.


After some more investigation I realized that the part I should probably focus on is the Interaction Controllers. At the moment I have two controllers, one for the left hand and one for the right. Each controller has a field for choosing the Data Mode, either left or right, which I assume it reads as raw data, as shown in the image below:


So the first obvious thing to try was duplicating these so that I had four controllers; however, I quickly realised that the way the Manager is set up doesn't allow duplicate controllers to get data from the same data mode.



The only other option is to pick the custom data mode and specify the accessor functions manually via script. I suspect this is the path I should follow when implementing this, but it's currently on the back burner.
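If I do go down that path, I imagine it would look something like the sketch below. This is entirely speculative: I haven't verified the exact API, the field name `handAccessorFunc` is my guess from skimming the Interaction Engine source, and `MirroredFrameProvider` is a hypothetical stand-in for wherever the mirrored frame data actually lives:

```csharp
using Leap;
using Leap.Unity.Interaction;
using UnityEngine;

// Speculative sketch of the "custom data mode" idea: point a duplicated
// InteractionHand at a function that returns the mirrored hand data,
// instead of the default left/right data feed.
public class MirroredHandFeed : MonoBehaviour
{
    public InteractionHand interactionHand;     // the duplicated controller
    public MirroredFrameProvider mirrorSource;  // hypothetical source of mirrored frames

    void Start()
    {
        // Hand the controller a delegate that fetches the mirrored left hand
        // from the most recent mirrored frame each physics step.
        interactionHand.handAccessorFunc = () => mirrorSource.CurrentMirroredLeftHand();
    }
}
```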

Just as an aside, another option would be to look into how interaction was facilitated before the relatively new Interaction Engine module was introduced; however, it seems a shame not to take full advantage of the module.

Schools 1 & 2

After meeting with Holger today, he explained the two different schools of thought around mirror therapy. The first focuses on the actual hands, where it would be important to implement rigged hands. The second is more about the hands interacting with things. For now he suggested that I focus on one of the schools of thought, get that working, and come back to the other.

We decided to go with rigged hands for now, which means the interaction problem is one I can leave and potentially tackle in the future.

This means talking to Chris when he gets back about how he has implemented his hands (something to do with mirroring the hands in the shader) and learning how to actually use these rigged hands. Hopefully, by changing how the mirroring is done, we can cancel out the drifting problem and can focus on implementing all the different combinations of hands that are possible.

In the meantime I think I'll watch some tutorials on UV mapping in a 3D modelling application and go back to the cardboard box I started near the beginning of the scholarship. This is so I can be a little more prepared for how texturing the hands is going to work.


I also realise that I am supposed to be keeping a rough track of my hours; however, because I've been slacking on the blog posts, I'm a bit behind on recording them. I believe the end of this week marks roughly 5 weeks (about 200 hours) since I started. I will try to keep a better record for the last half of the studentship.





