Table Mounted Leap pt 1


I now realise that most of the next few paragraphs are wrong. This was a post I was writing before meeting with Holger and Tobias; the new info starts a bit further down.

I'm currently working on a table-mounted version of the Leap Motion system. Getting the controller to allow this was relatively trivial; the bigger challenge was creating a real-world coordinate system so that I could place objects in the scene relative to where things are in the real world.

I did a little bit of research into how this could be done using the Oculus Virtual Reality extension, which provides an interface to the hardware. This looked familiar, and I remembered that I'd seen Sam use it in his demo; after revisiting Sam's demo it looks like he used it primarily to generate the bounds of the scene. I got as far as locating the position transform of the Oculus sensor at runtime, which I thought I could base everything else around. My thinking was that this would give me a more flexible program that would generate the scene based on where the sensor was placed. This ended up being very complicated, and I realised that the sensor wasn't likely to move from its current position (mounted to the desktop), so I abandoned that and went for a more "hard-coded" approach.

After an embarrassing amount of googling, I found that 1 Unity unit is equivalent to 1 metre (makes sense). Using this, I took a bunch of measurements of my desk area so that I could place the desk at the right height, length, distance etc. away from the player.

I kept in my original code for showing the sensor at runtime so that I could get a rough idea of whether the measurements looked right.

I also want to generate the Leap bounds + interaction box that Sam uses in his demo, however I don't really understand the maths behind it, and again the SDK has changed since he wrote his code, so I am finding it quite difficult to convert between the two.

This was a post I was mid-writing before my last meeting with Holger + Tobi on Friday. I now realise I was on the completely wrong path hahaha, so I'm just going to scrap that and start a new post:

So the meeting raised a few things to look into.

Coordinate Systems:
There are two types of coordinate systems used in the software I am using: left-handed and right-handed. In both, X and Y increase in the same direction (up = + and right = +); however, in a left-handed system Z increases away from the user, while in a right-handed system Z increases towards the user.

[Image: left-hand vs right-hand coordinate system]


  • Unity uses a left-handed coordinate system
  • Leap Motion uses a right-handed coordinate system
  • Oculus Rift uses a right-handed coordinate system
So from my understanding, below would be an example of the different coordinates for one position in space, assuming that the origin for all coordinate systems is placed at the centre of the Leap Motion. I'm not 100% sure this is correct, but using this logic the matrix multiplication to convert between the two would look something like this, I think:

[x, y, z] multiplied by a 3x3 identity matrix with a negative one in the third column:

  [1  0  0]
  [0  1  0]
  [0  0 -1]

This would mean that a transformation of [x, y, z] would give [x, y, -z]. I think this could also be done by just doing vector.z *= -1.0, though.
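To convince myself the matrix form and the shortcut agree, here is a minimal Python sketch of the z-flip (the helper names are my own, not a real Leap or Unity API, and it assumes the only difference between the two systems is the direction of the Z axis):

```python
# Identity matrix with -1 in the third diagonal entry,
# as described above.
FLIP_Z = [
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, -1.0],
]

def mat_vec(matrix, vector):
    # Plain 3x3 matrix-vector multiply.
    return tuple(
        sum(matrix[i][j] * vector[j] for j in range(3))
        for i in range(3)
    )

def flip_handedness(point):
    # The shortcut version: just negate z.
    x, y, z = point
    return (x, y, -z)

point = (1.0, 2.0, 3.0)
assert mat_vec(FLIP_Z, point) == flip_handedness(point) == (1.0, 2.0, -3.0)
```

Both versions give the same result, so the `*= -1.0` shortcut should be fine in practice.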



I'm not sure if I'm on the right track with this line of thought, though, and again I'm not sure where I would apply these transformations.

This formula is backed up by Leap Motion's coordinate-mapping documentation, found here:
https://developer.leapmotion.com/documentation/csharp/devguide/Leap_Coordinate_Mapping.html

Something that I don't quite understand is that LM uses mm and Unity measures in m, so with that thinking, wouldn't a point in Unity of (1, 0.5, 0.02) translate to (1000, 500, -20) in LM? There is no mention of this from Leap, though, so I'm wondering if I'm over-complicating it.
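If both assumptions hold (the mm-to-m unit difference plus the z-flip, with both origins at the centre of the sensor), the combined conversion would look something like this sketch. The function names are my own, not part of either SDK:

```python
# Assumed: Leap reports millimetres in a right-handed system,
# Unity uses metres in a left-handed one, and both origins sit
# at the centre of the sensor.
MM_PER_M = 1000.0

def unity_to_leap(point_unity):
    # metres -> millimetres, plus the handedness z-flip
    x, y, z = point_unity
    return (x * MM_PER_M, y * MM_PER_M, -z * MM_PER_M)

def leap_to_unity(point_leap):
    # millimetres -> metres, plus the handedness z-flip
    x, y, z = point_leap
    return (x / MM_PER_M, y / MM_PER_M, -z / MM_PER_M)
```

With this, the Unity point (1, 0.5, 0.02) comes out as roughly (1000, 500, -20) in Leap coordinates (up to floating-point rounding), which matches the reasoning above.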

As far as the physical environment goes, I began to set up my desk as a more permanent space to use. I measured out a 1m by 0.7m space and put the sensor in the middle (centre of the sensor at 0.5m), and took the height measurement of the sensor at 0.98m. However, my main problem stopping me from moving forward right now is that I don't understand where and how the origin point is chosen. This seems like a relatively trivial thing, but I can't wrap my head around it.

Say I want to mark a point on my desk as the origin: I could then make measurements from it and line everything up accordingly, but I don't know how to find the exact corresponding position in Unity. I know the sensor position is something I should use, and the origin appears to be roughly lined up with the sensor, but it's nothing exact. I think I will have to talk to someone, maybe Chris, because he is back now after the meeting.
