9/21/22 – Progress Post

[Image: a picture of hands within the Manus interface]

Getting Started With Haptic Gloves

This week I finally got to pick up the haptic gloves and begin experimenting with them. I had to wait a couple of days to charge them, since I could only charge the gloves while present at ACCAD. I've got individual finger tracking down, and now I have to work on getting the gloves trackable within the Unity client so I can begin creating some experimental environments and scenarios. The gloves are responsive but not especially accurate, which gives me some great info moving forward. My next step will be to get the gloves into Unity, along with HTC Vive trackers attached to the wrists. I think there is a wrist attachment for the Vive trackers, but I'm not certain. Right now the main goal is to get both glove tracking and wrist tracking into Unity to start creating an experience.

Additional Goals For the Coming Weeks

An alternative that I've been thinking about involves ditching the Vive tracker system and focusing instead on the gloves plus color tracking through computer vision to determine the position of objects and the wrist. The benefit would be avoiding the difficulties of object tracking with Vive trackers: you cannot place your hand on top of a tracker, and it needs to be mounted in clear view of the base stations responsible for reading and interpreting its signal. Color tracking, however, has the potential to handle object tracking through brightly colored objects monitored by a camera setup. That way, even if the user is covering part of an object, a little visible color will still allow the cameras to see it and track its position by resolving three axes of identification.

This experimentation is a little easier because I'm already familiar with the color tracking and grid system within Isadora. The experiment would consist of a transparent surface with a camera underneath, a camera facing the x axis, and a camera facing the y axis. The information from each camera would be translated into a value on a grid, which would then be fed into Unity as x, y, z coordinate information. I plan on roping in a computer science student when the time comes to explore that method. Right now my main focus is to get wrist and finger tracking information into Unity and working.
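To make the three-camera idea concrete, here is a minimal sketch of how the readings might be fused. It assumes each camera reports a normalized 2D grid position (0 to 1) for the tracked color blob, and that the function name and axis assignments are hypothetical, not part of any Isadora or Unity API. Since each axis is seen by two cameras, the redundant readings can simply be averaged:

```python
def fuse_camera_readings(bottom, facing_x, facing_y):
    """Combine three 2D color-tracking readings into one (x, y, z) position.

    Assumed axis assignments (hypothetical):
      - bottom camera, under the transparent surface, sees (x, y)
      - camera facing the x axis sees (y, z)
      - camera facing the y axis sees (x, z)

    Each axis is observed by two cameras, so the two readings
    are averaged to smooth out disagreement between them.
    """
    bx, by = bottom
    fy, fz1 = facing_x
    fx, fz2 = facing_y
    x = (bx + fx) / 2
    y = (by + fy) / 2
    z = (fz1 + fz2) / 2
    return (x, y, z)

# Example: all three cameras agree the object is centered, at low height.
position = fuse_camera_readings(
    bottom=(0.5, 0.5),
    facing_x=(0.5, 0.2),
    facing_y=(0.5, 0.2),
)
```

The resulting tuple is the kind of x, y, z value that could then be streamed into Unity (e.g., over OSC or a socket) to drive an object's transform.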
