Week 13 summary

OmniTouch: Wearable Multitouch Interaction Everywhere

-Chris Harrison, Hrvoje Benko and Andrew D. Wilson

OmniTouch is an interactive system that enables multitouch input and projected display on everyday surfaces. The system consists of a shoulder-mounted depth camera paired with a pico projector. The depth camera can detect multiple surfaces and track the user's hand with respect to them. Using a depth-derivative map and a flood-filling technique, the authors were able to distinguish hovering-finger events from click events on the surface. This effectively makes almost any surface a multitouch screen for interaction. The pico projector can display images on surfaces of varying placement and orientation; the images are warped before projection to compensate for each surface's position and orientation. The authors implemented multiple touch interactions for testing and proof-of-concept purposes. During testing, the system detected finger clicks more than 95% of the time, and the authors also proposed measures to reach 98% accuracy. One drawback they mention is that the projected interface is constrained to the size of the smallest possible surface, which is the hand. However, some clever surface identification or area estimation may be able to remove this drawback.
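The hover-versus-click idea can be sketched in a few lines: flood-fill outward from the fingertip pixel across small depth steps, and report a click only if the fill merges with pixels lying on the surface. This is a toy illustration, not the paper's actual pipeline; the depth values and the `click_mm`/`step_mm` thresholds are made-up numbers, not figures from the paper.

```python
import numpy as np
from collections import deque

def is_touching(depth_map, tip, surface_depth, click_mm=10.0, step_mm=8.0):
    """Toy hover-vs-click test: flood-fill from the fingertip pixel,
    stepping only across small depth changes (<= step_mm). If the fill
    reaches a pixel within click_mm of the surface, the finger is
    treated as touching; otherwise it is hovering."""
    h, w = depth_map.shape
    seen = np.zeros((h, w), dtype=bool)
    queue = deque([tip])
    seen[tip] = True
    while queue:
        y, x = queue.popleft()
        if abs(depth_map[y, x] - surface_depth) <= click_mm:
            return True  # fill merged with the surface -> click
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and not seen[ny, nx]
                    and abs(depth_map[ny, nx] - depth_map[y, x]) <= step_mm):
                seen[ny, nx] = True
                queue.append((ny, nx))
    return False  # fill never reached the surface -> hover

surface = 500.0  # mm from the camera (illustrative)
hover = np.full((5, 5), surface)
hover[2, 2] = 460.0  # fingertip 40 mm above the surface, isolated
touch = np.full((5, 5), surface)
touch[2, 2], touch[2, 3] = 485.0, 493.0  # finger blends into the surface
print(is_touching(hover, (2, 2), surface))  # False
print(is_touching(touch, (2, 2), surface))  # True
```

In the hover case the depth jump from fingertip to surface is too large for the fill to cross, so the tip stays isolated; in the touch case the finger's depth profile blends smoothly into the surface and the fill reaches it.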

The system seems like a very robust implementation of the depth-camera-and-projector interaction techniques that many researchers are trying to perfect. Some aspects of the project remind me of SixthSense. However, the results reported by the authors are more promising and of practical importance.

Moving Objects In Space: Exploiting Proprioception In Virtual-Environment Interaction

-Mark R. Mine, Frederick P. Brooks Jr and Carlo H. Sequin

This paper targets navigation and object manipulation in virtual environments by exploiting proprioception: the human ability to sense the relative position of neighbouring parts of the body and the effort being exerted in movement. The authors cite the lack of proper haptic feedback as one of the primary reasons VR systems have not become mainstream, and they believe proprioception can help circumvent this problem. They propose numerous techniques that leverage this human ability. The scaled-world grab technique allows the user to grab any object in the world; the world scales down so that the object fits in the user's hand, where it can be manipulated by twisting the hand. Furthermore, after grabbing an object the user may pull on it to instantly relocate to the object's position in the virtual world. Hand-held widgets place useful tools on the user's hand, where they are instantly available for use. Tools may also be tucked away, for example just above the user's head, so the user can simply reach up to grab them. Head orientation can drive interaction techniques such as head-butt zoom and look-at menus. The authors also tested their system with a number of users and received promisingly positive feedback.
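The scaled-world grab boils down to a uniform scale about the user's viewpoint: the scale factor is the ratio of the head-to-hand distance to the head-to-object distance, so the grabbed object lands at the hand. Here is a minimal sketch of that geometry; the coordinates are invented, and it assumes the hand lies along the ray from the head to the object (as it does when grabbing):

```python
import numpy as np

def scaled_world_grab(head, hand, obj):
    """Toy scaled-world grab: scale the world uniformly about the head
    so that the grabbed object lands at the hand. Returns the scale
    factor and a function that maps any world point to its scaled
    position."""
    head, hand, obj = (np.asarray(p, dtype=float) for p in (head, hand, obj))
    s = np.linalg.norm(hand - head) / np.linalg.norm(obj - head)

    def scale_point(p):
        # Uniform scale about the head position.
        return head + s * (np.asarray(p, dtype=float) - head)

    return s, scale_point

# Object 5 m away along the gaze, hand 0.5 m away on the same ray.
s, scale_point = scaled_world_grab((0, 0, 0), (0, 0, 0.5), (0, 0, 5.0))
print(s)                        # 0.1
print(scale_point((0, 0, 5.0))) # [0.  0.  0.5] -- the object is now in hand
```

Because the whole world scales about the eye point, the scene looks unchanged to the user; only the reachable geometry shrinks into arm's length.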

I find most of the techniques proposed in the paper quite intuitive. The hand-held widgets metaphor reminded me of the Predator in the Alien vs. Predator game, who uses a similar set of tools. I do agree with the authors that the lack of haptic feedback can be jarring in VR systems, and proprioception seems to provide a good alternative.
