Displaying posts categorized under

week13

Summaries for week 13

Moving Objects In Space: Exploiting Proprioception In Virtual-Environment Interaction One of the aims of virtual environments is to provide users with natural ways to interact with interfaces. However, applications of such technologies remain rare despite their importance: first because of the difficulty of manipulating virtual objects, but also because of ill-adapted metaphors. Several […]

Week 13 Summaries

OmniTouch: Wearable Multitouch Interaction Everywhere This paper by Harrison et al. is a more recent, robust (and real) implementation of the Sixth Sense technology by Pranav Mistry et al. The group implemented the system using depth-sensing technology similar to the Kinect. They also mention the work done by Sixth Sense and Interactive Dirt, […]

Week 13 summary

OmniTouch: Wearable Multitouch Interaction Everywhere - Chris Harrison, Hrvoje Benko, and Andrew D. Wilson. OmniTouch is an interactive system that allows multitouch input and projected display on numerous surfaces. The system consists of a depth camera mounted on the shoulder along with a pico projector. The depth camera can detect multiple surfaces and the user’s hand […]

week 13 summaries

OmniTouch: Wearable Multitouch Interaction Everywhere The paper discusses a novel wearable interactive system introduced by Microsoft. The system aims to leverage ubiquitous access to information, and its manipulation and sharing, especially through mobile and cloud devices. However, the most prevalent interfaces available have limited screen real estate and modes of interaction. OmniTouch is an interactive […]

[Summaries Week 13] Exploiting Proprioception, OmniTouch

Moving Objects In Space: Exploiting Proprioception In Virtual-Environment Interaction This paper aims to alleviate the difficulty inherent in interaction in VEs through the ingenious use of one’s awareness of one’s own body (proprioception) as an aid for orientation and spatial-memory tasks. The authors expound the problem that motivates their study, stating that the precise manipulation […]

week 13 summaries

OmniTouch: Wearable Multitouch Interaction Everywhere This paper deals with the shoulder-worn OmniTouch system developed by the authors. The system lets the user turn everyday surfaces into graphical, interactive, multitouch input devices. It comprises three components: a short-range PrimeSense depth camera and a Microvision ShowWX+ laser pico-projector, mounted on a form-fitting metal frame worn on the shoulders. […]

Week 13 Summaries

Moving Objects In Space: Exploiting Proprioception In Virtual-Environment Interaction Manipulation in immersive virtual environments is difficult partly because users must do without the haptic contact with real objects they rely on in the real world to orient themselves and their manipulanda. The paper describes proprioception, a person’s sense of the position and orientation of his […]

Summaries Week 13

Moving Objects In Space: Exploiting Proprioception In Virtual-Environment Interaction Working in a virtual environment with lasers and other pointing and manipulation tools is not as convincing as imagined. Pointing methods are tiring and hard to use because of the lack of haptic feedback. So the authors propose to use the user himself […]

Week 13 Summaries

Moving Objects In Space: Exploiting Proprioception In Virtual-Environment Interaction This paper describes manipulation techniques in virtual environments that exploit the concept of proprioception, a person’s sense of the position and orientation of his body and limbs. There are three forms of body-relative interaction, namely direct manipulation, physical mnemonics, and gestural action. In general […]

OmniTouch: Wearable Multitouch Interaction Everywhere OmniTouch is a system that enables graphical, interactive multitouch input on everyday surfaces. In other words, it has on-the-go interactive capabilities with no calibration. OmniTouch has three main components: a custom short-range PrimeSense depth camera and a Microvision ShowWX+ laser pico-projector; the depth camera and projector are tethered […]

Week 13 Summary

Moving Objects In Space This paper aims to solve the problem of the sense of touch in immersive virtual environments. This is important because users cannot feel the virtual world even though they can see and hear it. To overcome the challenge, the authors use proprioception to build three kinds of interaction, which are […]

Week 13 Summaries

OmniTouch: The paper introduces a novel device from Microsoft. It is a shoulder-worn system that aims to be used as an input/output device enabling multitouch input. It utilizes depth-sensing and projection technologies to enable multitouch operations. Interfaces from the system can be projected onto any surface, ranging from table […]

Week 13 Summaries

Moving Objects in Space: Exploiting Proprioception In Virtual-Environment Interaction: This paper looks into the challenge of manipulating objects in virtual environments. The main hindrance in this space is the lack of haptic contact with real objects. The authors of the paper propose using proprioception as a way to help users deal with the lack […]

Shane’s Week 12 Summaries

OmniTouch: The OmniTouch is a shoulder-mounted device intended to be used as an input/output device. The goal is to utilize different surfaces in a user’s environment as a touch-screen device, much like using your hand instead of a smartphone. It consists of a few different parts. It has a […]

Summaries :

Moving Objects in Space: Exploiting Proprioception In Virtual-Environment Interaction In virtual environments, precise manipulation of objects in 3D worlds is hard for three main reasons. First, in a VE there is hardly any haptic feedback, so it is hard and very tiring for users to get precise results. Moreover, there is also a limitation in the input […]

Ruge’s week 13 summaries

OmniTouch: Wearable Multitouch Interaction Everywhere The OmniTouch paper discusses a prototype technology that uses depth-sensing cameras combined with small-form-factor projectors to create small usable interfaces in the real world. The first component, the depth sensor, was used to reliably determine the location of the user’s hands, fingers, pointers, and surfaces in the near environment. […]

[week 13 discussions-proprioception]

Main paper: Moving Objects in Space: Exploiting Proprioception in Virtual-Environment Interaction http://www.cs.unc.edu/~mine/papers/minecows.pdf This paper describes a VE interface that utilizes proprioception. Because of the lack of haptic contact, manipulating objects in immersive virtual environments is difficult. By investigating body-relative interaction, the authors present a set of gestural actions that map to grabbing, […]

Moving Objects in Space: Exploiting Proprioception In Virtual-Environment Interaction Mine, Brooks, and Sequin. This paper describes a great solution for handling 3DUI in immersive virtual environments. The methods leverage our innate proprioceptive sense: our human ability to sense the relative locations of our body parts. It is the phenomenon that makes it possible […]