Real-time immersive human-computer interaction based on tracking and recognition of dynamic hand gestures.
Doctoral thesis, University of Central Lancashire.
With the fast development and ever-growing use of computer-based technologies, human-computer interaction (HCI) plays an increasingly pivotal role. In virtual reality (VR), HCI technologies provide not only a better understanding of three-dimensional shapes and spaces, but also sensory immersion and physical interaction. Hand-based HCI is a key modality for object manipulation and gesture-based communication, yet providing users with a natural, intuitive, effortless, precise, real-time method of HCI based on dynamic hand gestures is challenging, owing to the complexity of hand postures formed by multiple joints with high degrees of freedom, the speed of hand movements with highly variable trajectories and rapid direction changes, and the precision required for interaction between hands and objects in the virtual world.
Presented in this thesis is the design and development of a novel real-time HCI system based on a unique combination of a pair of data gloves, using fibre-optic curvature sensors to acquire finger joint angles; a hybrid tracking system, using inertial and ultrasonic sensing to capture hand position and orientation; and a stereoscopic display system to provide immersive visual feedback. The potential and effectiveness of the proposed system are demonstrated through a number of applications, namely hand-gesture-based virtual object manipulation and visualisation, hand-gesture-based direct sign writing, and hand-gesture-based finger spelling.
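The per-frame flow implied by this combination of devices can be sketched as follows. This is a minimal illustrative skeleton, not the thesis's actual implementation: the function names, the 14-sensor glove layout, and all returned values are assumptions standing in for the real device interfaces.

```python
def acquire_glove():
    """Placeholder for reading the fibre-optic curvature sensors:
    one flexion angle per sensor, in degrees. A 14-sensor layout and
    the zeroed values are illustrative assumptions."""
    return [0.0] * 14

def acquire_tracker():
    """Placeholder for the hybrid inertial/ultrasonic tracker: hand
    position (metres) and orientation (quaternion w, x, y, z).
    Values here are dummies."""
    return (0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0)

def render(angles, position, orientation):
    """Placeholder for updating the stereoscopic display with the
    current hand state."""
    pass

def run_frames(n_frames: int) -> int:
    """One plausible shape of the real-time loop: sample both devices,
    then refresh the immersive visual feedback, once per frame."""
    processed = 0
    for _ in range(n_frames):
        angles = acquire_glove()
        position, orientation = acquire_tracker()
        render(angles, position, orientation)
        processed += 1
    return processed
```

In a real system each acquire call would block on (or poll) the device driver, and gesture recognition would sit between acquisition and rendering.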
For virtual object manipulation and visualisation, the system is shown to allow a user to select, translate, rotate, scale, release and visualise virtual objects (presented using graphics and volume data) in three-dimensional space using natural hand gestures in real time. For direct sign writing, the system is shown to display immediately the corresponding SignWriting symbols signed by a user using three different signing sequences and a range of complex hand gestures, which consist of various combinations of hand postures (with each finger open, half-bent or closed, and with adduction and abduction), eight hand orientations in the horizontal/vertical planes, three palm-facing directions, and various hand movements (which can have eight directions in the horizontal/vertical planes, and can be repetitive, straight/curved, or clockwise/anti-clockwise). The development includes a special visual interface that gives not only a stereoscopic view of hand gestures and movements, but also structured visual feedback for each stage of the signing sequence. An excellent basis is therefore formed for developing a full HCI system based on all human gestures by integrating the proposed system with facial expression and body posture recognition methods. Furthermore, for finger spelling, the system is shown to recognise in real time five vowels signed by two hands using British Sign Language.
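The mapping from a measured joint angle to the discrete bend states used by such postures can be sketched as below. The threshold values (30° and 60°) are illustrative assumptions, not the thesis's calibrated figures, and adduction/abduction would be handled separately as a lateral-spread measurement.

```python
def classify_finger(flexion_deg: float) -> str:
    """Map a finger's flexion angle (degrees, from a curvature sensor)
    to one of the three bend states described above. The 30/60 degree
    thresholds are assumed values for illustration only."""
    if flexion_deg < 30.0:
        return "open"
    elif flexion_deg < 60.0:
        return "half-bent"
    else:
        return "closed"

# A hand posture is then the tuple of per-finger states; for example,
# a closed fist (all five fingers strongly flexed):
fist = tuple(classify_finger(a) for a in (80.0, 75.0, 82.0, 78.0, 70.0))
# → ("closed", "closed", "closed", "closed", "closed")
```

Recognising a posture then reduces to matching this tuple (plus orientation and palm-facing direction) against a table of known hand shapes.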
|Item Type:||Thesis (Doctoral)|
|Additional Information:||1. Dynamic Hand Gesture Tracking and Recognition for Real-Time Immersive Virtual Object Manipulation. 2009 International Conference on CyberWorlds, pp. 29-35, 2009.
2. Hand motion recognition and visualisation for direct sign writing. 2010 International Conference on Information Visualisation, pp. 467-472, 2010.
3. Immersive manipulation of virtual objects through glove based hand gesture interaction. Virtual Reality, August 2011.|
|Uncontrolled Keywords:||Human-computer interaction; Dynamic hand gesture tracking and recognition; Virtual object manipulation; Direct sign writing|
|Subjects:||Q Science > Q Science (General)|
T Technology > T Technology (General)
T Technology > TJ Mechanical engineering and machinery
|Schools:||School of Computing Engineering & Physical Sciences|
|Depositing User:||Khalil Ahmed Patel|
|Deposited On:||30 Nov 2011 16:57|
|Last Modified:||11 Feb 2014 14:49|