|Title||Gesture based human motion and game principles to aid understanding of science and cultural practices|
|Authors||Prakash, E., Navarro-Newball, A., Moreno, I., Arya, A., Contreras, V.E., Quiceno, V.A., Lozano, S., Mejía, J.D. and Loaiza, D.F.|
We present a novel approach for recreating life-like experiences through easy and natural gesture-based interaction. By focusing on the locations and transforming the role of the user, we significantly improve the understanding of an ancient cultural practice, behaviour or event compared with traditional approaches. Technology-based virtual environments that display object reconstructions, old landscapes, cultural artefacts and scientific phenomena are coming into vogue. In traditional approaches the user is a visitor navigating through these virtual environments, observing and picking up objects. However, cultural practices and certain behaviours from nature are not normally made explicit, and their dynamics still need to be understood. Thus, our research idea is to bring such practices to life by allowing the user to enact them. This means that the user may re-live a step-by-step process to understand a practice, behaviour or event. Our solution enables the user to enact through gesture-based interaction with sensor-based technologies such as the versatile Kinect, which allows easier and more natural interaction in multidimensional spaces such as museum exhibits. We use heuristic approaches and semantic models to interpret human gestures captured from the user’s skeletal representation. We present and evaluate three applications. For each of the three applications, we integrate these interaction metaphors with gaming elements, thereby achieving a gesture set to enact a cultural practice, behaviour or event. User evaluation experiments revealed that our approach achieved easy and natural interaction with an overall enhanced learning experience.
|Keywords||Gesture, Human motion, Gamification, Museum|
|Journal||Multimedia Tools and Applications|
|Journal citation||pp. 1-24|
|Digital Object Identifier (DOI)||10.1007/s11042-015-2667-5|
|Published||31 May 2015|