Functional arm movements, such as reaching while standing, are planned and executed according to our perception of body position in space relative to environmental objects. The angle from which the environment is viewed is one component used in creating this perception, which suggests that manipulating viewing angle may modulate whole-body movement and thereby affect performance. We tested this hypothesis by comparing the effect of viewing angle on reaching in a virtually generated environment. Eleven young, healthy individuals performed forward and lateral reaches in a virtual environment presented on a flat screen in third-person perspective. Participants saw a computer-generated model (avatar) of themselves standing in a courtyard, facing a semi-circular hedge with flowers. The image was presented at five viewing angles, ranging from seeing the avatar from behind (0°) to viewing it from overhead (90°). Participants attempted to touch the farthest flower possible without losing balance or stepping. Kinematic data were collected to analyze endpoint displacement, arm-postural coordination, and center of mass (COM) displacement. Results showed that reach distance was greatest at viewing angles of approximately 45-77.5°, which are larger than those encountered in analogous real-world situations. Larger reaches were characterized by increased involvement of the leg and trunk segments, altered inter-segmental coordination, and decreased inter-segmental movement time lag. Thus, viewing angle can be a critical visuomotor variable modulating whole-body motor coordination and related functional performance. These results can inform the design of virtual reality games, ergonomic design, teleoperation training, and virtual rehabilitation programs that retrain functional movement in vulnerable individuals.
Published: 2010