Control Interface for Driving Interactive Characters in Immersive Virtual Environments
Format: Report
Language: English
Abstract: The effectiveness of training Soldiers in immersive 3D virtual environments is currently limited by character control interfaces that require users to learn actions, such as moving a joystick or pressing a button, that do not necessarily enhance the user's physical performance in equivalent real-world tasks and situations. To address this limitation, an advanced man/machine user interface, known as the Virtual Locomotion Controller (VLC), has been developed. The VLC uses inertial position and orientation, ultrasonic range, and foot force sensors to let users control interactive character movements naturally, through sensorimotor responses that closely resemble the tasks and actions performed in the real world. This paper describes the VLC system architecture, its control logic and associated sensor processing, and the simulation environment used to determine the feasibility of the approach.
See also ADM002075. Presented at the 25th Army Science Conference, Orlando, FL, 27-30 Nov 2006.
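To make the general idea concrete, the sketch below shows one way sensor readings of the kind the abstract mentions (foot force and an inertial heading estimate) could be mapped to a locomotion command for an interactive character. It is a minimal illustration under assumed names, thresholds, and scaling constants; it is not the VLC's actual control logic or sensor processing as described in the report.

```python
# Illustrative sketch only: map hypothetical per-foot force and torso yaw
# readings to a character locomotion command. All names, thresholds, and
# constants here are assumptions, not values from the report.

import math
from dataclasses import dataclass


@dataclass
class SensorFrame:
    left_foot_force: float   # N, from a foot force sensor (assumed)
    right_foot_force: float  # N, from a foot force sensor (assumed)
    torso_yaw_rad: float     # heading from an inertial orientation sensor (assumed)
    step_rate_hz: float      # estimated in-place stepping cadence (assumed)


@dataclass
class LocomotionCommand:
    forward_speed: float  # m/s applied to the interactive character
    heading_rad: float    # world-space heading for the character


# Illustrative constants, not taken from the report.
STANCE_FORCE_THRESHOLD = 200.0  # N: foot treated as planted above this force
SPEED_PER_HZ = 0.75             # m/s of character speed per Hz of cadence
MAX_SPEED = 3.0                 # m/s cap on the commanded speed


def map_frame_to_command(frame: SensorFrame) -> LocomotionCommand:
    """Convert one sensor frame into a character locomotion command."""
    left_planted = frame.left_foot_force > STANCE_FORCE_THRESHOLD
    right_planted = frame.right_foot_force > STANCE_FORCE_THRESHOLD

    if left_planted and right_planted:
        # Both feet planted: treat the user as standing still.
        speed = 0.0
    else:
        # Stepping in place: scale speed with cadence, clamped to a maximum.
        speed = min(MAX_SPEED, SPEED_PER_HZ * frame.step_rate_hz)

    return LocomotionCommand(forward_speed=speed, heading_rad=frame.torso_yaw_rad)


if __name__ == "__main__":
    # Example: user stepping at ~2 Hz while facing 45 degrees.
    frame = SensorFrame(left_foot_force=350.0, right_foot_force=20.0,
                        torso_yaw_rad=math.radians(45.0), step_rate_hz=2.0)
    cmd = map_frame_to_command(frame)
    print(f"speed={cmd.forward_speed:.2f} m/s, "
          f"heading={math.degrees(cmd.heading_rad):.1f} deg")
```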