Alternative techniques for the efficient acquisition of haptic data
Saved in:
Main Authors: | , , , , , , |
Format: | Conference Proceedings |
Language: | eng |
Online Access: | Full Text |
Summary: | Immersive environments are those that surround users in an artificial world. These environments consist of a composition of various types of immersidata: unique data types that are combined to render a virtual experience. Acquisition, for storage and future querying, of information describing sessions in these environments is challenging because of the real-time demands and sizeable amounts of data to be managed. In this paper, we summarize a comparison of techniques for achieving the efficient acquisition of one type of immersidata, the haptic data type, which describes the movement, rotation, and force associated with user-directed objects in an immersive environment. In addition to describing a general process for real-time sampling and recording of this type of data, we propose three distinct sampling strategies: fixed, grouped, and adaptive. We conducted several experiments with a real haptic device and found that there are tradeoffs between the accuracy, efficiency, and complexity of implementation for each of the proposed techniques. While it is possible to use any of these approaches for real-time haptic data acquisition, we found that an adaptive sampling strategy provided the most efficiency without significant loss in accuracy. As immersive environments become more complex and contain more haptic sensors, techniques such as adaptive sampling can be useful for improving scalability of real-time data acquisition. |
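The record does not reproduce the paper's algorithms, but the adaptive strategy it favors can be illustrated with a generic sketch: rather than storing every reading at a fixed rate, a sample is stored only when some component (position, rotation, or force) has changed by more than a threshold since the last stored sample. The function name, the thresholds, and the synthetic readings below are all hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of adaptive sampling for a haptic data stream.
# A reading is stored only when at least one component deviates from
# the last stored reading by more than its per-component threshold.

def adaptive_sample(stream, thresholds):
    """Yield only the (timestamp, values) samples worth storing.

    stream: iterable of (timestamp, values) pairs, where values is a
            tuple of haptic readings (e.g. position, rotation, force).
    thresholds: per-component minimum change required to store a sample.
    """
    last = None
    for t, values in stream:
        if last is None or any(
            abs(v - lv) > th
            for v, lv, th in zip(values, last, thresholds)
        ):
            last = values
            yield t, values

# Example: a slowly drifting position with a brief force spike at t=50.
readings = [(i, (0.01 * i, 0.0, 5.0 if i == 50 else 0.0))
            for i in range(100)]
stored = list(adaptive_sample(readings, thresholds=(0.1, 0.1, 1.0)))
```

In this synthetic run, most of the 100 readings are redundant and are dropped, while the force spike at t=50 exceeds its threshold and is retained, which is the efficiency-versus-accuracy tradeoff the abstract describes.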
ISSN: | 0163-5999 |
DOI: | 10.1145/378420.378830 |