Data-Driven Texture Modeling and Rendering on Electrovibration Display

Bibliographic Details
Published in: IEEE Transactions on Haptics, 2020-04, Vol. 13 (2), pp. 298-311
Authors: Osgouei, Reza Haghighi; Kim, Jin Ryong; Choi, Seungmoon
Format: Article
Language: English
Description
Abstract: With the introduction of variable friction displays, either based on ultrasonic or electrovibration technology, new possibilities have emerged in haptic texture rendering on flat surfaces. In this work, we propose a data-driven method for realistic texture rendering on an electrovibration display. We first describe a motorized linear tribometer designed to collect lateral frictional forces from textured surfaces under various scanning velocities and normal forces. We then propose an inverse dynamics model of the display to describe its output-input relationship using nonlinear autoregressive neural networks with external input. Forces resulting from applying a pseudo-random binary signal to the display are used to train each network under the given experimental condition. In addition, we propose a two-step interpolation scheme to estimate actuation signals for arbitrary conditions under which no prior data have been collected. A comparison between real and virtual forces in the frequency domain shows promising results for recreating virtual textures similar to the real ones, also revealing the capabilities and limitations of the proposed method. We also conducted a human user study to compare the performance of our neural-network-based method with that of a record-and-playback method. The results showed that the similarity between the real and virtual textures generated by our approach was significantly higher.
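As a rough illustration of the modeling step, the sketch below (not the authors' code) fits a NARX-style inverse model for a single experimental condition: the network predicts the actuation voltage u[t] of the display from past voltages and a history of the desired lateral force, which serves as the exogenous input. The delay orders, the synthetic PRBS and force data, and the use of scikit-learn's MLPRegressor are illustrative assumptions only.

# Minimal sketch (assumed setup, not the authors' implementation) of a NARX-style
# inverse model: predict the actuation voltage u[t] from past voltages u[t-p..t-1]
# and the lateral force history f[t-q..t], which acts as the exogenous input.
import numpy as np
from sklearn.neural_network import MLPRegressor

def narx_features(u, f, p=5, q=5):
    """Build tapped-delay regressors [u[t-p..t-1], f[t-q..t]] with target u[t]."""
    start = max(p, q)
    X, y = [], []
    for t in range(start, len(u)):
        X.append(np.concatenate([u[t - p:t], f[t - q:t + 1]]))
        y.append(u[t])
    return np.array(X), np.array(y)

# Hypothetical training data for one (scanning velocity, normal force) condition:
# a pseudo-random binary actuation signal and the force it would produce.
rng = np.random.default_rng(0)
u_train = np.where(rng.random(2000) > 0.5, 1.0, -1.0)        # PRBS excitation
f_train = np.convolve(u_train, np.ones(5) / 5, mode="same")  # stand-in display response

X, y = narx_features(u_train, f_train)
inverse_model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
inverse_model.fit(X, y)  # maps force history -> required voltage (inverse dynamics)

In the method summarized above, one such model would be trained for each measured combination of scanning velocity and normal force, and the two-step interpolation scheme would estimate actuation signals for unmeasured conditions from the outputs of neighboring models.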
ISSN: 1939-1412, 2329-4051
DOI: 10.1109/TOH.2019.2932990