PressureVision: Estimating Hand Pressure from a Single RGB Image
Format: Article
Language: English
Abstract: People often interact with their surroundings by applying pressure with their
hands. While hand pressure can be measured by placing pressure sensors between
the hand and the environment, doing so can alter contact mechanics, interfere
with human tactile perception, require costly sensors, and scale poorly to
large environments. We explore the possibility of using a conventional RGB
camera to infer hand pressure, enabling machine perception of hand pressure
from uninstrumented hands and surfaces. The central insight is that the
application of pressure by a hand results in informative appearance changes.
Hands share biomechanical properties that result in similar observable
phenomena, such as soft-tissue deformation, blood distribution, hand pose, and
cast shadows. We collected videos of 36 participants with diverse skin tones
applying pressure to an instrumented planar surface. We then trained a deep
model (PressureVisionNet) to infer a pressure image from a single RGB image.
Our model infers pressure for participants outside of the training data and
outperforms baselines. We also show that the output of our model depends on the
appearance of the hand and cast shadows near contact regions. Overall, our
results suggest the appearance of a previously unobserved human hand can be
used to accurately infer applied pressure. Data, code, and models are available
online.
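The abstract frames hand-pressure estimation as an image-to-image prediction problem: a single RGB frame in, a dense pressure image out. The sketch below illustrates only that input/output structure; it is not the authors' PressureVisionNet (their data, code, and models are available online), and the class name, layer sizes, and 256x256 resolution are assumptions made for the example.

```python
# Illustrative sketch only: a toy fully convolutional encoder-decoder that
# maps a single RGB image to a per-pixel pressure estimate. This is NOT the
# paper's PressureVisionNet; names and layer sizes are hypothetical.
import torch
import torch.nn as nn

class ToyPressureNet(nn.Module):
    """RGB image (3 channels) in, single-channel pressure image out."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1),   # 1/2 resolution
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),  # 1/4 resolution
            nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),  # 1/2
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 1, kernel_size=4, stride=2, padding=1),   # full
            nn.ReLU(inplace=True),  # pressure values are non-negative
        )

    def forward(self, rgb):
        return self.decoder(self.encoder(rgb))

model = ToyPressureNet()
rgb = torch.rand(1, 3, 256, 256)  # one RGB frame of an uninstrumented hand, values in [0, 1]
pressure = model(rgb)             # dense per-pixel pressure estimate
print(pressure.shape)             # torch.Size([1, 1, 256, 256])
```

In the paper's setup, such a model would be supervised by the ground-truth pressure images captured by the instrumented planar surface described in the abstract.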
DOI: 10.48550/arxiv.2203.10385