Tactile Functasets: Neural Implicit Representations of Tactile Datasets
Format: Article
Language: English
Online access: Order full text
Abstract: Modern incarnations of tactile sensors produce high-dimensional raw sensory feedback such as images, making it challenging to efficiently store, process, and generalize across sensors. To address these concerns, we introduce a novel implicit function representation for tactile sensor feedback. Rather than directly using raw tactile images, we propose neural implicit functions trained to reconstruct the tactile dataset, producing compact representations that capture the underlying structure of the sensory inputs. These representations offer several advantages over their raw counterparts: they are compact, enable probabilistically interpretable inference, and facilitate generalization across different sensors. We demonstrate the efficacy of this representation on the downstream task of in-hand object pose estimation, achieving improved performance over image-based methods while simplifying downstream models. We release code, demos, and datasets at https://www.mmintlab.com/tactile-functasets.
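
The abstract describes fitting neural implicit functions to reconstruct tactile images and using the resulting compact representations downstream, but does not specify an architecture. The following is a minimal, hypothetical PyTorch sketch of the general functa-style idea: a shared coordinate MLP conditioned on a per-image latent code, where the fitted latent serves as the compact representation of a single tactile image. All class names, dimensions, and hyperparameters here are illustrative assumptions, not the paper's actual method.

```python
import torch
import torch.nn as nn

class ImplicitTactileField(nn.Module):
    """Shared coordinate MLP: maps (x, y, latent) -> predicted tactile reading.

    Hypothetical sketch; the paper's actual architecture may differ.
    """
    def __init__(self, latent_dim=64, hidden=128):
        super().__init__()
        self.latent_dim = latent_dim
        self.net = nn.Sequential(
            nn.Linear(2 + latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # predicted intensity at coordinate (x, y)
        )

    def forward(self, coords, latent):
        # coords: (N, 2) pixel coordinates in [-1, 1]; latent: (latent_dim,)
        z = latent.expand(coords.shape[0], -1)
        return self.net(torch.cat([coords, z], dim=-1))

def fit_latent(model, image, steps=500, lr=1e-2):
    """Optimize a per-image latent code so the shared MLP reconstructs `image`.

    The fitted latent is the compact representation of the tactile image.
    """
    h, w = image.shape
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h), torch.linspace(-1, 1, w), indexing="ij")
    coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)
    target = image.reshape(-1, 1)
    latent = torch.zeros(model.latent_dim, requires_grad=True)
    opt = torch.optim.Adam([latent], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(coords, latent), target)
        loss.backward()
        opt.step()
    return latent.detach()
```

In such a setup, the latents fitted across an entire dataset would form the "functaset" that downstream models (e.g., an in-hand object pose estimator) consume in place of raw tactile images.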
DOI: 10.48550/arxiv.2409.14592