Cross-species 3D virtual reality toolbox for visual and cognitive experiments
Published in: Journal of Neuroscience Methods, 2016-06, Vol. 266, p. 84-93
Format: Article
Language: English
Online access: Full text
Summary:
• This toolbox adds VR capability to any pre-existing data acquisition framework.
• Cross-species usage, from rodents to humans, is supported.
• Possible paradigms range from simple search to complex contextual learning.
• Can be paired with eye tracking and electrophysiological recording.
• Minimizes implementation costs and does not require specific hardware.
Although simplified visual stimuli, such as dots or gratings presented on homogeneous backgrounds, provide strict control over stimulus parameters during visual experiments, they fail to approximate visual stimulation in natural conditions. Adoption of virtual reality (VR) in neuroscience research has been proposed to circumvent this problem by combining strict control of experimental variables with behavioral monitoring within complex and realistic environments.
We have created a VR toolbox that maximizes experimental flexibility while minimizing implementation costs. A free VR engine (Unreal Engine 3) has been customized to interface with any control software via text commands, allowing seamless integration into pre-existing laboratory data acquisition frameworks. Furthermore, control functions are provided for the two programming languages most commonly used in visual neuroscience: Matlab and Python.
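The abstract does not specify the command syntax or transport, so the following Python sketch only illustrates the general idea of driving a VR engine through plain-text commands from a data acquisition script; the TCP socket, port number, and command names are assumptions for illustration, not the toolbox's actual API.

```python
# Minimal sketch of controlling a VR engine via text commands from Python.
# The transport (TCP), port, newline-terminated protocol, and command names
# below are assumptions; consult the toolbox documentation for the real API.
import socket

class VRSession:
    def __init__(self, host="localhost", port=5000):
        # Assume the customized engine listens on a plain-text command socket.
        self.sock = socket.create_connection((host, port))

    def send(self, command):
        # Send one command and return the engine's reply (assumed convention).
        self.sock.sendall((command + "\n").encode("ascii"))
        return self.sock.recv(1024).decode("ascii").strip()

    def close(self):
        self.sock.close()

if __name__ == "__main__":
    vr = VRSession()
    # Hypothetical commands: load an environment, place the subject, start a trial.
    vr.send("LOAD_LEVEL LinearTrack")
    vr.send("SET_POSITION 0 0 100")
    vr.send("START_TRIAL 1")
    vr.close()
```

Because the interface is reduced to text commands, an equivalent Matlab client would only need a TCP/IP object and the same command strings, which is what makes integration into existing acquisition frameworks straightforward.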
The toolbox offers the millisecond time resolution necessary for electrophysiological recordings and is flexible enough to support cross-species usage across a wide range of paradigms.
Unlike previously proposed VR solutions, whose implementation is complex and time-consuming, our toolbox requires minimal customization or technical expertise to interface with pre-existing data acquisition frameworks, as it relies on programming environments that are already familiar. Moreover, because it is compatible with a variety of display and input devices, identical VR testing paradigms can be used across species, from rodents to humans.
This toolbox facilitates the addition of VR capabilities to any laboratory without perturbing pre-existing data acquisition frameworks or requiring any major hardware changes.
ISSN: 0165-0270, 1872-678X
DOI: 10.1016/j.jneumeth.2016.03.009