Hierarchical Bayesian Analyses for Modeling BOLD Time Series Data


Bibliographic Details
Published in: Computational Brain & Behavior, June 2018, Vol. 1(2), pp. 184–213
Main authors: Molloy, M. Fiona; Bahg, Giwon; Li, Xiangrui; Steyvers, Mark; Lu, Zhong-Lin; Turner, Brandon M.
Format: Article
Language: English
Online access: Full text
Description
Abstract: Hierarchical Bayesian analyses have become a popular technique for analyzing complex interactions of important experimental variables. One application where these analyses have great potential is in analyzing neural data. However, estimating parameters for these models can be complicated. Although many software programs facilitate the estimation of parameters within hierarchical Bayesian models, due to some restrictions, complicated workarounds are sometimes necessary to implement a model within the software. One such restriction is convolution, a technique often used in neuroimaging analyses to relate experimental variables to models describing neural activation. Here, we show how to perform convolution within the R programming environment. The strategy is to pass the convolved neural signal to an existing software package for fitting hierarchical Bayesian models, such as JAGS (Plummer 2003) or Stan (Carpenter et al. 2017). We use the convolution technique as a basis for describing neural time series data and develop five models to describe how subject-, condition-, and brain-area-specific effects interact. To provide a concrete example, we apply these models to fMRI data from a stop-signal task. The models are assessed in terms of model fit, parameter constraint, and generalizability. For these data, our results suggest that while subject and condition constraints are important for both fit and generalization, region-of-interest constraints did not substantially improve performance.
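The abstract describes convolving experimental variables with a hemodynamic response to produce a regressor that is then passed to JAGS or Stan as data. The paper implements this in R; as a minimal, language-agnostic sketch of the same technique, the following Python fragment convolves a boxcar of stimulus onsets with a canonical double-gamma HRF. The HRF parameters, onset times, and repetition time here are illustrative assumptions, not the authors' values.

```python
import numpy as np
from math import gamma as gamma_fn

def hrf(t, a1=6.0, a2=16.0, b=1.0, c=1.0 / 6.0):
    """Canonical double-gamma HRF (assumed parameterization):
    a positive gamma peak minus a scaled gamma undershoot."""
    g = lambda tt, a: (tt ** (a - 1) * np.exp(-tt / b)) / (b ** a * gamma_fn(a))
    return g(t, a1) - c * g(t, a2)

TR = 2.0                              # repetition time in seconds (assumption)
n_scans = 100
t = np.arange(0.0, 32.0, TR)          # HRF support, roughly 32 s
h = hrf(t)

# Boxcar design: 1 at each stimulus onset, 0 elsewhere (onsets are illustrative)
stimulus = np.zeros(n_scans)
stimulus[[10, 30, 50, 70]] = 1.0

# Convolve and truncate to the scan length; this regressor is what would be
# handed to JAGS or Stan as fixed data for the hierarchical model.
regressor = np.convolve(stimulus, h)[:n_scans]
```

The key design point mirrored from the abstract: the convolution happens outside the sampler, so the Bayesian model only ever sees a precomputed regressor, sidestepping the restriction that JAGS and Stan make convolution awkward to express internally.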
ISSN: 2522-0861, 2522-087X
DOI: 10.1007/s42113-018-0013-5