Evaluating the effectiveness of artifact correction and rejection in event‐related potential research
Published in: Psychophysiology 2024-05, Vol. 61 (5), p. e14511-n/a
Main authors: , , , ,
Format: Article
Language: English
Keywords:
Online access: Full text
Abstract: Eyeblinks and other large artifacts can create two major problems in event-related potential (ERP) research, namely confounds and increased noise. Here, we developed a method for assessing the effectiveness of artifact correction and rejection methods in minimizing these two problems. We then used this method to assess a common artifact minimization approach, in which independent component analysis (ICA) is used to correct ocular artifacts, and artifact rejection is used to reject trials with extreme values resulting from other sources (e.g., movement artifacts). This approach was applied to data from five common ERP components (P3b, N400, N170, mismatch negativity, and error-related negativity). Four common scoring methods (mean amplitude, peak amplitude, peak latency, and 50% area latency) were examined for each component. We found that eyeblinks differed systematically across experimental conditions for several of the components. We also found that artifact correction was reasonably effective at minimizing these confounds, although it did not usually eliminate them completely. In addition, we found that the rejection of trials with extreme voltage values was effective at reducing noise, with the benefits of eliminating these trials outweighing the reduced number of trials available for averaging. For researchers who are analyzing similar ERP components and participant populations, this combination of artifact correction and rejection approaches should minimize artifact-related confounds and lead to improved data quality. Researchers who are analyzing other components or participant populations can use the method developed in this study to determine which artifact minimization approaches are effective in their data.

Blinks and other large artifacts can significantly reduce effect sizes and create confounds. We propose a new method for assessing the effectiveness of artifact correction and rejection in maximizing effect sizes and minimizing confounds. We also apply this method to data from five ERP components to demonstrate that a common combination of artifact correction and rejection is reasonably effective.
ISSN: 0048-5772, 1469-8986, 1540-5958
DOI: 10.1111/psyp.14511
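
The abstract describes a two-step artifact-minimization pipeline (ICA-based correction of ocular artifacts, then rejection of trials with extreme voltages) followed by four scoring methods. The sketch below illustrates one way such a pipeline could look using MNE-Python and NumPy; it is not the authors' implementation, and the file name, channel, thresholds, and 300-500 ms measurement window are hypothetical placeholders.

```python
# Illustrative sketch only: ICA-based ocular correction, rejection of epochs
# with extreme voltages, and the four scoring methods named in the abstract.
import numpy as np
import mne
from mne.preprocessing import ICA

raw = mne.io.read_raw_fif("sub-01_raw.fif", preload=True)  # hypothetical file
raw.filter(l_freq=0.1, h_freq=30.0)                        # typical ERP band-pass

# 1) Artifact correction: remove ICA components that covary with the EOG channel
ica = ICA(n_components=20, random_state=97)
ica.fit(raw)
eog_inds, _ = ica.find_bads_eog(raw)   # assumes an EOG channel was recorded
ica.exclude = eog_inds
raw_corr = ica.apply(raw.copy())

# 2) Artifact rejection: drop trials with extreme voltages from other sources.
#    MNE's `reject` applies a peak-to-peak criterion, used here as a stand-in
#    for an absolute-voltage threshold.
events = mne.find_events(raw_corr)     # assumes a stim/trigger channel
epochs = mne.Epochs(raw_corr, events, tmin=-0.2, tmax=0.8,
                    baseline=(None, 0), reject=dict(eeg=200e-6), preload=True)

# 3) Score the averaged ERP at one channel within a measurement window
evoked = epochs.average()
win = evoked.copy().pick(["Pz"]).crop(tmin=0.3, tmax=0.5)  # hypothetical channel/window
wave, times = win.data[0], win.times

mean_amp = wave.mean()          # mean amplitude
peak_idx = np.argmax(wave)      # assumes a positive-going component (e.g., P3b)
peak_amp = wave[peak_idx]       # peak amplitude
peak_lat = times[peak_idx]      # peak latency

# 50% area latency: the time point at which half of the positive area under the
# waveform within the window has been accumulated
area = np.cumsum(np.clip(wave, 0, None))
half_area_lat = times[np.searchsorted(area, area[-1] / 2)]
```

The same scoring step would be repeated per condition and participant before computing effect sizes, which is where the trade-off between rejecting noisy trials and retaining enough trials for averaging, as discussed in the abstract, becomes visible.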