On bias, variance, overfitting, gold standard and consensus in single‐particle analysis by cryo‐electron microscopy
Published in: Acta Crystallographica Section D, Biological Crystallography, 2022-04, Vol. 78(4), pp. 410-423
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Cryo-electron microscopy (cryoEM) has become a well-established technique for elucidating the 3D structures of biological macromolecules. Projection images from thousands of macromolecules that are assumed to be structurally identical are combined into a single 3D map representing the Coulomb potential of the macromolecule under study. This article discusses possible caveats along the image-processing path and how to avoid them in order to obtain a reliable 3D structure. Some of these problems are very well known in the community and may be referred to as sample-related (such as specimen denaturation at interfaces, or a non-uniform projection geometry leading to underrepresented projection directions). The rest are related to the algorithms used. While some have been discussed in depth in the literature, such as the use of an incorrect initial volume, others have received much less attention, even though they are fundamental to any data-analysis approach. Chief among these are instabilities in estimating many of the key parameters required for a correct 3D reconstruction, which occur all along the processing workflow and may significantly affect the reliability of the whole process. In the field, the term overfitting has been coined to refer to some particular kinds of artifacts. It is argued that overfitting is a statistical bias in key parameter-estimation steps of the 3D reconstruction process, including intrinsic algorithmic bias. It is also shown that the common tools (Fourier shell correlation) and strategies (the gold standard) normally used to detect or prevent overfitting do not fully protect against it. Instead, it is proposed that the bias that leads to overfitting is much easier to detect at the level of parameter estimation than once the particle images have been combined into a 3D map.
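The Fourier shell correlation (FSC) mentioned in the abstract compares two independently reconstructed half-maps shell by shell in Fourier space. The following toy sketch illustrates the idea on a small 2D "map"; the grid size, signal shape, and noise level are hypothetical choices for illustration only, not the article's data or any cryoEM package's API.

```python
import cmath
import math
import random

random.seed(1)
N = 16  # toy grid size (assumption, for illustration)

def dft2(img):
    """Naive 2D DFT; O(N^4), which is fine for a toy example."""
    F = [[0j] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = 0j
            for x in range(N):
                for y in range(N):
                    s += img[x][y] * cmath.exp(-2j * math.pi * (u * x + v * y) / N)
            F[u][v] = s
    return F

# A smooth "signal" (a broad Gaussian blob); each half-map adds its own
# independent noise, mimicking two independently reconstructed half-maps.
signal = [[math.exp(-((x - N / 2) ** 2 + (y - N / 2) ** 2) / 20.0)
           for y in range(N)] for x in range(N)]
half1 = [[signal[x][y] + random.gauss(0, 0.05) for y in range(N)] for x in range(N)]
half2 = [[signal[x][y] + random.gauss(0, 0.05) for y in range(N)] for x in range(N)]

F1, F2 = dft2(half1), dft2(half2)

def fsc(F1, F2):
    """FSC per radial shell: Re(sum F1*conj(F2)) / sqrt(sum|F1|^2 * sum|F2|^2)."""
    shells = {}
    for u in range(N):
        for v in range(N):
            # Centered frequency coordinates for the radius of each shell.
            fu = u if u <= N // 2 else u - N
            fv = v if v <= N // 2 else v - N
            r = int(round(math.hypot(fu, fv)))
            acc = shells.setdefault(r, [0.0, 0.0, 0.0])
            acc[0] += (F1[u][v] * F2[u][v].conjugate()).real
            acc[1] += abs(F1[u][v]) ** 2
            acc[2] += abs(F2[u][v]) ** 2
    return {r: n / math.sqrt(d1 * d2)
            for r, (n, d1, d2) in sorted(shells.items()) if d1 > 0 and d2 > 0}

curve = fsc(F1, F2)
# The lowest shell is signal-dominated, so its FSC is close to 1; high
# shells are noise-dominated, so their correlation drops.
print(round(curve[0], 3), round(curve[N // 2], 3))
```

Because the two half-maps share the signal but not the noise, the curve decays with frequency; this is exactly why a high FSC at high frequency is taken as evidence of resolved detail, and why correlated noise between the halves (the bias discussed in the article) can inflate it.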
Comparing the results from multiple algorithms (or, at least, from independent executions of the same algorithm) can detect this parameter bias. The multiple executions can then be averaged to give a lower-variance estimate of the underlying parameters.
Single-particle analysis (SPA) by cryo-electron microscopy comprises the estimation of many parameters along its image-processing pipeline. Overfitting observed in SPA is normally due to misestimated parameters, and the only way to identify these is by comparing the estimates of multiple algorithms or, at least, of multiple executions of the same algorithm.
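The variance-reduction claim above follows from the fact that averaging n independent unbiased estimates divides the variance by n. A minimal sketch, in which `run_alignment` is a hypothetical stand-in for one execution of an alignment algorithm (the true angle, noise level, and counts are all illustrative assumptions):

```python
import random
import statistics

random.seed(0)

TRUE_ANGLE = 30.0  # hypothetical ground-truth in-plane rotation (degrees)
NOISE_SD = 5.0     # hypothetical per-execution estimation noise

def run_alignment() -> float:
    """Stand-in for one independent execution of an alignment algorithm
    (assumption: unbiased estimate with Gaussian noise)."""
    return random.gauss(TRUE_ANGLE, NOISE_SD)

n_particles = 500
n_runs = 8

# One execution per particle vs. the average of n_runs independent executions.
single_run = [run_alignment() for _ in range(n_particles)]
averaged = [statistics.mean(run_alignment() for _ in range(n_runs))
            for _ in range(n_particles)]

print(round(statistics.stdev(single_run), 2))  # close to NOISE_SD
print(round(statistics.stdev(averaged), 2))    # close to NOISE_SD / sqrt(n_runs)
```

The spread of the averaged estimates shrinks by roughly the square root of the number of executions, which is the rationale for running the estimation several times and averaging rather than trusting a single execution.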
ISSN: 2059-7983, 0907-4449, 1399-0047
DOI: 10.1107/S2059798322001978