Recent advances in scaling‐down sampling methods in machine learning
Published in: Wiley Interdisciplinary Reviews: Computational Statistics, 2017-11, Vol. 9 (6), p. e1414-n/a
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Summary: Data sampling methods have been investigated for decades in the context of machine learning and statistical algorithms, with significant progress made in the past few years driven by strong interest in big data and distributed computing. Most recently, progress has been made in methods that can be broadly categorized into random sampling, including density‐biased and nonuniform sampling methods; active learning methods, which are a type of semi‐supervised learning and an area of intense research; and progressive sampling methods, which can be viewed as a combination of the above two approaches. A unified view of scaling‐down sampling methods is presented in this article and complemented with descriptions of relevant published literature. WIREs Comput Stat 2017, 9:e1414. doi: 10.1002/wics.1414
This article is categorized under:
Statistical and Graphical Methods of Data Analysis > Sampling
Summary of scaling‐down techniques found in the literature
ISSN: 1939-5108, 1939-0068
DOI: 10.1002/wics.1414
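The abstract groups scaling-down methods into random sampling, active learning, and progressive sampling, with progressive sampling described as a combination of the first two. Below is a minimal sketch of that idea, assuming a scikit-learn-style classifier and a uniform random subsampling step; the geometric sample-size schedule, the plateau tolerance, and the helper name progressive_sample_fit are illustrative assumptions rather than details taken from the article.

```python
# Minimal sketch of progressive sampling: fit on progressively larger random
# subsamples until the validation score stops improving. The geometric
# schedule, uniform subsampling, plateau tolerance, and all names below are
# illustrative assumptions, not details taken from the article.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split


def progressive_sample_fit(X, y, X_val, y_val, n0=100, growth=2.0, tol=1e-3, seed=0):
    """Grow the training subsample geometrically until the learning curve plateaus."""
    rng = np.random.default_rng(seed)
    n, prev_score = n0, -np.inf
    model, score = None, prev_score
    while n <= len(X):
        idx = rng.choice(len(X), size=n, replace=False)    # uniform random subsample
        model = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        score = model.score(X_val, y_val)                   # held-out accuracy
        if score - prev_score <= tol:                       # plateau reached: stop growing
            break
        prev_score, n = score, int(n * growth)              # geometric sample-size schedule
    return model, score


# Usage on synthetic data.
X, y = make_classification(n_samples=20000, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)
model, val_acc = progressive_sample_fit(X_tr, y_tr, X_val, y_val)
print(f"validation accuracy: {val_acc:.3f}")
```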