Gapprox: using Gallup approach for approximation in Big Data processing

Bibliographic details
Published in: Journal of Big Data 2019-02, Vol. 6 (1), p. 1-24, Article 20
Authors: Ahmadvand, Hossein; Goudarzi, Maziar; Foroutan, Fouzhan
Format: Article
Language: English
Online access: Full text
Description
Abstract: As Big Data processing often takes a long time and requires substantial resources, sampling and approximate computing techniques can be used to achieve a desired Quality of Result. However, because they do not consider data variety, existing sample-based approximation approaches suffer from poor accuracy. Data variety is a key feature of Big Data that causes different parts of the data to have different impacts on the final result. To address this problem, we develop a data-variety-aware approximation approach called Gapprox. Our idea is to use a form of cluster sampling to improve the accuracy of estimation. Our approach decreases the amount of data that must be processed to achieve the desired Quality of Result within an acceptable error bound and confidence interval. We divide the input data into blocks based on the intra-cluster and inter-cluster variance. The block size and the sample size are determined so that an acceptable confidence interval and error bound are achieved while processing only a small amount of the input data. We compared our work with two well-known state-of-the-art approaches. The experimental results show that Gapprox surpasses the state of the art, improving processing time by up to 17× compared to ApproxHadoop and 8× compared to Sapprox when the user can tolerate an error of 5% with 95% confidence.
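To illustrate the general idea of cluster (block) sampling with an error bound and confidence interval as described in the abstract, a minimal Python sketch follows. The block size, sample fraction, and function names are illustrative assumptions, not the authors' implementation or the paper's variance-based sizing rule.

import math
import random

def cluster_sample_estimate(records, block_size=1000, sample_fraction=0.1,
                            confidence_z=1.96, seed=0):
    """Estimate the mean of `records` by processing only a sample of blocks.

    Returns (estimate, half_width), where half_width is an approximate
    confidence-interval half width for the chosen z value (1.96 ~ 95%).
    """
    # Partition the input into contiguous blocks (clusters).
    blocks = [records[i:i + block_size] for i in range(0, len(records), block_size)]

    # Sample a fraction of the blocks instead of individual records.
    rng = random.Random(seed)
    n_sample = max(1, int(len(blocks) * sample_fraction))
    sampled = rng.sample(blocks, n_sample)

    # Per-block means; the estimator is their average.
    block_means = [sum(b) / len(b) for b in sampled]
    estimate = sum(block_means) / n_sample

    # Between-block (inter-cluster) variance drives the error bound.
    var_between = sum((m - estimate) ** 2 for m in block_means) / max(1, n_sample - 1)
    half_width = confidence_z * math.sqrt(var_between / n_sample)
    return estimate, half_width

if __name__ == "__main__":
    data = [random.gauss(50, 10) for _ in range(100_000)]
    est, hw = cluster_sample_estimate(data)
    print(f"estimate = {est:.2f} +/- {hw:.2f} (95% confidence)")

In this sketch, only a fraction of the blocks is processed; the inter-cluster variance of the sampled block means determines how wide the confidence interval is, which is why block and sample sizing matter for reaching a target error bound with minimal data.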
ISSN: 2196-1115
DOI: 10.1186/s40537-019-0185-4