Big Data (R)evolution in Geography: Complexity Modelling in the Last Two Decades
Published in: Geography Compass, 2024-11, Vol. 18 (11), p. n/a
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Summary:
The use of data and statistics along with computational systems heralded the beginning of a quantitative revolution in Geography. Use of simulation models (Cellular Automata and Agent‐Based Models) followed in the late 1990s, with the ontology and epistemology of complexity theory and modelling being defined a little less than two decades ago. We are, however, entering a new era in which sensors regularly collect and update large amounts of spatio‐temporal data. We define this 'Big Data' as geolocated data collected in sufficiently high volume (exceeding the storage capacities of the largest personal hard drives currently available), updated at least daily, from a variety of sources in different formats, often without recourse to verification of its accuracy. We then identify the exponential growth in the use of complexity simulation models over the past two decades via an extensive literature review (broken down by application area), but also note a recent slowdown. Further, a gap in the utilisation of Big Data by modellers to calibrate and validate their models is noted, which we attribute to data availability issues. We contend that Big Data can significantly boost simulation modelling, if certain constraints and issues are managed properly.
ISSN: 1749-8198
DOI: 10.1111/gec3.70009