Searching for explanations: testing social scientific methods in synthetic ground-truthed worlds

Bibliographic Details
Published in: Computational and Mathematical Organization Theory, 2023-03, Vol. 29 (1), pp. 156-187
Main Authors: Schmidt, Aurora C., Cameron, Christopher J., Lowman, Corey, Brulé, Joshua, Deshpande, Amruta J., Fatemi, Seyyed A., Barash, Vladimir, Greenberg, Ariel M., Costello, Cash J., Sherman, Eli S., Bhattacharya, Rohit, McQuillan, Liz, Perrone, Alexander, Kouskoulas, Yanni A., Fink, Clay, Zhang, June, Shpitser, Ilya, Macy, Michael W.
Format: Article
Language: English
Description
Abstract: A scientific model’s usefulness relies on its ability to explain phenomena, predict how such phenomena will be impacted by future interventions, and prescribe actions to achieve desired outcomes. We study methods for learning causal models that explain the behaviors of simulated “human” populations. Through the Ground Truth project, we solved a series of Challenges in which our explanations, predictions, and prescriptions were scored against ground truth information. We describe the processes that emerged for applying causal discovery, network analysis, agent-based modeling, and other analytical methods to inform solutions to Challenge tasks. We present our team’s overall performance results on these Challenges and discuss implications for future efforts to validate social scientific research using simulation-based challenges.
ISSN: 1381-298X, 1572-9346
DOI: 10.1007/s10588-021-09353-w