Estimating the parameters of epidemic spread on two-layer random graphs: a classical and a neural network approach
Format: Article
Language: English
Abstract:
In this paper, we study the spread of a classical SIR process on a two-layer
random network, where the first layer represents the households, while the
second layer models the contacts outside the households by a random scale-free
graph. We build a three-parameter graph, called polynomial model, where the new
vertices are connected to the existing ones either uniformly, or
preferentially, or by forming random triangles. We examine the effect of the
graph's properties on the goodness of the estimation of the infection rate
$\tau$, which is the most important parameter, determining the reproduction
rate of the epidemic.
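The three attachment mechanisms of the polynomial model can be sketched as follows. This is a minimal illustration, not the paper's implementation; the parameter names (`p_unif`, `p_pref`, `p_tri`) and the edge-sampling shortcut for preferential attachment are assumptions made here for clarity.

```python
import random

def polynomial_graph(n, p_unif, p_pref, p_tri, seed=0):
    """Sketch of a three-parameter 'polynomial model' random graph.

    Each new vertex attaches to the existing graph by one of three
    mechanisms, chosen with probabilities p_unif, p_pref, p_tri
    (parameter names are illustrative, not taken from the paper):
      - uniformly at random,
      - preferentially (proportional to degree),
      - by forming a random triangle (joining both ends of a random edge).
    """
    rng = random.Random(seed)
    assert abs(p_unif + p_pref + p_tri - 1.0) < 1e-9
    edges = [(0, 1)]                  # start from a single edge
    degree = {0: 1, 1: 1}
    for v in range(2, n):
        r = rng.random()
        if r < p_unif:
            # uniform attachment: any existing vertex is equally likely
            new_edges = [(v, rng.randrange(v))]
        elif r < p_unif + p_pref:
            # preferential attachment: picking a random endpoint of a
            # random edge selects a vertex with probability ~ its degree
            new_edges = [(v, rng.choice(rng.choice(edges)))]
        else:
            # triangle formation: connect to both endpoints of a random edge
            a, b = rng.choice(edges)
            new_edges = [(v, a), (v, b)]
        for (x, y) in new_edges:
            edges.append((x, y))
            degree[x] = degree.get(x, 0) + 1
            degree[y] = degree.get(y, 0) + 1
    return edges, degree
```

The triangle step is what raises the average clustering of the model, since the new vertex and the two endpoints of the chosen edge always close a triangle.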
In the classical maximum likelihood approach, to estimate $\tau$ one needs to
approximate the number of SI edges between households, since the graph itself
is supposed to be unobservable. Our simulation study reveals that the
estimation is poorer at the beginning of the epidemic, for larger preferential
attachment parameter of the graph, and for larger $\tau$. We present two
heuristic improvement algorithms and show that our method is robust to
changes in the average clustering of the graph model.
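The role of the SI-edge count in the likelihood can be illustrated with a toy estimator. In a network SIR model where infections occur at total rate $\tau$ times the current number of SI edges, the maximum-likelihood estimate of $\tau$ is the number of observed infections divided by the time-integrated SI-edge count. The sketch below assumes that integral is approximated by a Riemann sum over discrete observation steps; the function and argument names are illustrative, not from the paper.

```python
def mle_tau(infection_times, si_edge_counts, dt):
    """Illustrative MLE for the infection rate tau in a network SIR model.

    If infections occur at total rate tau * (#SI edges), then
        tau_hat = (# observed infections) / integral over time of #SI edges.
    si_edge_counts[k] is the (approximated) SI-edge count during the k-th
    observation step of width dt; in the paper's setting this count must
    itself be estimated, since the graph is unobservable.
    """
    total_exposure = sum(si_edge_counts) * dt   # Riemann-sum approximation
    return len(infection_times) / total_exposure
```

This makes the sources of estimation error visible: early in the epidemic both the infection count and the exposure integral are small, and any bias in the approximated SI-edge counts propagates directly into the estimate of $\tau$.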
We also extend a graph neural network (GNN) approach for estimating contagion
dynamics for our two-layered graphs. We find that dense networks offer better
training datasets. Moreover, GNN performance is better measured with the $l_2$
loss function than with cross-entropy.
DOI: 10.48550/arxiv.2303.02195