Model Checking Markov Chains as Distribution Transformers
Main Authors: | , , , , , , |
Format: | Article |
Language: | English |
Keywords: | |
Online Access: | Order full text |
Abstract: | The conventional perspective on Markov chains considers decision problems concerning the probabilities of temporal properties being satisfied by traces of visited states. However, consider the following query made of a stochastic system modelling the weather: given the conditions today, will there be a day with less than 50% chance of rain? The conventional perspective is ill-equipped to decide such problems regarding the evolution of the initial distribution. The alternate perspective we consider views Markov chains as distribution transformers: the focus is on the sequence of distributions on states at each step, where the evolution is driven by the underlying stochastic transition matrix. More precisely, given an initial distribution vector $\mu$ and a stochastic transition matrix $M$, we ask whether the ensuing sequence of distributions $(\mu, M\mu, M^2\mu, \dots)$ satisfies a given temporal property. This is a special case of the model-checking problem for linear dynamical systems, which is not known to be decidable in full generality. The goal of this article is to delineate the classes of instances for which this problem can be solved, under the assumption that the dynamics is governed by stochastic matrices. |
DOI: | 10.48550/arxiv.2406.15087 |
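
The weather query in the abstract can be read as a reachability question over the distribution sequence $(\mu, M\mu, M^2\mu, \dots)$. The Python sketch below is not taken from the paper; the two-state matrix, the state names, the threshold, and the bounded horizon are illustrative assumptions. It iterates the distributions of a toy rain/sun chain and searches, up to a fixed horizon, for a day with less than 50% chance of rain.

```python
# Minimal sketch, not the paper's algorithm: the matrix, state names, and the
# bound `horizon` are illustrative assumptions.
import numpy as np

# Column-stochastic matrix M (each column sums to 1), so a distribution evolves
# as mu -> M @ mu, matching the abstract's sequence (mu, M mu, M^2 mu, ...).
# State 0 = rain, state 1 = sun.
M = np.array([
    [0.8, 0.3],  # P(rain tomorrow | rain today), P(rain tomorrow | sun today)
    [0.2, 0.7],  # P(sun tomorrow | rain today),  P(sun tomorrow | sun today)
])

mu0 = np.array([0.9, 0.1])  # today: 90% chance of rain


def first_day_rain_below(M, mu, threshold=0.5, horizon=50):
    """Return the first step t <= horizon with P(rain) < threshold, or None.

    A bounded search like this can only certify a "yes" answer; ruling out
    all future steps (a "no" answer over an unbounded horizon) is the hard
    part of the model-checking problem.
    """
    for t in range(horizon + 1):
        if mu[0] < threshold:
            return t
        mu = M @ mu  # advance the distribution by one step
    return None


print(first_day_rain_below(M, mu0))
# For this chain the rain probability converges to 0.6 from above,
# so the bounded search finds no such day and prints None.
```

A bounded search can only confirm a positive answer; deciding the negative case over all future steps is exactly where the difficulty lies, and the article's aim is to delineate the classes of instances where such questions can be decided when the dynamics is governed by stochastic matrices.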