Algorithms for Noisy Broadcast under Erasures
Format: Article
Language: English
Online access: Order full text
Abstract: The noisy broadcast model was first studied in [Gallager, TranInf'88] where
an $n$-character input is distributed among $n$ processors, so that each
processor receives one input character. Computation proceeds in rounds, where in each
round each processor broadcasts a single character, and each reception is
corrupted independently at random with some probability $p$. [Gallager,
TranInf'88] gave an algorithm for all processors to learn the input in
$O(\log\log n)$ rounds with high probability. Later, a matching lower bound of
$\Omega(\log\log n)$ was given in [Goyal, Kindler, Saks; SICOMP'08].
We study a relaxed version of this model where each reception is erased and
replaced with a `?' independently with probability $p$. In this relaxed model,
we break past the lower bound of [Goyal, Kindler, Saks; SICOMP'08] and obtain
an $O(\log^* n)$-round algorithm for all processors to learn the input with
high probability. We also show an $O(1)$-round algorithm for the same problem
when the alphabet size is $\Omega(\mathrm{poly}(n))$.
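The communication model described in the abstract can be sketched in a few lines: each of the $n$ processors broadcasts one symbol per round, and every individual reception is independently erased (replaced with `?`) with probability $p$. The function and parameter names below are illustrative, not from the paper.

```python
import random

def broadcast_round(messages, p, rng):
    """Simulate one round of the noisy broadcast model under erasures.

    messages: list of n symbols, one per processor (the round's broadcasts).
    p: erasure probability for each individual reception.
    rng: a random.Random instance, for reproducibility.

    Returns an n x n grid where entry [i][j] is what processor i hears
    from processor j: either the true symbol, or '?' if that reception
    was erased (each erasure drawn independently).
    """
    n = len(messages)
    return [['?' if rng.random() < p else messages[j] for j in range(n)]
            for i in range(n)]

rng = random.Random(0)
# Four processors broadcast their input bits; ~30% of receptions are erased.
heard = broadcast_round(list("0110"), p=0.3, rng=rng)
```

Note that, unlike in the random-corruption model of [Gallager, TranInf'88], an erased reception is detectable by the receiver, which is what the paper's $O(\log^* n)$-round algorithm exploits.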
DOI: 10.48550/arxiv.1808.00838