Searching, Sorting, and Cake Cutting in Rounds

Bibliographic Details
Main Authors: Brânzei, Simina; Paparas, Dimitris; Recker, Nicholas
Format: Article
Language: English
Description
Abstract: We study searching and sorting in rounds, motivated by a fair division question: given a cake cutting problem with $n$ players, compute a fair allocation in at most $k$ rounds of interaction with the players. Rounds interpolate between the simultaneous and the fully adaptive settings, and also capture parallel complexity. We find that proportional cake cutting in rounds is equivalent to sorting with rank queries in rounds. We design a protocol for proportional cake cutting in rounds; lower bounds for sorting in rounds with rank queries were given by Alon and Azar. Inspired by the rank query model, we then consider two basic search problems: ordered and unordered search. In unordered search, we are given an array $\vec{x}=(x_1, \ldots, x_n)$ and an element $z$ promised to be in $\vec{x}$. We have access to an oracle that receives queries of the form "Is $z$ at location $i$?" and answers "Yes" or "No". The goal is to find the location of $z$ with success probability at least $p$ in at most $k$ rounds of interaction with the oracle. We show that the expected query complexity of randomized algorithms on a worst-case input is $np\bigl(\frac{k+1}{2k}\bigr) \pm O(1)$, while that of deterministic algorithms on a worst-case input distribution is $np \bigl(1 - \frac{k-1}{2k}p \bigr) \pm O(1)$. These bounds apply even to fully adaptive unordered search, where the ratio between the two complexities converges to $2-p$ as the size of the array grows. In ordered search, we are given a sorted array $\vec{x}=(x_1, \ldots, x_n)$ and an element $z$ promised to be in $\vec{x}$, together with an oracle that receives comparison queries. Here we find that the expected query complexity of randomized algorithms on a worst-case input and of deterministic algorithms on a worst-case input distribution is essentially the same: $p k \cdot n^{\frac{1}{k}} \pm O(1+pk)$.
DOI: 10.48550/arXiv.2012.00738
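
To make the $k$-round query model concrete, below is a minimal Python sketch of the standard deterministic strategy for ordered search (not the paper's protocol): each round issues one simultaneous batch of roughly $n^{1/k}$ comparison queries at evenly spaced positions of the current candidate interval, shrinking it by about that factor, which matches the $p=1$ case of the $pk \cdot n^{1/k}$ bound above. The comparison-query form "is $z \le x_i$?", the function name, and the test values are illustrative assumptions.

```python
import math

def ordered_search_in_rounds(x, z, k):
    """Locate z in the sorted list x using at most k rounds (k >= 1) of
    batched comparison queries of the form "is z <= x[i]?".

    Each round probes roughly n**(1/k) evenly spaced positions of the
    current candidate interval and shrinks it by about that factor, so
    the total number of queries is on the order of k * n**(1/k). The
    oracle is simulated here by comparing z against x directly.
    """
    n = len(x)
    lo, hi = 0, n - 1                          # z is promised to lie in x[lo..hi]
    block = max(2, math.ceil(n ** (1.0 / k)))  # probes per round, ~n**(1/k)
    for _ in range(k):
        if lo >= hi:
            break
        step = max(1, (hi - lo + 1) // block)
        probes = list(range(lo, hi + 1, step))
        # One round: the whole batch of queries is issued simultaneously.
        answers = [z <= x[i] for i in probes]
        # Since x is sorted, the answers are monotone (No ... No Yes ... Yes);
        # the new interval is the gap between the last "No" and the first "Yes".
        for i, is_leq in zip(probes, answers):
            if is_leq:
                hi = i
                break
            lo = i + 1
    # After k rounds the interval has length about 1; finish with a short scan.
    for i in range(lo, hi + 1):
        if x[i] == z:
            return i
    return lo


# Example: 3 rounds over 1000 elements uses about 3 * 10 = 30 queries.
xs = list(range(0, 3000, 3))
assert ordered_search_in_rounds(xs, xs[417], k=3) == 417
```

The round structure shows up as the single batched list of answers per iteration: a fully adaptive binary search would instead spend about $\log_2 n$ sequential queries, so restricting to $k$ rounds trades total query count for parallelism.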