Average Binary Long-Lived Consensus: Quantifying the Stabilizing Role Played by Memory

  • Authors:
  • Florent Becker, Sergio Rajsbaum, Ivan Rapaport, Éric Rémila

  • Affiliations:
  • Université de Lyon, LIP UMR 5668 CNRS - ÉNS Lyon - UCB Lyon 1, France; Instituto de Matemáticas, Universidad Nacional Autónoma de México; DIM and CMM, Universidad de Chile; Université de Lyon, LIP UMR 5668 CNRS - ÉNS Lyon - UCB Lyon 1, France

  • Venue:
  • SIROCCO '08: Proceedings of the 15th International Colloquium on Structural Information and Communication Complexity
  • Year:
  • 2008


Abstract

Consider a system composed of $n$ sensors operating in synchronous rounds. In each round an input vector of sensor readings $x$ is produced, where the $i$-th entry of $x$ is a binary value produced by the $i$-th sensor. The sequence of input vectors is assumed to be smooth: exactly one entry of the vector changes from one round to the next. The system implements a fault-tolerant averaging consensus function $f$. This function returns, in each round, a representative output value $v$ of the sensor readings $x$. Assuming that at most $t$ entries of the vector can be erroneous, $f$ is required to return a value that appears at least $t+1$ times in $x$. The instability of the system is the number of output changes over a random sequence of input vectors.

Our first result is to design optimal-instability consensus systems with and without memory. Roughly, in the memoryless case, we show that an optimal system is $D_0$, which outputs 1 unless it is forced by the fault-tolerance requirement to output 0 (on vectors with $t$ or fewer 1's). For systems with memory, we show that an optimal system is $D_1$, which initially outputs the most common value in the input vector and then keeps this output unless forced by the fault-tolerance requirement to change (i.e., a single bit of memory suffices).

Our second result is to quantify the gain factor due to memory by computing $c_n(t)$, the number of decision changes performed by $D_0$ per decision change performed by $D_1$. If $t=\frac{n}{2}$ the system is always forced to decide the simple majority and, in that case, memory becomes useless. We show that the same type of phenomenon occurs when $\frac{n}{2}-t$ is constant. Nevertheless, as soon as $\frac{n}{2}-t \sim \sqrt{n}$, memory plays an important stabilizing role because the ratio $c_n(t)$ grows like $\Theta(\sqrt{n})$. We also show that this is an upper bound: $c_n(t)=O(\sqrt{n})$ for every $t$.

Our results are average-case versions of previous works in which the sequence of input vectors was assumed to be, in addition to smooth, geodesic: the $i$-th entry of the input vector was allowed to change at most once over the sequence. This eliminates some anomalies that occurred in the worst-case, geodesic instability setting.
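The abstract fully determines the behaviour of $D_0$ and $D_1$, so they are easy to simulate. Below is a minimal Python sketch (not the authors' code) of both systems together with an empirical estimate of the ratio $c_n(t)$ over random smooth input sequences. All function and class names are illustrative assumptions, as are two details the abstract leaves open: "forced to change" is read as "the current output appears at most $t$ times in $x$", and ties in the first round of $D_1$ are broken toward 1.

```python
# Sketch of the two consensus systems from the abstract (illustrative only).
from random import randrange

def d0(x, t):
    """Memoryless system D0: output 1 unless the fault-tolerance
    requirement forces 0, i.e. unless x contains t or fewer 1's."""
    return 1 if sum(x) > t else 0

class D1:
    """One-bit-memory system D1: start with the most common value, then
    keep the previous output unless it occurs at most t times in x."""
    def __init__(self):
        self.out = None  # the single bit of memory

    def step(self, x, t):
        ones, n = sum(x), len(x)
        if self.out is None:
            # First round: simple majority (tie broken toward 1 -- an assumption).
            self.out = 1 if 2 * ones >= n else 0
        elif (ones if self.out == 1 else n - ones) <= t:
            # Current output appears at most t times: forced to change.
            self.out = 1 - self.out
        return self.out

def smooth_walk(n, rounds):
    """Random smooth sequence: exactly one entry flips per round."""
    x = [randrange(2) for _ in range(n)]
    for _ in range(rounds):
        yield x
        x = x.copy()
        x[randrange(n)] ^= 1

def instability_ratio(n, t, rounds=100_000):
    """Empirical estimate of c_n(t): output changes of D0 per
    output change of D1 over one random smooth input sequence."""
    d1, prev0, prev1 = D1(), None, None
    changes0 = changes1 = 0
    for x in smooth_walk(n, rounds):
        v0, v1 = d0(x, t), d1.step(x, t)
        changes0 += (prev0 is not None and v0 != prev0)
        changes1 += (prev1 is not None and v1 != prev1)
        prev0, prev1 = v0, v1
    return changes0 / max(changes1, 1)

if __name__ == "__main__":
    # In the regime n/2 - t ~ sqrt(n), the paper shows c_n(t) = Theta(sqrt(n)).
    for n in (64, 256, 1024):
        t = n // 2 - int(n ** 0.5)
        print(n, t, round(instability_ratio(n, t), 2))
```

Under these assumptions, running the driver with $t = \frac{n}{2} - \lfloor\sqrt{n}\rfloor$ should show the estimated ratio growing roughly like $\sqrt{n}$, while choosing $t$ so that $\frac{n}{2}-t$ is a small constant should keep it bounded, in line with the two regimes described above.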