December 11, Sunday
12:00 – 14:00
In this talk I will discuss a model introduced by El Gamal in 1984 for sharing information over a noisy broadcast network: each of N players is given one input bit, and the goal is for all players to learn (with high probability) all the input bits of the other players, using the smallest possible number of broadcasts over a joint communication channel. In each broadcast a player transmits one bit to all other players; however, noise independently flips the bit heard by each recipient with some fixed probability.
Without noise, N broadcasts would trivially suffice for the players to learn all bits. However, the best known protocol that deals with noise, discovered by Gallager in 1988, uses $O(N \log \log N)$ broadcasts. Attempts made since to bring the blowup down to a constant have failed. Our main result is that Gallager's protocol is in fact optimal up to a constant factor.
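To see why noise is costly, consider the naive protocol in which each player broadcasts its bit exactly once. Every one of the N(N−1) received copies is then independently corrupted with the flip probability, so the chance that all players decode everything correctly decays exponentially in N² — motivating repetition and the more clever schemes discussed above. The following is a minimal illustrative simulation of this naive protocol (the function names and parameters are my own, not from the talk):

```python
import random

def naive_protocol_succeeds(n, eps, rng):
    """Simulate the naive protocol: each of n players broadcasts its
    input bit once; noise flips the bit heard by each recipient
    independently with probability eps. Returns True iff every player
    correctly hears every other player's bit. (Illustrative sketch of
    the model only.)"""
    bits = [rng.randint(0, 1) for _ in range(n)]
    for sender in range(n):
        for receiver in range(n):
            if receiver == sender:
                continue
            # The channel independently flips each received copy.
            heard = bits[sender] ^ (rng.random() < eps)
            if heard != bits[sender]:
                return False
    return True

def success_rate(n, eps, trials=500, seed=0):
    """Empirical success probability of the naive protocol."""
    rng = random.Random(seed)
    return sum(naive_protocol_succeeds(n, eps, rng) for _ in range(trials)) / trials
```

Analytically the success probability is $(1-\epsilon)^{N(N-1)}$, which for, say, N = 20 and flip probability 0.1 is already below $10^{-17}$; repeating each broadcast $O(\log N)$ times fixes this at the cost of $O(N \log N)$ broadcasts, and Gallager's scheme does better still.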
This is joint work with Navin Goyal and Michael Saks.
Short Bio:
I am currently a post-doctoral researcher with the theory group at Microsoft Research. My research focuses on theoretical computer science, and in particular on the introduction and application of mathematical methods to computer science. I completed my PhD in 2002 under the supervision of Prof. Shmuel Safra. Before coming to Microsoft, I was a post-doc at the Hebrew University, and then in a joint two-year program of the Institute for Advanced Study in Princeton and DIMACS (located at Rutgers University).