Mark Braverman (Toronto)
by Egor Pifagorov - Wednesday, 29 June 2011, 10:25 PM
Joint meeting of the Society and the Chebyshev Laboratory at St. Petersburg State University
July 5, 2011
PDMI, Fontanka 27, room 311, 6 p.m.

(University of Toronto)


Notions of entropy and information, pioneered by Shannon, have been
very powerful tools in coding theory. Coding theory aims to solve the
problem of one-way communication: sending a message from Alice to Bob
using as little communication as possible, sometimes over a noisy channel.

We will discuss several extensions of information-theoretic notions
to the two-way communication setting. We use them to prove a direct
sum theorem for randomized communication complexity, showing that
implementing k copies of a functionality requires substantially more
communication than just one copy.
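
As a rough illustration (the notation and the precise statement are assumptions, not taken from the announcement), a direct sum result of this kind lower-bounds the randomized communication complexity R of the k-fold problem f^k via the information cost I(f) of a single copy:

```latex
% Hedged sketch: I(f) denotes the information cost of f and R the
% randomized communication complexity. Since information cost adds
% across independent copies and lower-bounds communication,
R\big(f^{k}\big) \;\ge\; k \cdot I(f).
```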

More generally, we will show that information cost I(f) can be
defined as a natural fundamental property of a functionality f.
We will describe several new tight connections between I(f), direct
sum theorems, interactive compression schemes, and amortized communication
complexity.
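
One way to make the amortized connection precise (stated here as a hedged sketch of a known formulation, not as the talk's exact theorem): information cost equals the amortized randomized communication complexity,

```latex
% Sketch: R_\epsilon denotes \epsilon-error randomized communication
% complexity, and f^n denotes n independent copies of f.
I(f) \;=\; \lim_{n \to \infty} \frac{R_{\epsilon}\big(f^{n}\big)}{n}.
```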

(The talk will be given in Russian.)