Theory Afternoon

February 1, 2008


Rm.121, Ho Sin Hang Engineering Bldg., CUHK

Dr. Shengyu Zhang, California Institute of Technology

Title :

Random walk on graphs and its algorithmic applications

Time:

3:00 p.m. – 3:50 p.m.

Abstract:

Random walks on graphs are a simple but powerful technique for designing randomized algorithms. In this talk, I'll give a gentle introduction to this topic through a couple of illustrative examples, such as connectivity, approximate counting, and walks on expanders. Depending on the time and the audience's preference, I may also talk about a quantum version of the random walk and how it has been exploited to achieve a number of breakthrough results in the past several years.
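To make the connectivity example concrete, here is a minimal sketch in Python (the graph, function names, and step budget are illustrative, not taken from the talk) of the classic randomized s-t connectivity test: walk randomly from s for roughly the cover time, on the order of 2|V||E| steps, and report "connected" if t is ever visited.

    import random

    def probably_connected(adj, s, t, steps=None):
        """Randomized s-t connectivity test via a simple random walk.

        adj: adjacency lists of an undirected graph, {vertex: [neighbors]}.
        Walk from s; if the walk ever reaches t, s and t are connected.
        When they are connected, the expected time to hit t is bounded by
        the cover time, roughly 2|V||E|, so a budget of that order answers
        correctly with good probability.
        """
        n = len(adj)
        m = sum(len(nbrs) for nbrs in adj.values()) // 2
        if steps is None:
            steps = 4 * n * m + 1  # comfortably above the ~2nm cover-time bound
        v = s
        for _ in range(steps):
            if v == t:
                return True
            if not adj[v]:          # isolated vertex: the walk is stuck
                return False
            v = random.choice(adj[v])
        return v == t

    # Tiny example: a path 0-1-2-3 and a separate edge 4-5.
    adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2], 4: [5], 5: [4]}
    print(probably_connected(adj, 0, 3))  # True with high probability
    print(probably_connected(adj, 0, 4))  # False: different components

The error is one-sided: a "True" answer is always correct, and the failure probability of a "False" answer can be driven down by enlarging the step budget or repeating the walk. This walk-until-you-hit idea underlies the classic randomized log-space algorithm for undirected s-t connectivity.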

Biography:

Shengyu Zhang received his B.S. in Mathematics from Fudan University in 1999, an M.S. in Computer Science from Tsinghua University in 2002, and a Ph.D. in Computer Science from Princeton University in 2006. He is currently a postdoctoral researcher at the California Institute of Technology. His main research interest is theoretical computer science, including randomized and quantum algorithm design and computational complexity theory.

Dr. Elad Verbin, Institute of Theoretical Computer Science, Tsinghua University

Title :

On Recent Developments in Learning with Agnostic Noise

Time:

4:00 p.m. – 4:50 p.m.

Abstract:

In this talk I will give an overview of some exciting recent developments in learning Boolean functions with noise. In ordinary, noiseless learning, the goal is to learn a function, given samples of the function at randomly selected points. In noisy learning, the problem is made more difficult by allowing a fraction of the examples to be wrong. If the wrong examples are chosen at random, the noise is called "random"; if the examples are corrupted by a malicious adversary, the noise is called "agnostic". The subject of computation with noisy input is ubiquitous in computer science, in topics such as coding theory and approximation algorithms, as well as practical areas such as computational biology and vision. Learning with noise has been considered extremely difficult (especially in the agnostic setting), but in recent years we have seen a large influx of positive results (algorithms) as well as some negative results (hardness results). I will describe some of these, and I will also outline connections to coding theory and computational complexity.

Here is an example of one of the problems I will discuss: learning parity functions with noise. In the problem of learning parity (without noise), one gets samples of the form (x, f(x)), where x is an n-bit string chosen uniformly at random and f is the XOR function on some unknown subset of the bits. The function f is not known to the algorithm. The algorithm can request as many samples as it wants, but cannot choose the x's; they are picked at random. The goal is to return f with high probability. Learning parity without noise is easy to do in time O(n^3), using O(n) samples, via Gaussian elimination. Now suppose that 1% noise is introduced, so that in each sample, f(x) is returned with probability 0.99 and 1-f(x) with probability 0.01. In this case the problem becomes extremely difficult. The fastest known algorithm, due to Blum, Kalai and Wasserman, runs in time $2^{O(n/\log n)}$. The problem is believed to be hard (though even defining what hardness means here is non-trivial). In discussing this problem, I will describe some possible consequences of parity with noise being easy or hard. Either way would be good: easiness would give us a holy hammer for use in learning theory, while hardness would imply simple and efficient cryptographic primitives, among other consequences.

Some parts of this talk are based on joint work with Adam Kalai and Yishay Mansour. I will not assume any prior knowledge of computational learning theory.
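As a concrete companion to the parity example above, here is a minimal sketch in Python (function names and parameters are illustrative, not from the talk) of the noiseless baseline: recovering a parity function from O(n) random samples by Gaussian elimination over GF(2). The sample generator takes a noise rate, so one can see exactly what the noisy variant changes.

    import random

    def parity(secret, x):
        """f(x) = XOR of the bits of x selected by the 0/1 vector `secret`."""
        return sum(s & b for s, b in zip(secret, x)) % 2

    def sample(secret, n, noise=0.0):
        """One example (x, f(x)); with probability `noise` the label is flipped."""
        x = [random.randint(0, 1) for _ in range(n)]
        y = parity(secret, x) ^ (1 if random.random() < noise else 0)
        return x, y

    def learn_parity_noiseless(oracle, n):
        """Recover the secret parity from noiseless samples.

        Each sample (x, y) is a linear equation <secret, x> = y over GF(2).
        We keep the equations in fully reduced (Gauss-Jordan) form, so once
        n independent ones are collected, each row pins down one secret bit.
        """
        basis = {}  # pivot column -> augmented row x + [y], fully reduced
        while len(basis) < n:
            x, y = oracle()
            row = x + [y]
            for col, b in basis.items():        # eliminate existing pivots
                if row[col]:
                    row = [a ^ c for a, c in zip(row, b)]
            if not any(row[:n]):                # linearly dependent: discard
                continue
            piv = row[:n].index(1)
            for col in basis:                   # keep earlier rows reduced too
                if basis[col][piv]:
                    basis[col] = [a ^ c for a, c in zip(basis[col], row)]
            basis[piv] = row
        return [basis[i][n] for i in range(n)]  # row i reads off secret[i]

    n = 16
    secret = [random.randint(0, 1) for _ in range(n)]
    learned = learn_parity_noiseless(lambda: sample(secret, n), n)
    print(learned == secret)  # True: O(n) samples, O(n^3) time

With noise=0.0 this recovers the secret exactly; setting noise=0.01 breaks the elimination-based learner almost surely, since a single flipped label corrupts the linear system. That gap is precisely what the $2^{O(n/\log n)}$ algorithm of Blum, Kalai and Wasserman confronts.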

Biography:

Elad Verbin is a postdoctoral researcher at the Institute for Theoretical Computer Science (ITCS) at Tsinghua University. His interests include combinatorial and probabilistic algorithms.