Sae-Young Chung, KAIST, Korea

Title: Coding Theorems for Noisy Networks

Abstract:
Much progress has been made in network information theory in recent years, sparking a flurry of interest in designing smarter and more efficient communication systems. Noise and interference are among the major obstacles to efficient and reliable information flow in a network, and overcoming them often requires complicated processing at intermediate relay nodes. There are three fundamental strategies for relaying when the channel is noisy. If a relay can decode the message from the source, it can clean up the noise, and the message can be re-encoded and forwarded (decode-and-forward, DF). Another strategy is to describe what is received by the relay without knowing the codebook structure, i.e., quantize the received signal vector and forward the compression index (compress-and-forward, CF). A simpler alternative is to perform symbol-wise relaying, e.g., amplify-and-forward (AF) relaying. In general, none of these is universally optimal for general noisy networks.
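The trade-off between two of these strategies can be illustrated with a toy simulation. The following sketch (a minimal illustration, not part of the talk; the BPSK signaling, power level, and single-relay line topology are assumptions chosen for simplicity) compares AF and DF on a source-relay-destination Gaussian channel: under AF the relay amplifies its noisy observation, so relay noise propagates, while under DF the relay re-encodes a clean symbol and only its detection errors propagate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: source -> relay -> destination line network,
# unit-variance Gaussian noise on each hop, BPSK signaling.
P = 10.0        # transmit power at source and relay (assumed)
n = 100_000     # number of symbols

x = rng.choice([-1.0, 1.0], size=n) * np.sqrt(P)   # source symbols
y_relay = x + rng.standard_normal(n)               # relay observation

# Amplify-and-forward: scale the noisy observation to meet the relay's
# power constraint and retransmit; the relay noise is amplified too.
g = np.sqrt(P / (P + 1.0))                         # AF gain
y_af = g * y_relay + rng.standard_normal(n)

# Decode-and-forward: detect each symbol at the relay, re-encode it
# cleanly, and forward; only detection errors propagate.
x_df = np.sign(y_relay) * np.sqrt(P)
y_df = x_df + rng.standard_normal(n)

ber_af = np.mean(np.sign(y_af) != np.sign(x))
ber_df = np.mean(np.sign(y_df) != np.sign(x))
print(f"AF bit error rate: {ber_af:.4f}")
print(f"DF bit error rate: {ber_df:.4f}")
```

At this SNR the relay decodes reliably, so DF yields a noticeably lower error rate than AF; at low SNR, where the relay's own decisions become unreliable, the comparison can reverse.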

In this talk, we show some new coding strategies for enabling efficient information flow in a network. For noiseless, interference-free networks, network coding by Ahlswede, Cai, Li, and Yeung is known to achieve the multicast capacity of the network. We show that it is possible to simultaneously generalize network coding and CF relaying, which we call noisy network coding (NNC). Our results give the best approximate capacity characterization to date for a general Gaussian relay network: NNC achieves the capacity of the network to within 0.63N bits regardless of the power and the channel gains of the network, where N is the number of nodes in the network. To further improve the performance of NNC, we design a new structured coding scheme based on nested lattices. Our coding scheme achieves the sum-rate capacity of the Gaussian two-way relay channel to within 0.69 bits, which is the best gap-to-capacity result to date. Therefore, these two strategies are approximately optimal at high signal-to-noise ratios (SNRs). Furthermore, we show that it is possible to achieve the information-theoretic capacity exactly for some new classes of networks with an arbitrary number of nodes. By combining DF and CF relaying, we can achieve the capacity of a class of single-source discrete memoryless relay networks with an arbitrary number of nodes having a tree topology. If the relay's output depends not only on past received symbols but also on the current received symbol, we show that the capacity can be achieved by simple AF relaying for a class of noisy networks.
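The two-way relay gain behind the nested-lattice result has a well-known noiseless analogue, sketched below (an illustration only, with bit vectors standing in for codewords; the nested-lattice scheme plays the same role at the signal level, with the relay decoding a modulo-lattice sum of the two codewords rather than each message separately). Two nodes exchange messages through a relay: plain routing needs two relay broadcasts, while network coding needs one, because each node can subtract its own message from the XOR.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noiseless two-way relay exchange: nodes A and B each hold a bit
# vector and want to swap them through relay R.
a = rng.integers(0, 2, size=8)   # A's message bits
b = rng.integers(0, 2, size=8)   # B's message bits

# Instead of forwarding a and b separately, the relay broadcasts the
# XOR of the two uplink messages in a single transmission.
r = a ^ b

# Each node recovers the other's message by XOR-ing out its own.
b_hat = r ^ a                    # recovered at A
a_hat = r ^ b                    # recovered at B

print(np.array_equal(a_hat, a) and np.array_equal(b_hat, b))
```

The lattice-based scheme extends this idea to the noisy Gaussian setting: the relay never needs to decode the individual messages, only a function of them that each end node can invert using its own transmission.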

Joint work with Ihn-Jung Baik, Abbas El Gamal, Young-Han Kim, Si-Hyeon Lee, Sung Hoon Lim, and Wooseok Nam.