
IEEE Information Theory Society

Student Resources

 

Research Discussion and Lunch at ISIT 2007

 
     


 

The research discussion will be held at ISIT 2007. Here is the agenda for the discussions:
 

Table #1:            Multiantenna and Multiuser Channels
                           Recent Results on the Capacity Regions and Degrees of Freedom

                           Leader: Hanan Weingarten (The Technion)

 

This will be an open discussion of recent advances in multi-user communications, with emphasis on new results on degrees of freedom. Some relevant references:

  1. S. A. Jafar and S. Shamai "Degrees of Freedom Region for the MIMO X Channel"

  2. A. Lapidoth, S. Shamai and M. A. Wigger "On the capacity of fading MIMO broadcast channels with imperfect transmitter side information"

Table #2:            Rate Distortion Theory for Multi-terminal Networks
                           Leader: Haim Permuter (Stanford)

 

The theme of our table is reconstructing sources of information in a network. We will discuss problems that students suggest, such as:

Slepian-Wolf problem:
D. Slepian and J. K. Wolf, "Noiseless coding of correlated information sources," IEEE Trans. Inform. Theory, vol. IT-19, pp. 471-480, July 1973.
T. M. Cover, "A proof of the data compression theorem of Slepian and Wolf for ergodic sources," IEEE Trans. Inform. Theory, vol. IT-21, pp. 226-228, Mar. 1975.

Wyner-Ziv problem
A. D. Wyner and J. Ziv, "The rate-distortion function for source coding with side information at the decoder," IEEE Trans. Inform. Theory, vol. IT-22, pp. 1-10, Jan. 1976.

Two-Terminal Source Coding
T. Berger, "Multiterminal source coding," in The Information Theory Approach to Communications, ser. CISM Courses and Lectures, G. Longo, Ed. Springer-Verlag, 1978, vol. 229, pp. 171-231.
S.-Y. Tung, "Multiterminal source coding," Ph.D. dissertation, School of Electrical Engineering, Cornell University, Ithaca, NY, May 1978.
A. B. Wagner, S. Tavildar, and P. Viswanath, "The Rate Region of the Quadratic Gaussian Two-Terminal Source-Coding Problem." Submitted to IEEE Trans. on Inf. Theory. arXiv:cs.IT/0510095

Multiple Description
A. A. El Gamal and T. M. Cover, "Achievable rates for multiple descriptions," IEEE Trans. Inform. Theory, vol. IT-28, pp. 851-857, Nov. 1982.

Successive Refinement
W. H. R. Equitz and T. M. Cover, "Successive refinement of information," IEEE Trans. Inform. Theory, vol. 37, pp. 269-275, Mar. 1991.

CEO problem
T. Berger, Z. Zhang, and H. Viswanathan, "The CEO problem," IEEE Trans. Inf. Theory, vol. 42, no. 3, pp. 887-902, May 1996.
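As a warm-up for the table, the Slepian-Wolf region itself is easy to evaluate for a toy source pair. The sketch below (an illustrative example, with an arbitrarily chosen crossover probability) computes the region's corner points for a doubly symmetric binary source:

```python
from math import log2

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Doubly symmetric binary source: X ~ Bernoulli(1/2), Y = X XOR N,
# with N ~ Bernoulli(p) independent of X. p is an arbitrary choice here.
p = 0.11

H_X = 1.0            # H(X) = 1 bit
H_Y = 1.0            # H(Y) = 1 bit by symmetry
H_X_given_Y = h2(p)  # H(X|Y) = h2(p)
H_Y_given_X = h2(p)  # H(Y|X) = h2(p)
H_XY = H_X + h2(p)   # H(X,Y) = H(X) + H(Y|X)

# Slepian-Wolf: (R1, R2) is achievable iff
#   R1 >= H(X|Y),  R2 >= H(Y|X),  R1 + R2 >= H(X,Y).
print(f"H(X|Y)         = {H_X_given_Y:.3f} bits")
print(f"sum-rate bound = {H_XY:.3f} bits (vs. {H_X + H_Y:.1f} if coded separately)")
print(f"corner point   = ({H_X_given_Y:.3f}, {H_Y:.3f})")
```

Time-sharing between the two corner points traces out the whole sum-rate boundary, so distributed encoders lose nothing in rate against a joint encoder.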
 

Table #3:            Cognitive Radios and Universal IT
                           Leader: Krish Eswaran (Berkeley)

 

The focus of this discussion will be universality in information theory. This area considers questions of source coding, prediction, and channel coding when the distribution of the source or channel is unknown in advance. While much of the work in this area has been on source coding and prediction, there have also been recent advances in universal channel coding. The aim of the discussion is to review work in the area and brainstorm interesting research directions and applications for future exploration. Here are some accessible references that should be a good starting point for discussion.

 

  1. Willems, Shtarkov, Tjalkens. "The context-tree weighting method: basic properties," IEEE Transactions on Information Theory, May 1995.

  2. Willems, Shtarkov, Tjalkens. "Reflections on 'The context-tree weighting method: basic properties'," IEEE Information Theory Society Newsletter, Mar. 1997.

  3. Merhav, Feder. "Universal prediction," IEEE Transactions on Information Theory, Oct 1998.

  4. Shayevitz and Feder. "Communicating using feedback over a binary channel with arbitrary noise sequence," Proceedings of the International Symposium on Information Theory, Melbourne, Australia, Sept. 2005.

  5. Shayevitz and Feder. "Achieving the Empirical Capacity Using Feedback - Part I: Memoryless Additive Models," submitted to IEEE Transactions on Information Theory.

Other references:

  1. Lempel and Ziv. "A universal algorithm for sequential data compression," IEEE Transactions on Information Theory, May 1977.

  2. Effros, Visweswariah, Kulkarni, Verdu. "Universal lossless source coding with the Burrows Wheeler transform," IEEE Trans. on Info. Theory, May 2002.

  3. Tchamkerten, Telatar. "Variable length coding over an unknown channel," IEEE Transactions on Information Theory, May 2006.
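For readers new to the area, the Lempel-Ziv reference above makes "universality" concrete: the encoder needs no prior model of the source. Below is a minimal LZ78-style incremental parser (an illustrative sketch, not code from any of the papers):

```python
def lz78_parse(s):
    """LZ78 incremental parsing: split s into phrases, each equal to a
    previously seen phrase plus one new symbol. Emitting (index, symbol)
    pairs costs roughly c * (log2(c) + 1) bits for c phrases, which meets
    the entropy rate asymptotically for any stationary ergodic source --
    without knowing the source statistics in advance."""
    dictionary = {"": 0}   # phrase -> index
    phrases = []
    w = ""                 # phrase currently being matched
    for ch in s:
        if w + ch in dictionary:
            w += ch        # keep extending the current match
        else:
            phrases.append((dictionary[w], ch))
            dictionary[w + ch] = len(dictionary)
            w = ""
    if w:                  # flush a leftover match at end of input
        phrases.append((dictionary[w[:-1]], w[-1]))
    return phrases

# Example: the parser learns repeated patterns as it goes.
print(lz78_parse("1011010100010"))
# -> [(0, '1'), (0, '0'), (1, '1'), (2, '1'), (4, '0'), (2, '0'), (1, '0')]
```

The dictionary grows adaptively with the data itself, which is exactly the sense in which the scheme is "universal" over unknown sources.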

Table #4:            Neuroscience and IT

                           Leader: Bobak Nazer (Berkeley)

 

The brain is an information processing system whose components, neurons, are subject to noise processes and energy constraints. At first glance, this bears a striking similarity to the types of problems considered in communications and information theory. As a result, many researchers have been tempted to apply the results of information theory directly to neurons. For instance, many works have focused on computing the mutual information across a neuron and arguing that this implies something about the representational power of the neural code. However, neurons are not standard communications blocks, and they most certainly do not implement a block code, so these naive arguments fall far short. Yet an information-theoretic approach may still be fruitful. Intuitively, one can assume the brain is optimal in terms of some as-yet-unknown energy-delay-computation tradeoff. Given this assumption, one can use ideas such as capacity-per-unit-cost and measure matching to connect data collected from neural experiments to certain aspects of the "neural code." At this roundtable, we will consider these issues and discuss what place information theory can and/or should have in theoretical neuroscience.
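To make the "mutual information across a neuron" and capacity-per-unit-cost ideas concrete, the sketch below models a neuron as a noisy binary channel and searches for the input distribution that maximizes bits per unit energy. All channel and cost numbers are invented purely for illustration:

```python
from math import log2

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def mutual_info(q, p1, p0):
    """I(X;Y) when X ~ Bernoulli(q) drives a 'neuron' that spikes with
    probability p1 given an input pulse and p0 given silence."""
    py = q * p1 + (1 - q) * p0          # overall spike probability
    return h2(py) - (q * h2(p1) + (1 - q) * h2(p0))

# Toy numbers (invented for illustration): fairly reliable spiking, rare
# spontaneous activity, and energy spent only on input pulses.
P_SPIKE_GIVEN_PULSE, P_SPIKE_GIVEN_SILENCE = 0.9, 0.05
COST_PER_PULSE = 1.0                    # silence is free

# Capacity-per-unit-cost: maximize bits per unit energy over the input
# distribution. With a free "silence" symbol, sparse signaling wins.
best_ratio, best_q = max(
    (mutual_info(q, P_SPIKE_GIVEN_PULSE, P_SPIKE_GIVEN_SILENCE)
     / (COST_PER_PULSE * q), q)
    for q in (i / 1000 for i in range(1, 500))
)
print(f"best bits-per-unit-energy {best_ratio:.2f} at pulse rate q = {best_q:.3f}")
```

The optimum lands at a very low firing rate, matching the qualitative prediction (sparse, metabolically cheap codes) explored in the papers below.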

A few relevant papers:

  1. L. R. Varshney, P. J. Sjostrom, and D. B. Chklovskii, "Optimal Information Storage in Noisy Synapses Under Resource Constraints," Neuron, vol. 52, pp. 409-423, Nov. 2006.

  2. V. Balasubramanian, D. Kimber, and M. J. Berry, "Metabolically Efficient Information Processing," Neural Computation, vol. 13, pp. 799-815, 2001.

  3. S. B. Laughlin, R. R. de Ruyter van Steveninck, and J. C. Anderson, "The Metabolic Cost of Neural Information," Nature Neuroscience, vol. 1, no. 1, pp. 36-41, May 1998.

Table #5:            Multiaccess Channels with Feedback

                           Leader: Michele Wigger (ETHZ)

 

Following are two papers that will be discussed:

  1. L. H. Ozarow, "The capacity of the white Gaussian multiple-access channel with feedback," IEEE Trans. Inform. Theory, 30(4):623-629, July 1984.

  2. T. M. Cover and C. S. K. Leung, "An achievable rate region for the multiple-access channel with feedback," IEEE Trans. Inform. Theory, 27(3):292-298, May 1981.

Table #6:            IT Security
                          Leader: Prasanth Ananthapadmanabhan (UMD/UMCP)

 

The theme will be wireless information-theoretic security. Following are a couple of recent papers (preprints) on this topic.

  1. M. Bloch, J. Barros, M. R. D. Rodrigues and S. W. McLaughlin, "Wireless Information-Theoretic Security - Part I: Theoretical Aspects," Submitted to the IEEE Transactions on Information Theory, Special Issue on Information-Theoretic Security, November 2006.

  2. Y. Liang, H. V. Poor and S. Shamai (Shitz), "Secure communication over fading channels," Submitted to IEEE Transactions on Information Theory, Special Issue on information theoretic security, November 2006.

Table #7:            Transmission of Correlated Data in Wireless Networks
                           Leader: Stephan Tinguely (ETHZ)

 

Note: in view of the two papers below, the topic might more accurately be titled "Transmission of Correlated Data in Networks," since neither result is restricted to wireless networks.

  1. T. M. Cover, A. El Gamal, M. Salehi, "Multiple Access Channels with Arbitrarily Correlated Sources", IEEE Transactions on Information Theory, Vol. IT-26(6), November 1980.

  2. T. S. Han, M. H. M. Costa, "Broadcast Channels with Arbitrarily Correlated Sources", IEEE Transactions on Information Theory, Vol. IT-33(5), September 1987.

Table #8:            Joint Source-Channel Coding
                          Leader: Deniz Gunduz (Brooklyn Poly)

 

Details will be posted soon.

 

Table #9:            Information Theoretical Aspects of Decentralized Detection
 
                          Leader: Amichai Sanderovich (The Technion)

The discussion can cover one or more of the following topics:
* Wyner model and joint cell-site processing with fading channels.
* Sub-optimal approaches to improve decentralized detection, such as binary signaling, etc.
* Efficiently combining local and centralized decoding with separated message sets.

 

Sample papers are:

  1. A. D. Wyner, "Shannon Theoretic Approach to a Gaussian Cellular Multiple Access Channel," IEEE Transactions on Information Theory, November 1994, pp. 1713-1727.

  2. R. Dabora and S. Servetto, "Broadcast Channels With Cooperating Decoders," IEEE Transactions on Information Theory, December 2006, pp. 5438-5454.