Info. Theory Society
The research roundtable discussions will be held at ISIT
2007. Here is the agenda for the discussions:
Multiantenna and Multiuser Channels
Recent Results on the Capacity Regions and Degrees of Freedom
Leader: Hanan Weingarten (The Technion)
This will be an open discussion on recent
advances in multiuser communications, with an emphasis on new results on
degrees of freedom. Some relevant results:
S. A. Jafar and S. Shamai "Degrees
of Freedom Region for the MIMO X Channel"
A. Lapidoth, S. Shamai and M. A. Wigger "On
the capacity of fading MIMO broadcast channels with imperfect
transmitter side information"
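As a common reference point for the discussion, the degrees of freedom (also called the multiplexing gain) of a channel is usually defined as the capacity prelog at high SNR:

```latex
d \;=\; \lim_{\mathrm{SNR} \to \infty} \frac{C(\mathrm{SNR})}{\log \mathrm{SNR}}
```

The papers above characterize this quantity (or the region of simultaneously achievable prelogs) for multiuser settings such as the MIMO X channel.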
Rate-Distortion Theory for Multiterminal Sources
Leader: Haim Permuter (Stanford)
The theme of our table is reconstructing
sources of information in a network. We will discuss problems that
students suggest, such as:
D. Slepian and J. K. Wolf, "Noiseless coding of correlated
information sources," IEEE Trans. Inform. Theory, vol. IT-19, pp.
471-480, July 1973.
T. M. Cover, "A proof of data compression theorem of Slepian and Wolf
for ergodic sources," IEEE Trans. Inform. Theory, vol. IT-21, pp.
226-228, Mar. 1975.
A. D. Wyner and J. Ziv, "The rate-distortion function for source
coding with side information at the decoder," IEEE Trans. Inform.
Theory, vol. IT-22, pp. 1-10, Jan. 1976.
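For orientation, the Slepian-Wolf result above states that two correlated sources can be encoded separately and still be reconstructed losslessly by a joint decoder, provided the rates satisfy:

```latex
R_1 \ge H(X_1 \mid X_2), \qquad
R_2 \ge H(X_2 \mid X_1), \qquad
R_1 + R_2 \ge H(X_1, X_2)
```

The Wyner-Ziv paper extends this picture to lossy reconstruction with side information available only at the decoder.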
Two-Terminal Source Coding
T. Berger, "Multiterminal source coding," in The Information Theory
Approach to Communications, ser. CISM Courses and Lectures, G. Longo, Ed.
Springer-Verlag, 1978, vol. 229, pp. 171-231.
S.-Y. Tung, "Multiterminal source coding," Ph.D. dissertation, School of
Electrical Engineering, Cornell University, Ithaca, NY, May 1978.
A. B. Wagner, S. Tavildar, and P. Viswanath, "The Rate Region of the
Quadratic Gaussian Two-Terminal Source-Coding Problem." Submitted to
IEEE Trans. on Inf. Theory. arXiv:cs.IT/0510095
A. A. El Gamal and T. M. Cover, "Achievable rates for multiple
descriptions," IEEE Trans. Inform. Theory, vol. IT-28, pp. 851-857, Nov.
1982.
W. H. R. Equitz and T. M. Cover, "Successive refinement of
information," IEEE Trans. Inform. Theory, vol. 37, pp. 269-275, Mar.
1991.
T. Berger, Z. Zhang, and H. Viswanathan, "The CEO problem," IEEE
Trans. Inf. Theory, vol. 42, no. 3, pp. 887-902, May 1996.
Cognitive Radios and Universal IT
Leader: Krish Eswaran (Berkeley)
The focus of this discussion
will be universality in information theory. This area considers
questions of source coding, prediction, and channel coding when the
distribution of the source or channel is unknown in advance.
While much of the work in this area has been on source coding and
prediction, there have also been recent advances in universal
channel coding. The hope for the discussion is to review work in the
area and brainstorm interesting research directions and applications for
future exploration. Here are some accessible references that should be a
good starting point for discussion.
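As a warm-up for the universality theme, here is a minimal sketch of LZ78-style incremental parsing, the idea underlying the Lempel-Ziv reference below. The function name and output representation are illustrative choices, not taken from any of the papers:

```python
# A minimal sketch of LZ78-style universal compression: parse the input
# into phrases, where each new phrase is a previously seen phrase
# extended by one fresh symbol. No source statistics are assumed; the
# phrase dictionary is learned as the data streams in.
def lz78_parse(s):
    """Return the LZ78 phrase list for s as (prefix_index, new_symbol) pairs."""
    dictionary = {"": 0}          # phrase -> index; the empty phrase is index 0
    phrases = []
    current = ""
    for ch in s:
        if current + ch in dictionary:
            current += ch         # keep extending the current phrase
        else:
            phrases.append((dictionary[current], ch))
            dictionary[current + ch] = len(dictionary)
            current = ""
    if current:                   # flush a trailing (already-seen) phrase
        phrases.append((dictionary[current[:-1]], current[-1]))
    return phrases
```

For example, `lz78_parse("ababab")` parses into the phrases a, b, ab, ab. The number of phrases per symbol, times the bits needed to index them, approaches the entropy rate for stationary ergodic sources, which is the sense in which the scheme is universal.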
Willems, Shtarkov, Tjalkens. "The
context-tree weighting method: basic properties," IEEE Transactions
on Information Theory, May 1995.
Willems, Shtarkov, Tjalkens. "Reflections
on 'The context-tree weighting method: basic properties'," IEEE
Information Theory Society Newsletter, Mar. 1997.
Merhav, Feder. "Universal
prediction," IEEE Transactions on Information Theory, Oct. 1998.
Shayevitz and Feder. "Communicating
using feedback over a binary channel with arbitrary noise sequence,"
Proceedings of the International Symposium on Information Theory,
Melbourne, Australia, Sept. 2005.
Shayevitz and Feder. "Achieving
the Empirical Capacity Using Feedback - Part I: Memoryless Additive
Models," submitted to IEEE Transactions on Information Theory.
Ziv and Lempel. "A
universal algorithm for sequential data compression," IEEE
Transactions on Information Theory, May 1977.
Effros, Visweswariah, Kulkarni, Verdu. "Universal
lossless source coding with the Burrows Wheeler transform," IEEE
Trans. on Info. Theory, May 2002.
Tchamkerten, Telatar. "Variable
length coding over an unknown channel," IEEE Transactions on
Information Theory, May 2006.
Neuroscience and IT
Leader: Bobak Nazer (Berkeley)
The brain is an information processing system
whose components, neurons, are subject to noise processes and energy
constraints. At first glance, this bears a striking similarity to the
types of problems considered in communications and information theory.
As a result, many researchers have been tempted to directly apply the
results of information theory to neurons. For instance, many works have
focused on computing the mutual information across a neuron and arguing
that this implies something about the representational power of the
neural code. However, neurons are not standard communications blocks,
and they most certainly do not implement a block code, so these naive
arguments fall far short. Yet, an information-theoretic approach may
still be fruitful. Intuitively, one can potentially assume the brain is
optimal in terms of some yet unknown energy-delay-computation tradeoff.
Given this assumption, one can use ideas such as capacity-per-unit-cost
and measure matching to connect data collected from neural experiments
to certain aspects of the ``neural code.'' At this roundtable, we will
consider these issues and discuss what place information theory can
and/or should have in theoretical neuroscience.
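One of the formalizations mentioned above, capacity-per-unit-cost (in Verdu's sense), can be stated compactly. With a cost function b(x) on channel inputs, it is:

```latex
\hat{C} \;=\; \sup_{\beta > 0} \frac{C(\beta)}{\beta}
        \;=\; \sup_{X} \frac{I(X;Y)}{\mathbb{E}[b(X)]}
```

where C(β) is the capacity under average cost constraint β. Taking b(x) to be metabolic cost is the bridge between such formulas and the energy-constrained neural data in the papers below.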
A couple of papers:
L. R. Varshney, P. J. Sjostrom, D. B. Chklovskii,
Optimal Information Storage in Noisy Synapses Under Resource Constraints,
Neuron, 52, 409-423, Nov 9 2006
V. Balasubramanian, D. Kimber, M. J. Berry,
Metabolically Efficient Information Processing, Neural Computation,
13, 799-815, 2001
S. B. Laughlin, R. R. de Ruyter van Steveninck, J. C. Anderson,
The Metabolic Cost of Neural Information, Nature Neuroscience, vol. 1,
no. 1, May 1998, pp. 36-41.
Multiaccess Channels with Feedback
Leader: Michele Wigger (ETHZ)
Following are two papers that will be discussed:
L. H. Ozarow, "The capacity of the white Gaussian multiple-access
channel with feedback," IEEE Trans. Inform. Theory, 30(4):623-629,
July 1984.
T. M. Cover and C. S. K. Leung, "An achievable rate region for the
multiple-access channel with feedback," IEEE Trans. Inform. Theory,
27(3):292-298, May 1981.
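For reference at the table, the feedback capacity region of the two-user white Gaussian multiple-access channel established in the first paper can be written (with transmit powers P_1, P_2 and unit noise variance) as:

```latex
\bigcup_{0 \le \rho \le 1}
\left\{ (R_1, R_2) :
\begin{aligned}
R_1 &\le \tfrac{1}{2}\log\!\left(1 + P_1(1-\rho^2)\right),\\
R_2 &\le \tfrac{1}{2}\log\!\left(1 + P_2(1-\rho^2)\right),\\
R_1 + R_2 &\le \tfrac{1}{2}\log\!\left(1 + P_1 + P_2 + 2\rho\sqrt{P_1 P_2}\right)
\end{aligned}
\right\}
```

The parameter ρ is the correlation that feedback lets the two encoders build between their transmissions; the second paper gives an achievable region for general (non-Gaussian) multiple-access channels with feedback.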
Wireless Information-Theoretic Security
Leader: Prasanth Ananthapadmanabhan (UMD/UMCP)
The theme will be wireless information-theoretic security. Following are a
couple of recent papers (preprints) on this topic.
M. Bloch, J. Barros, M. R. D. Rodrigues and S. W. McLaughlin, "Wireless
Information-Theoretic Security - Part I: Theoretical Aspects,"
Submitted to the IEEE Transactions on Information Theory, Special Issue
on Information-Theoretic Security, November 2006.
Y. Liang, H. V. Poor and S. Shamai (Shitz), "Secure
communication over fading channels," Submitted to IEEE Transactions
on Information Theory, Special Issue on Information-Theoretic Security.
Transmission of Correlated Data in Wireless Networks
Leader: Stephan Tinguely (ETHZ)
In view of these two papers, would it be possible to change the title of
the topic to "Transmission of Correlated Data in Networks", rather than
"Transmission of Correlated Data in Wireless Networks"?
T. M. Cover, A. El Gamal, M. Salehi, "Multiple Access Channels with
Arbitrarily Correlated Sources", IEEE Transactions on Information
Theory, Vol. IT-26(6), November 1980.
T. S. Han, M. H. M. Costa, "Broadcast Channels with Arbitrarily
Correlated Sources", IEEE Transactions on Information Theory, Vol.
IT-33(5), September 1987.
Joint Source-Channel Coding
Leader: Deniz Gunduz (Brooklyn Poly)
Details will be posted soon.
Information Theoretical Aspects of Cellular Systems
Leader: Amichai Sanderovich (The Technion)
The talk can include one of
the following issues:
* Wyner model and joint cell-site processing with fading channels.
* Sub-optimal approaches to improve decentralized detection, such as
binary signaling, etc.
* Efficiently combining local and centralized decoding with separated
Sample papers are:
A. D. Wyner, "Shannon Theoretic Approach to a Gaussian Cellular Multiple
Access Channel," IEEE Transactions on Information Theory, November 1994.
R. Dabora and S. Servetto, "Broadcast Channels With Cooperating
Decoders," IEEE Transactions on Information Theory, December 2006.