We consider discrete memoryless channels with input alphabet size $n$ and output alphabet size $m$, where $m=\left\lceil{\gamma n}\right\rceil$ for some constant $\gamma>0$. The entries of the channel transition matrix are, before normalization, independent and identically distributed copies of a nonnegative random variable $V$ satisfying $\mathbb{E}[(V \log V)^2]<\infty$. We prove that in the limit as $n\to\infty$ the capacity of such a channel converges almost surely and in $\text{L}^{2}$ to $\text{Ent}(V) / \mathbb{E}[V]$, where $\text{Ent}(V):= \mathbb{E}[V\log V]-\mathbb{E}[V]\log \mathbb{E}[V]$ denotes the entropy of $V$. We further show that, under slightly different model assumptions, the capacity of these random channels converges to this asymptotic value exponentially fast in $n$. Finally, we present an application in the context of Bayesian optimal experiment design.
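The limiting value $\text{Ent}(V)/\mathbb{E}[V]$ can be checked numerically. The sketch below is not taken from the paper: it draws a random transition matrix with i.i.d. $\mathrm{Exp}(1)$ entries (an example choice, for which $\mathbb{E}[V]=1$ and $\text{Ent}(V)=1-\gamma_{\mathrm{EM}}\approx 0.4228$ nats, with $\gamma_{\mathrm{EM}}$ the Euler–Mascheroni constant) and estimates the channel capacity with the standard Blahut–Arimoto iteration; the iteration count and alphabet size are arbitrary illustration parameters.

```python
import numpy as np

def blahut_arimoto(W, iters=500):
    """Capacity estimate (in nats) of a DMC with row-stochastic matrix W."""
    n, m = W.shape
    p = np.full(n, 1.0 / n)                      # input distribution, start uniform
    for _ in range(iters):
        q = p @ W                                # induced output distribution
        # d[x] = D(W(.|x) || q), the per-input relative entropy
        d = np.where(W > 0, W * np.log(W / q), 0.0).sum(axis=1)
        p = p * np.exp(d)                        # multiplicative BA update
        p /= p.sum()
    q = p @ W
    d = np.where(W > 0, W * np.log(W / q), 0.0).sum(axis=1)
    return float(p @ d)                          # mutual information at final p

rng = np.random.default_rng(0)
n, gamma = 300, 1.0                              # illustration sizes, m = ceil(gamma*n)
m = int(np.ceil(gamma * n))
V = rng.exponential(size=(n, m))                 # i.i.d. nonnegative entries, Exp(1)
W = V / V.sum(axis=1, keepdims=True)             # normalize rows to a stochastic matrix

C = blahut_arimoto(W)
# For V ~ Exp(1): E[V] = 1 and E[V log V] = 1 - Euler-Mascheroni constant,
# so Ent(V)/E[V] = 1 - gamma_EM.
ent = 1.0 - np.euler_gamma
print(C, ent)
```

For this alphabet size the Blahut–Arimoto estimate should already lie close to the asymptotic value $1-\gamma_{\mathrm{EM}}$, consistent with the almost-sure convergence stated above.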
Figure 1. For different alphabet sizes.
Figure 2. For different alphabet sizes.