
Capacity of random channels with large alphabets

TS and JL were supported by the ETH grant (ETH-15 12-2). DS acknowledges support by the Swiss National Science Foundation (SNSF) via the National Centre of Competence in Research QSIT and by the Air Force Office of Scientific Research (AFOSR) via grant FA9550-16-1-0245.
Abstract
We consider discrete memoryless channels with input alphabet size $n$ and output alphabet size $m$, where $m=\left\lceil \gamma n \right\rceil$ for some constant $\gamma>0$. The entries of the channel transition matrix are, before normalization, independent and identically distributed copies of a nonnegative random variable $V$ such that $\mathbb{E}[(V \log V)^2]<\infty$. We prove that in the limit $n \to \infty$ the capacity of such a channel converges to $\text{Ent}(V) / \mathbb{E}[V]$ almost surely and in $\text{L}^{2}$, where $\text{Ent}(V):= \mathbb{E}[V\log V]-\mathbb{E}[V]\log \mathbb{E}[V]$ denotes the entropy of $V$. We further show that, under slightly different model assumptions, the capacity of these random channels converges to this asymptotic value exponentially in $n$. Finally, we present an application in the context of Bayesian optimal experiment design.

    Mathematics Subject Classification: Primary: 94A15, 94A17; Secondary: 62B10.
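
To make the convergence result concrete, here is a minimal numerical sketch (not from the paper): it samples row-normalized channels with i.i.d. entries, computes their capacity with the classical Blahut-Arimoto algorithm [1,4] rather than the bounding method of [20] used for Figure 1, and compares against a Monte Carlo estimate of $\text{Ent}(V)/\mathbb{E}[V]$. The distribution $V\sim\mathrm{Exp}(1)$, the ratio $\gamma=1$, the iteration count, and all function names are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def random_channel(n, gamma=1.0):
        # n x m transition matrix with m = ceil(gamma * n);
        # entries are i.i.d. copies of V, then each row is normalized
        m = int(np.ceil(gamma * n))
        V = rng.exponential(scale=1.0, size=(n, m))   # V ~ Exp(1): illustrative choice
        return V / V.sum(axis=1, keepdims=True)

    def blahut_arimoto(W, iters=300):
        # capacity (in nats) of the DMC with row-stochastic transition matrix W;
        # a few hundred iterations suffice for illustration
        p = np.full(W.shape[0], 1.0 / W.shape[0])     # start from the uniform input law
        for _ in range(iters):
            q = p @ W                                 # output distribution under p
            d = np.sum(W * np.log(W / q), axis=1)     # KL divergence D(W_x || q) per input x
            p = p * np.exp(d)                         # Blahut-Arimoto multiplicative update
            p /= p.sum()
        q = p @ W
        return float(np.sum(p * np.sum(W * np.log(W / q), axis=1)))

    # Monte Carlo estimate of Ent(V)/E[V] straight from the definition in the abstract
    v = rng.exponential(scale=1.0, size=10**6)
    ent = np.mean(v * np.log(v)) - np.mean(v) * np.log(np.mean(v))
    print("asymptotic capacity:", ent / np.mean(v))

    for n in (10, 100, 1000):
        print(n, blahut_arimoto(random_channel(n)))

For $V\sim\mathrm{Exp}(1)$ we have $\mathbb{E}[V]=1$ and $\text{Ent}(V)=\mathbb{E}[V\log V]=1-\gamma_{\mathrm{EM}}\approx 0.4228$ nats, where $\gamma_{\mathrm{EM}}$ is the Euler-Mascheroni constant, so the printed capacities should approach this value as $n$ grows.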

Figure 1.  For different alphabet sizes $n$, we plot the capacity of five random channels, constructed as explained in Example 3.1. The method introduced in [20] is used to determine upper and lower bounds on the capacity for finite alphabet sizes $n$. The asymptotic capacity (for $n\to \infty$) is depicted by the dashed line.

Figure 2.  For different alphabet sizes $n$, we plot in (a) the empirical mean of the maximum expected information gain (blue line), $\tfrac{1}{N} \sum_{i=1}^N \sup_{\lambda\in\Lambda} I(p, \mathsf{W}_i^{(\lambda, V, n)})$, where $(\mathsf{W}_i^{(\lambda, V, n)})_{i=1}^N$ are independent channels and $N=1000$. The red line represents the empirical mean of the suboptimal expected information gain, which is given by $\tfrac{1}{N} \sum_{i=1}^N I(p, \mathsf{W}_i^{(\hat{\lambda}, V, n)})$, where $\hat{\lambda}$ are the optimal parameters for the asymptotic capacity derived in Proposition 4.3. Panel (b) depicts the empirical variance of the maximum expected information gain (blue line) as well as that of the suboptimal expected information gain (red line).

References

[1] S. Arimoto, An algorithm for computing the capacity of arbitrary discrete memoryless channels, IEEE Transactions on Information Theory, 18 (1972), 14-20.
[2] D. P. Bertsekas, Convex Optimization Theory, Athena Scientific Optimization and Computation Series, Athena Scientific, 2009.
[3] E. Biglieri, J. Proakis and S. Shamai, Fading channels: Information-theoretic and communications aspects, IEEE Transactions on Information Theory, 44 (1998), 2619-2692. doi: 10.1109/18.720551.
[4] R. E. Blahut, Computation of channel capacity and rate-distortion functions, IEEE Transactions on Information Theory, 18 (1972), 460-473.
[5] S. Boucheron, G. Lugosi and P. Massart, Concentration Inequalities: A Nonasymptotic Theory of Independence, Oxford University Press, Oxford, 2013. URL http://dx.doi.org/10.1093/acprof:oso/9780199535255.001.0001.
[6] A. G. Busetto, A. Hauser, G. Krummenacher, M. Sunnåker, S. Dimopoulos, C. S. Ong, J. Stelling and J. M. Buhmann, Near-optimal experimental design for model selection in systems biology, Bioinformatics, 29 (2013), 2625-2632. URL http://dblp.uni-trier.de/db/journals/bioinformatics/bioinformatics29.html#BusettoHKSDOSB13. doi: 10.1093/bioinformatics/btt436.
[7] M. Chiang, Geometric programming for communication systems, Foundations and Trends in Communications and Information Theory, 2 (2005), 1-154. doi: 10.1561/0100000005.
[8] M. Chiang and S. Boyd, Geometric programming duals of channel capacity and rate distortion, IEEE Transactions on Information Theory, 50 (2004), 245-258. doi: 10.1109/TIT.2003.822581.
[9] T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley-Interscience, 2006.
[10] L. Devroye, Nonuniform Random Variate Generation, Springer-Verlag, New York, 1986. URL http://dx.doi.org/10.1007/978-1-4613-8643-8.
[11] R. Durrett, Probability: Theory and Examples, Cambridge University Press, 2010.
[12] M. B. Hastings, Superadditivity of communication capacity using entangled inputs, Nature Physics, 5 (2009), 255-257. doi: 10.1038/nphys1224.
[13] A. S. Holevo, Quantum Systems, Channels, Information, De Gruyter Studies in Mathematical Physics 16, De Gruyter, 2012.
[14] J. Huang and S. P. Meyn, Characterization and computation of optimal distributions for channel coding, IEEE Transactions on Information Theory, 51 (2005), 2336-2351. doi: 10.1109/TIT.2005.850108.
[15] D. V. Lindley, On a measure of the information provided by an experiment, Ann. Math. Statist., 27 (1956), 986-1005. doi: 10.1214/aoms/1177728069.
[16] M. Raginsky and I. Sason, Concentration of measure inequalities in information theory, communications, and coding, Foundations and Trends in Communications and Information Theory, 10 (2013), 1-246. doi: 10.1561/0100000064.
[17] R. T. Rockafellar, Convex Analysis, Princeton Landmarks in Mathematics and Physics Series, Princeton University Press, 1997.
[18] W. Rudin, Principles of Mathematical Analysis, 3rd edition, International Series in Pure and Applied Mathematics, McGraw-Hill Book Co., New York-Auckland-Düsseldorf, 1976.
[19] C. E. Shannon, A mathematical theory of communication, Bell System Technical Journal, 27 (1948), 379-423, 623-656. URL http://math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf. doi: 10.1002/j.1538-7305.1948.tb01338.x.
[20] T. Sutter, D. Sutter, P. Mohajerin Esfahani and J. Lygeros, Efficient approximation of channel capacities, IEEE Transactions on Information Theory, 61 (2015), 1649-1666. doi: 10.1109/TIT.2015.2401002.
[21] A. M. Tulino and S. Verdú, Random matrix theory and wireless communications, Foundations and Trends in Communications and Information Theory, 1 (2004), 1-182. doi: 10.1561/0100000001.
