DOI: 10.1145/3406325.3451119

Frozen 1-RSB structure of the symmetric Ising perceptron

Published: 15 June 2021

Abstract

We prove, under an assumption on the critical points of a real-valued function, that the symmetric Ising perceptron exhibits the 'frozen 1-RSB' structure conjectured by Krauth and Mézard in the physics literature; that is, typical solutions of the model lie in clusters of vanishing entropy density. Moreover, we prove this in a very strong form conjectured by Huang, Wong, and Kabashima: a typical solution of the model is isolated with high probability, and the Hamming distance to all other solutions is linear in the dimension. The frozen 1-RSB scenario is part of a recent and intriguing explanation of the performance of learning algorithms by Baldassi, Ingrosso, Lucibello, Saglietti, and Zecchina. We prove this structural result by comparing the symmetric Ising perceptron model to a planted model and establishing a comparison result between the two. Our main technical tool for this comparison is an inductive argument for the concentration of the logarithm of the number of solutions in the model.
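
For concreteness, the display below recalls the standard definition of the symmetric binary (Ising) perceptron used in this line of work (cf. Aubin, Perkins, and Zdeborová [5]); the notation $S_n(\kappa)$, $g_a$, $\alpha$ is introduced only for this summary and may differ from the paper's.

\[
  S_n(\kappa) \;=\; \Bigl\{\, \sigma \in \{-1,+1\}^n \;:\; \bigl|\langle g_a, \sigma \rangle\bigr| \le \kappa \sqrt{n} \ \text{ for all } a = 1, \dots, m \,\Bigr\},
\]
where $g_1, \dots, g_m$ are i.i.d. standard Gaussian vectors in $\mathbb{R}^n$ and $m = \lfloor \alpha n \rfloor$ for a fixed constraint density $\alpha > 0$. In this notation, the isolation statement above says that a uniformly random $\sigma \in S_n(\kappa)$ satisfies, with high probability, $d_H(\sigma, \sigma') \ge c n$ for every other solution $\sigma' \in S_n(\kappa)$, for some constant $c = c(\alpha, \kappa) > 0$.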

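A brute-force sanity check of these quantities is easy to script at toy sizes. The sketch below is our own illustration, not code from the paper; the parameter choices n = 15, alpha = 0.5, kappa = 0.7 are arbitrary, and at such small n the asymptotic frozen-1-RSB picture need not be visible. It enumerates the cube, collects the solution set, and reports each solution's Hamming distance to its nearest other solution.

    import itertools
    import numpy as np

    def symmetric_perceptron_solutions(n, alpha, kappa, rng):
        # m = floor(alpha * n) constraints; the g_a are i.i.d. standard Gaussian rows of G.
        m = int(alpha * n)
        G = rng.standard_normal((m, n))
        sols = []
        for sigma in itertools.product((-1, 1), repeat=n):
            s = np.array(sigma)
            # sigma is a solution iff every constraint satisfies |<g_a, sigma>| <= kappa * sqrt(n).
            if np.all(np.abs(G @ s) <= kappa * np.sqrt(n)):
                sols.append(s)
        return sols

    def nearest_other_solution_distances(sols):
        # Hamming distance from each solution to the closest *other* solution.
        out = []
        for i, s in enumerate(sols):
            dists = [int(np.sum(s != t)) for j, t in enumerate(sols) if j != i]
            out.append(min(dists) if dists else None)
        return out

    rng = np.random.default_rng(0)
    sols = symmetric_perceptron_solutions(n=15, alpha=0.5, kappa=0.7, rng=rng)
    print(len(sols), "solutions")
    print("nearest-other-solution Hamming distances:", nearest_other_solution_distances(sols))
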
References

[1]
Emmanuel Abbe. 2017. Community detection and stochastic block models: recent developments. The Journal of Machine Learning Research, 18, 1, 2017. Pages 6446–6531.
[2]
Dimitris Achlioptas and Amin Coja-Oghlan. 2008. Algorithmic barriers from phase transitions. In 2008 49th Annual IEEE Symposium on Foundations of Computer Science. Pages 793–802.
[3]
Dimitris Achlioptas, Amin Coja-Oghlan, and Federico Ricci-Tersenghi. 2011. On the solution-space geometry of random constraint satisfaction problems. Random Structures & Algorithms, 38, 3, 2011. Pages 251–268.
[4]
Dimitris Achlioptas and Cristopher Moore. 2002. The asymptotic order of the random k-SAT threshold. In The 43rd Annual IEEE Symposium on Foundations of Computer Science, 2002. Proceedings. Pages 779–788.
[5]
Benjamin Aubin, Will Perkins, and Lenka Zdeborová. 2019. Storage capacity in symmetric binary perceptrons. Journal of Physics A: Mathematical and Theoretical, 2019.
[6]
Carlo Baldassi. 2009. Generalization learning in a perceptron with binary synapses. Journal of Statistical Physics, 136, 5, 2009. Pages 902–916.
[7]
Carlo Baldassi, Christian Borgs, Jennifer T Chayes, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, and Riccardo Zecchina. 2016. Unreasonable effectiveness of learning neural networks: From accessible states and robust ensembles to basic algorithmic schemes. Proceedings of the National Academy of Sciences, 113, 48, 2016. Pages E7655–E7662.
[8]
Carlo Baldassi, Alfredo Braunstein, Nicolas Brunel, and Riccardo Zecchina. 2007. Efficient supervised learning in networks with binary synapses. Proceedings of the National Academy of Sciences, 104, 26, 2007. Pages 11079–11084.
[9]
Carlo Baldassi, Riccardo Della Vecchia, Carlo Lucibello, and Riccardo Zecchina. 2020. Clustering of solutions in the symmetric binary perceptron. Journal of Statistical Mechanics: Theory and Experiment, 2020, 7, 2020. Pages 30403.
[10]
Carlo Baldassi, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, and Riccardo Zecchina. 2015. Subdominant dense clusters allow for simple learning and high computational performance in neural networks with discrete synapses. Physical Review Letters, 115, 12, 2015. Pages 128101.
[11]
Carlo Baldassi, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, and Riccardo Zecchina. 2016. Local entropy as a measure for sampling solutions in constraint satisfaction problems. Journal of Statistical Mechanics: Theory and Experiment, 2016, 2, 2016. Pages 9921.
[12]
Victor Bapst, Amin Coja-Oghlan, and Charilaos Efthymiou. 2017. Planting colourings silently. Combinatorics, probability and computing, 26, 3, 2017. Pages 338–366.
[13]
Victor Bapst, Amin Coja-Oghlan, Samuel Hetterich, Felicia Raßmann, and Dan Vilenchik. 2016. The condensation phase transition in random graph coloring. Communications in Mathematical Physics, 341, 2, 2016. Pages 543–606.
[14]
Zsolt Bartha, Nike Sun, and Yumeng Zhang. 2019. Breaking of 1RSB in random regular MAX-NAE-SAT. In 2019 IEEE 60th Annual Symposium on Foundations of Computer Science (FOCS). Pages 1405–1416.
[15]
Alfredo Braunstein and Riccardo Zecchina. 2006. Learning by message passing in networks of discrete synapses. Physical Review Letters, 96, 3, 2006. Pages 12417.
[16]
Amin Coja-Oghlan, Charilaos Efthymiou, Nor Jaafari, Mihyun Kang, and Tobias Kapetanopoulos. 2018. Charting the replica symmetric phase. Communications in Mathematical Physics, 359, 2, 2018. Pages 603–698.
[17]
Amin Coja-Oghlan, Florent Krzakala, Will Perkins, and Lenka Zdeborová. 2018. Information-theoretic thresholds from the cavity method. Advances in Mathematics, 333, 2018. Pages 694–795.
[18]
Amin Coja-Oghlan and Lenka Zdeborová. 2012. The condensation transition in random hypergraph 2-coloring. In Proceedings of the twenty-third annual ACM-SIAM symposium on Discrete Algorithms. Pages 241–250.
[19]
Thomas M Cover. 1965. Geometrical and statistical properties of systems of linear inequalities with applications in pattern recognition. IEEE Transactions on Electronic Computers, 3, 1965. Pages 326–334.
[20]
Jian Ding, Allan Sly, and Nike Sun. 2014. Satisfiability threshold for random regular NAE-SAT. In Proceedings of the forty-sixth annual ACM symposium on Theory of computing. Pages 814–822.
[21]
Jian Ding and Nike Sun. 2019. Capacity lower bound for the Ising perceptron. In Proceedings of the 51st Annual ACM SIGACT Symposium on Theory of Computing. Pages 816–827.
[22]
Ehud Friedgut. 1999. Sharp thresholds of graph properties, and the k-sat problem. Journal of the American Mathematical Society, 12, 4, 1999. Pages 1017–1054.
[23]
Elizabeth Gardner. 1987. Maximum storage capacity in neural networks. EPL (Europhysics Letters), 4, 4, 1987. Pages 481.
[24]
E Gardner and B Derrida. 1988. Optimal storage properties of neural network models. Journal of Physics A: Mathematical and General, 21, 1, 1988. Pages 271.
[25]
D. L. Hanson and F. T. Wright. 1971. A bound on tail probabilities for quadratic forms in independent random variables. Ann. Math. Statist., 42, 1971. Pages 1079–1083. ISSN 0003-4851. https://doi.org/10.1214/aoms/1177693335
[26]
Haiping Huang and Yoshiyuki Kabashima. 2014. Origin of the computational hardness for learning with binary synapses. Physical Review E, 90, 5, 2014. Pages 052813.
[27]
Haiping Huang, KY Michael Wong, and Yoshiyuki Kabashima. 2013. Entropy landscape of solutions in the binary perceptron problem. Journal of Physics A: Mathematical and Theoretical, 46, 37, 2013. Pages 375002.
[28]
Jeong Han Kim and James R Roche. 1998. Covering cubes by random half cubes, with applications to binary neural networks. J. Comput. System Sci., 56, 2, 1998. Pages 223–252.
[29]
Werner Krauth and Marc Mézard. 1989. Storage capacity of memory networks with binary couplings. Journal de Physique, 50, 20, 1989. Pages 3057–3066.
[30]
Florent Krzakała, Andrea Montanari, Federico Ricci-Tersenghi, Guilhem Semerjian, and Lenka Zdeborová. 2007. Gibbs states and the set of solutions of random constraint satisfaction problems. Proceedings of the National Academy of Sciences, 104, 25, 2007. Pages 10318–10323.
[31]
Florent Krzakala and Lenka Zdeborová. 2009. Hiding quiet solutions in random constraint satisfaction problems. Physical Review Letters, 102, 23, 2009. Pages 238701.
[32]
David Mitchell, Bart Selman, and Hector Levesque. 1992. Hard and easy distributions of SAT problems. In AAAI-92. Pages 459–465.
[33]
Michael Molloy. 2018. The freezing threshold for k-colourings of a random graph. Journal of the ACM (JACM), 65, 2, 2018. Pages 1–62.
[34]
Andrea Montanari, Ricardo Restrepo, and Prasad Tetali. 2011. Reconstruction and clustering in random constraint satisfaction problems. SIAM Journal on Discrete Mathematics, 25, 2, 2011. Pages 771–808.
[35]
Alexander Markowitsch Ostrowski. 1938. Sur l'approximation du déterminant de Fredholm par les déterminants des systèmes d'équations linéaires. Ark. Math. Stockholm, 26A, 1938. Pages 1–15.
[36]
Allan Sly, Nike Sun, and Yumeng Zhang. 2016. The number of solutions for random regular NAE-SAT. In 2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS). Pages 724–731.
[37]
Haim Sompolinsky, Naftali Tishby, and H Sebastian Seung. 1990. Learning from examples in large neural networks. Physical Review Letters, 65, 13, 1990. Pages 1683.
[38]
Michel Talagrand. 1999. Intersecting random half cubes. Random Structures & Algorithms, 15, 3-4, 1999. Pages 436–449.
[39]
Michel Talagrand. 2010. Mean field models for spin glasses: Volume I: Basic examples. Vol. 54. Springer Science & Business Media.
[40]
David J Thouless, Philip W Anderson, and Robert G Palmer. 1977. Solution of 'Solvable model of a spin glass'. Philos. Mag., 35, 3, 1977. Pages 593–601.
[41]
Changji Xu. 2019. Sharp threshold for the Ising perceptron model. arXiv preprint arXiv:1905.05978, 2019.
[42]
Lenka Zdeborová and Florent Krzakała. 2007. Phase transitions in the coloring of random graphs. Physical Review E, 76, 3, 2007. Pages 12889.
[43]
Lenka Zdeborová and Florent Krzakala. 2016. Statistical physics of inference: Thresholds and algorithms. Advances in Physics, 65, 5, 2016. Pages 453–552.
[44]
Lenka Zdeborová and Marc Mézard. 2008. Constraint satisfaction problems with isolated solutions are hard. Journal of Statistical Mechanics: Theory and Experiment, 2008, 12, 2008. Pages P12004.

Published In

STOC 2021: Proceedings of the 53rd Annual ACM SIGACT Symposium on Theory of Computing
June 2021, 1797 pages
ISBN: 9781450380539
DOI: 10.1145/3406325
Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. Ising perceptron
  2. Perceptron model
  3. frozen solutions
  4. solution space

Qualifiers

  • Research-article
