DOI:10.1145/3649329.3656250
research-article

NeuroSelect: Learning to Select Clauses in SAT Solvers

Published: 07 November 2024

Abstract

Modern SAT solvers rely on conflict-driven clause learning (CDCL) to avoid recurring conflicts. Deleting less valuable learned clauses is therefore a crucial component of modern solvers, keeping the clause database compact and the search efficient. However, no single clause deletion policy can guarantee optimal performance across all SAT instances. This paper introduces a new clause deletion metric that diversifies existing deletion policies. We then use machine learning to evaluate these policies and select one adaptively for each input instance. We show that our method reduces the runtime of the state-of-the-art SAT solver Kissat by 5.8% on large industrial benchmarks.
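
To make the adaptive selection concrete, the sketch below is a hypothetical illustration (not the paper's implementation) of the general shape of per-instance policy selection: a few candidate clause-scoring policies, cheap structural features of the input CNF, and a learned scorer that decides which policy to apply at the next clause-database reduction. All names (Clause, POLICIES, select_policy, reduce_db) and the linear stand-in for the learned model are assumptions made here for illustration; the paper's learned model, feature set, and deletion metric may differ.

# Hypothetical sketch: per-instance selection of a clause deletion policy.
# Not the paper's code; a simple linear scorer stands in for the learned model.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Clause:
    literals: List[int]   # DIMACS-style literals
    lbd: int              # literal block distance of the learned clause
    activity: float       # activity bump accumulated so far

# Candidate deletion policies: each maps a clause to a "keep" score (higher = keep).
POLICIES: Dict[str, Callable[[Clause], float]] = {
    "lbd":      lambda c: -float(c.lbd),            # prefer low-LBD ("glue") clauses
    "activity": lambda c: c.activity,               # prefer recently useful clauses
    "size":     lambda c: -float(len(c.literals)),  # prefer short clauses
}

def instance_features(num_vars: int, clauses: List[List[int]]) -> List[float]:
    """Cheap structural features of the input CNF (illustrative choice)."""
    num_clauses = len(clauses)
    avg_len = sum(len(c) for c in clauses) / max(num_clauses, 1)
    return [float(num_vars), float(num_clauses),
            num_clauses / max(num_vars, 1), avg_len]

def select_policy(features: List[float], weights: Dict[str, List[float]]) -> str:
    """Pick the policy with the highest learned score for this instance.
    `weights` stands in for a trained model evaluated on the features."""
    def score(name: str) -> float:
        return sum(w * f for w, f in zip(weights[name], features))
    return max(weights, key=score)

def reduce_db(learned: List[Clause], policy: str, keep_ratio: float = 0.5) -> List[Clause]:
    """Keep the best-scoring fraction of learned clauses under the chosen policy."""
    ranked = sorted(learned, key=POLICIES[policy], reverse=True)
    return ranked[: max(1, int(len(ranked) * keep_ratio))]

For example, select_policy(instance_features(n, cnf), trained_weights) returns the policy name to pass to reduce_db at the next reduction. The design point the abstract argues for is this separation between scoring individual clauses (the policies) and choosing, per instance, which scorer to trust (the learned selector).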

Published In

DAC '24: Proceedings of the 61st ACM/IEEE Design Automation Conference
June 2024
2159 pages
ISBN:9798400706011
DOI:10.1145/3649329
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Qualifiers

  • Research-article

Conference

DAC '24
DAC '24: 61st ACM/IEEE Design Automation Conference
June 23 - 27, 2024
San Francisco, CA, USA

Acceptance Rates

Overall Acceptance Rate: 1,770 of 5,499 submissions, 32%
