The sum-of-squares semidefinite programming hierarchy is a sequence of increasingly complex semidefinite programs for reasoning about systems of polynomial inequalities. The k-th level of the sum-of-squares SDP hierarchy is a semidefinite program that can be solved in time n^O(k). Sum-of-squares SDP hierarchies subsume fundamental algorithmic techniques such as linear programming and spectral methods. Many state-of-the-art algorithms for approximating NP-hard optimization problems are captured in the first few levels of the hierarchy. More recently, sum-of-squares SDPs have been applied extensively to designing algorithms for average-case problems. These include planted problems, random constraint satisfaction problems, and computational problems arising in statistics. From the standpoint of complexity theory, sum-of-squares SDPs can be used to measure the average-case hardness of a problem. Most natural optimization problems can be shown to be solvable by the degree-n sum-of-squares SDP, which corresponds to an exponential-time algorithm. The smallest degree of the sum-of-squares relaxation needed to solve a problem can therefore be used as a measure of its computational complexity. This approach seems especially useful for understanding average-case complexity under natural distributions. For example, the sum-of-squares degree has been used to nearly characterize the computational complexity of refuting random CSPs as a function of the number of constraints. Using the sum-of-squares degree as a proxy measure for average-case complexity opens the door to formalizing certain computational phase transitions that have been conjectured for average-case problems such as recovery in stochastic block models. In this talk, we discuss applications of this approach to average-case complexity and present some open problems.
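
As a concrete illustration of the lowest level of the hierarchy, the sketch below sets up the degree-2 sum-of-squares relaxation of Max-Cut (equivalently, the classical Goemans-Williamson SDP) on a small example graph. The 5-cycle instance, the variable names, and the use of cvxpy are illustrative assumptions, not part of the abstract above.

```python
# Minimal sketch: the degree-2 (first-level) sum-of-squares relaxation of
# Max-Cut, solved as a semidefinite program. Assumes numpy and cvxpy are
# available; the graph and names are illustrative only.
import cvxpy as cp

# Hypothetical example instance: a 5-cycle given as an edge list.
n = 5
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]

# X plays the role of the matrix of pseudo-moments E~[x_i x_j] for
# {+1, -1}-valued variables; positive semidefiniteness and a unit diagonal
# are exactly the degree-2 sum-of-squares constraints.
X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0, cp.diag(X) == 1]

# Relaxed cut value: sum over edges of (1 - E~[x_i x_j]) / 2.
objective = cp.Maximize(sum((1 - X[i, j]) / 2 for i, j in edges))

prob = cp.Problem(objective, constraints)
prob.solve()
print("degree-2 SoS upper bound on the max cut:", prob.value)
```

On this instance the relaxation returns an upper bound of roughly 4.52, while the true maximum cut of the 5-cycle is 4; higher levels of the hierarchy tighten such bounds at the cost of solving SDPs over n^O(k)-dimensional moment matrices.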