Cooperation control in Parallel SAT Solving: a Multi-armed Bandit Approach

Abstract: In recent years, parallel SAT solvers have leveraged the so-called Parallel Portfolio architecture. In this setting, a collection of independent Conflict-Driven Clause Learning (CDCL) algorithms compete and cooperate through clause sharing. However, as the number of cores increases, systematic clause sharing between CDCLs can degrade search performance. Previous work has shown that the efficiency of the approach can be improved by dynamically controlling the amount of information shared by the cores [Hamadi et al., IJCAI09], specifically the allowed length of the shared clauses. In the present paper, a new approach is used to control the cooperation topology (the pairs of units allowed to exchange clauses). This approach, referred to as Bandit Ensemble for parallel SAT Solving (BESS), relies on a multi-armed bandit formalization of the cooperation choices. BESS is empirically validated on the recent 2012 SAT Challenge benchmarks.
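The abstract frames the choice of cooperation topology (which pairs of cores may exchange clauses) as a multi-armed bandit problem. The sketch below illustrates the general idea with the classic UCB1 selection rule; the class name, the directed-link arm encoding, and the reward signal are illustrative assumptions, not the report's actual BESS formulation.

```python
import math
import random

class UCB1:
    """Minimal UCB1 bandit. Each arm is a candidate cooperation link,
    i.e. an ordered pair of solver cores allowed to exchange clauses.
    Hypothetical sketch: BESS's exact reward and arm design differ."""

    def __init__(self, arms):
        self.arms = list(arms)
        self.counts = {a: 0 for a in self.arms}
        self.values = {a: 0.0 for a in self.arms}  # running mean reward
        self.t = 0  # total number of selections so far

    def select(self):
        self.t += 1
        # Play each arm once before applying the UCB rule.
        for a in self.arms:
            if self.counts[a] == 0:
                return a
        # UCB1: empirical mean plus an exploration bonus.
        return max(self.arms, key=lambda a: self.values[a]
                   + math.sqrt(2 * math.log(self.t) / self.counts[a]))

    def update(self, arm, reward):
        # Incremental update of the arm's mean reward.
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

# Arms: directed links among 4 hypothetical solver cores.
links = [(i, j) for i in range(4) for j in range(4) if i != j]
bandit = UCB1(links)
random.seed(0)
for _ in range(200):
    link = bandit.select()
    # Placeholder reward; in practice it would measure how useful the
    # clauses received over this link were (e.g. in conflict analysis).
    reward = random.random()
    bandit.update(link, reward)
```

Over time such a controller concentrates sharing on the links whose imported clauses pay off, while still occasionally probing the others.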
Document type: Reports

https://hal.inria.fr/hal-00733282
Contributor: Nadjib Lazaar
Submitted on: Monday, December 17, 2012 - 5:54:04 PM
Last modification on: Wednesday, March 27, 2019 - 4:41:27 PM
Long-term archiving on: Sunday, December 18, 2016 - 4:14:23 AM

Files

RR-8070.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-00733282, version 2

Citation

Nadjib Lazaar, Youssef Hamadi, Said Jabbour, Michèle Sebag. Cooperation control in Parallel SAT Solving: a Multi-armed Bandit Approach. [Research Report] RR-8070, INRIA. 2012, pp.18. ⟨hal-00733282v2⟩

Metrics

Record views: 884
Files downloads: 494