Preprint / Working Paper, Year: 2022

Incorporating Multi-armed Bandit with Local Search for MaxSAT

Abstract

Partial MaxSAT (PMS) and Weighted PMS (WPMS) are two practical generalizations of the MaxSAT problem. In this paper, we propose a local search algorithm for these problems, called BandHS, which applies two multi-armed bandits to guide the search directions when escaping local optima. One bandit is associated with all the soft clauses to help the algorithm select appropriate soft clauses to satisfy, and the other is associated with all the literals in the hard clauses to help the algorithm select appropriate literals for satisfying the hard clauses. These two bandits improve the algorithm's search ability in both feasible and infeasible solution spaces. We further propose an initialization method for (W)PMS that prioritizes both unit and binary clauses when producing the initial solutions. Extensive experiments demonstrate the excellent performance and generalization capability of our proposed methods, which greatly boost the state-of-the-art local search algorithm SATLike3.0 and the state-of-the-art SAT-based incomplete solver NuWLS-c.
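As a rough illustration of the bandit-guided clause selection the abstract describes, the C++ sketch below picks one falsified soft clause using a UCB1-style rule. This is a minimal sketch, not the paper's implementation: the struct name SoftClauseBandit, the exploration constant c, and the reward definition (e.g., change in satisfied soft weight after a move) are all assumptions made here for illustration; BandHS's actual value estimates and reward are defined in the paper.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical UCB1-style bandit over soft clauses (arms).
// Illustrates only the select/update mechanism, not BandHS itself.
struct SoftClauseBandit {
    std::vector<double> value;  // estimated reward per soft clause (arm)
    std::vector<int>    pulls;  // times each arm was selected
    int total_pulls = 0;
    double c = 1.0;             // exploration weight (assumed value)

    explicit SoftClauseBandit(std::size_t num_soft)
        : value(num_soft, 0.0), pulls(num_soft, 0) {}

    // Among the currently falsified soft clauses (must be non-empty),
    // return the one with the highest UCB score; unpulled arms first.
    int select(const std::vector<int>& falsified) {
        int best = falsified.front();
        double best_score = -1e100;
        for (int arm : falsified) {
            if (pulls[arm] == 0) return arm;  // force initial exploration
            double bonus = c * std::sqrt(std::log(static_cast<double>(total_pulls))
                                         / pulls[arm]);
            double score = value[arm] + bonus;
            if (score > best_score) { best_score = score; best = arm; }
        }
        return best;
    }

    // Incrementally update the chosen arm's mean with the observed reward.
    void update(int arm, double reward) {
        ++pulls[arm];
        ++total_pulls;
        value[arm] += (reward - value[arm]) / pulls[arm];
    }
};
```

The second bandit, over literals of falsified hard clauses, would follow the same select/update pattern with arms indexed by literals rather than clauses.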

Dates and versions

hal-04154091, version 1 (06-07-2023)

Identifiers

Cite

Jiongzhi Zheng, Kun He, Jianrong Zhou, Yan Jin, Chu-Min Li, et al. Incorporating Multi-armed Bandit with Local Search for MaxSAT. 2023. ⟨hal-04154091⟩