Open Access
Observe Before Play: Multi-Armed Bandit with Pre-Observations
Author(s) - Jinhang Zuo, Xiaoxi Zhang, Carlee Joe-Wong
Publication year - 2020
Publication title - Proceedings of the ... AAAI Conference on Artificial Intelligence
Language(s) - English
Resource type - Journals
eISSN - 2374-3468
pISSN - 2159-5399
DOI - 10.1609/aaai.v34i04.6187
Subject(s) - regret , upper and lower bounds , computer science , logarithm , heuristics , bernoulli's principle , multi armed bandit , bounded function , mathematical optimization , mathematics , machine learning , engineering , mathematical analysis , aerospace engineering
We consider the stochastic multi-armed bandit (MAB) problem in a setting where a player can pay to pre-observe arm rewards before playing an arm in each round. Apart from the usual trade-off between exploring new arms to find the best one and exploiting the arm believed to offer the highest reward, we encounter an additional dilemma: pre-observing more arms gives a higher chance to play the best one, but incurs a larger cost. For the single-player setting, we design an Observe-Before-Play Upper Confidence Bound (OBP-UCB) algorithm for K arms with Bernoulli rewards, and prove a T-round regret upper bound of O(K² log T). In the multi-player setting, collisions will occur when players select the same arm to play in the same round. We design a centralized algorithm, C-MP-OBP, and prove that its T-round regret relative to an offline greedy strategy is upper bounded by O((K⁴/M²) log T) for K arms and M players. We also propose distributed versions of the C-MP-OBP policy, called D-MP-OBP and D-MP-Adapt-OBP, achieving logarithmic regret with respect to collision-free target policies. Experiments on synthetic data and wireless channel traces show that C-MP-OBP and D-MP-OBP outperform random heuristics and offline optimal policies that do not allow pre-observations.
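
For intuition, the sketch below simulates the single-player setting described in the abstract: in each round the player may pay to pre-observe realized Bernoulli rewards before committing to an arm. It is a minimal illustration of the observe-before-play trade-off under our own assumptions, not the paper's OBP-UCB policy; the function name obp_ucb_sketch, the per-observation cost parameter, and the heuristic stopping rule (observe arms in UCB order until one pays off or the next arm's UCB drops below the cost) are all hypothetical choices made for this example.

```python
import numpy as np


def obp_ucb_sketch(mu, T=10_000, cost=0.05, seed=0):
    """Single-player observe-before-play simulation (illustrative only).

    Assumptions made here, not taken from the abstract: Bernoulli arms with
    unknown means `mu`; each round the player scans arms in decreasing order
    of a standard UCB index, pays `cost` per pre-observation, plays the first
    arm whose observed reward is 1, and otherwise plays the best arm it has
    not yet observed this round. The stopping rule (stop observing once an
    arm's UCB drops below `cost`) is a simple heuristic, not the paper's rule.
    """
    rng = np.random.default_rng(seed)
    K = len(mu)
    counts = np.zeros(K)   # reward samples seen per arm (observations + plays)
    sums = np.zeros(K)     # total reward seen per arm
    net_reward = 0.0

    for t in range(1, T + 1):
        means = np.divide(sums, counts, out=np.zeros(K), where=counts > 0)
        bonus = np.sqrt(2.0 * np.log(t + 1) / np.maximum(counts, 1.0))
        ucb = np.where(counts > 0, means + bonus, np.inf)  # untried arms first
        order = np.argsort(-ucb)

        played = False
        observed = []
        for k in order:
            # Heuristic stopping rule: stop paying for pre-observations once
            # the optimistic value of the next arm no longer covers the cost.
            if np.isfinite(ucb[k]) and ucb[k] < cost:
                break
            x = float(rng.random() < mu[k])  # pre-observe arm k, paying `cost`
            net_reward -= cost
            counts[k] += 1
            sums[k] += x
            observed.append(k)
            if x == 1.0:                     # saw reward 1: play this arm now
                net_reward += 1.0
                played = True
                break

        if not played:
            remaining = [k for k in order if k not in observed]
            if remaining:
                # No observed arm paid off: play the best unobserved arm blindly.
                k = int(remaining[0])
                r = float(rng.random() < mu[k])
                counts[k] += 1
                sums[k] += r
                net_reward += r
            # If every arm was observed and all returned 0, the round yields 0.

    return net_reward


if __name__ == "__main__":
    # Four hypothetical Bernoulli arms; prints the net reward after T rounds.
    print(obp_ucb_sketch(mu=[0.9, 0.6, 0.3, 0.1]))
```

Varying `cost` in this sketch exposes the dilemma the abstract describes: a low cost makes it worthwhile to pre-observe many arms each round, while a high cost pushes the player back toward plain UCB behavior.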
