
Multi-Armed Bandits: Theory and Applications to Online Learning in Networks

Qing Zhao


Anthology ID:
DBLP:series/synthesis/2019Zhao
Volume:
2019
Year:
2019
Venue:
synthesis_series
Publisher:
Morgan & Claypool Publishers
URL:
https://doi.org/10.2200/S00941ED2V01Y201907CNT022
DOI:
10.2200/S00941ED2V01Y201907CNT022
DBLP:
series/synthesis/2019Zhao