Introduction to Multi-Armed Bandits

This online textbook provides a comprehensive introduction to Multi-Armed Bandits (MAB), covering theoretical foundations, algorithm design, and advanced applications in sequential decision-making under uncertainty. Written by Dr. Fangli Ying (ECUST), who served as a teaching fellow in a summer camp with Prof. Osman Yağan (CMU), it progresses from stochastic bandit fundamentals (the UCB and ETC algorithms) to Bayesian methods (Thompson Sampling), structured bandits with hidden parameters, and adversarial settings. Throughout, it features rigorous mathematical analysis, regret bounds, and real-world case studies in recommendation systems, wireless communications, and healthcare.
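To give a flavor of the algorithms covered later, here is a minimal sketch of the classic UCB1 strategy on simulated Bernoulli arms. The function names, arm means, and horizon below are illustrative assumptions, not taken from the book:

```python
import math
import random

def ucb1(pull, n_arms, horizon, seed=0):
    """Minimal UCB1 sketch: pull(arm) returns a reward in [0, 1].

    Returns how often each arm was played over the horizon.
    (Illustrative example; names and parameters are assumptions.)
    """
    random.seed(seed)
    counts = [0] * n_arms   # number of pulls per arm
    sums = [0.0] * n_arms   # cumulative reward per arm
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1  # play each arm once to initialize estimates
        else:
            # pick the arm maximizing empirical mean + exploration bonus
            arm = max(
                range(n_arms),
                key=lambda a: sums[a] / counts[a]
                + math.sqrt(2 * math.log(t) / counts[a]),
            )
        r = pull(arm)
        counts[arm] += 1
        sums[arm] += r
    return counts

# Example: three hypothetical Bernoulli arms with means 0.2, 0.5, 0.8
means = [0.2, 0.5, 0.8]
counts = ucb1(lambda a: 1.0 if random.random() < means[a] else 0.0,
              n_arms=3, horizon=2000)
```

Over a long horizon, the best arm (mean 0.8 here) accumulates the large majority of pulls while suboptimal arms are sampled only often enough to keep their confidence bounds tight, which is the intuition behind logarithmic regret bounds.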