Multi-Armed Bandits - Home
  • Introduction to Multi-Armed Bandits
  • 1. Introduction to Multi-Armed Bandits
  • 2. Stochastic Multi-Armed Bandits
  • 3. Explore-then-Commit (ETC) & Upper-Confidence-Bound (UCB)
  • 4. Principles and Performance Comparison of ETC and UCB
  • 5. Multi-Armed Bandits with Probing
  • 6. Thompson Sampling
  • 7. A Unified Approach to Translate Classic Bandit Algorithms to the Structured Bandit Setting
  • 8. Full Feedback and Adversarial Costs & Adversarial Bandits

Index

By Dr. Fangli Ying (ECUST) & Dr. Osman Yagan (CMU)

© Copyright 2024, Fangli Ying & Osman Yagan.