Two-Armed Restless Bandits with Imperfect Information: Stochastic Control and Indexability / Roland G. Fryer, Jr., Philipp Harms.
Material type: Text
- Hardcopy version available to institutional subscribers
| Item type | Home library | Collection | Call number | Status |
|---|---|---|---|---|
| Working Paper | Biblioteca Digital | Colección NBER | nber w19043 | Not For Loan |
May 2013.
We present a two-armed bandit model of decision making under uncertainty where the expected return to investing in the "risky" arm increases when choosing that arm and decreases when choosing the "safe" arm. These dynamics are natural in applications such as human capital development, job search, and occupational choice. Using new insights from stochastic control, along with a monotonicity condition on the payoff dynamics, we show that optimal strategies in our model are stopping rules that can be characterized by an index which formally coincides with Gittins' index. Our result implies the indexability of a new class of "restless" bandit models.
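The abstract's key feature is that the risky arm is "restless": its expected return drifts up while it is played and decays while the safe arm is played, and the optimal policy is a threshold (index-style) stopping rule. A minimal sketch of these dynamics, with all parameter names, values, and the specific threshold rule being illustrative assumptions rather than the paper's formal model:

```python
import random

def simulate(p0=0.6, up=0.05, down=0.05, safe_reward=0.5,
             threshold=0.5, horizon=50, seed=0):
    """Stylized two-armed restless bandit.

    p is the risky arm's expected return. Playing the risky arm
    raises p (learning-by-doing); playing the safe arm lets it
    decay. The agent follows a threshold stopping rule in the
    spirit of the paper's index characterization: play risky
    while p >= threshold, otherwise take the safe arm.
    All parameters here are illustrative assumptions.
    """
    rng = random.Random(seed)
    p, total = p0, 0.0
    for _ in range(horizon):
        if p >= threshold:                      # risky arm still attractive
            total += 1.0 if rng.random() < p else 0.0
            p = min(1.0, p + up)                # return improves with use
        else:                                   # stopped: safe arm forever after
            total += safe_reward
            p = max(0.0, p - down)              # risky return atrophies
    return total
```

Note the monotone feedback: once p falls below the threshold it only decays further, so the safe arm is absorbing, which is why a stopping rule (rather than arbitrary switching) suffices in this sketch.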
System requirements: Adobe Acrobat Reader required for PDF files.
Mode of access: World Wide Web.
Print version record