
Publications of L. Wei
Thesis
  1. L. Wei. Optimal Learning of Deployment and Search Strategies for Robotic Teams. PhD thesis, Electrical and Computer Engineering, Michigan State University, August 2021. Keyword(s): Multiarmed Bandits, Search and Surveillance, Network Systems, Decision Making.


Articles in journal, book chapters
  1. L. Wei and V. Srivastava. Nonstationary Stochastic Bandits: UCB Policies and Minimax Regret. IEEE Open Journal of Control Systems, 3:128-142, 2024. Keyword(s): Decision Making, Multiarmed Bandits.


  2. L. Wei and V. Srivastava. Minimax Policy for Heavy-tailed Bandits. IEEE Control Systems Letters, 5(4):1423-1428, 2021. Note: Available as arXiv preprint arXiv:2007.10493. Keyword(s): Decision Making, Multiarmed Bandits.


Conference articles
  1. A. McDonald, L. Wei, and V. Srivastava. Online Estimation and Coverage Control with Heterogeneous Sensing Information. In IEEE Conference on Control Technology and Applications, San Diego, CA, pages 144-149, August 2021. Keyword(s): Network Systems, Decision Making, Search and Surveillance.


  2. L. Wei, A. McDonald, and V. Srivastava. Multi-Robot Gaussian Process Estimation and Coverage: A Deterministic Sequencing Algorithm and Regret Analysis. In International Conference on Robotics and Automation, Xi'an, China, pages 9080-9085, May 2021. Keyword(s): Network Systems, Decision Making, Search and Surveillance.


  3. L. Wei, X. Tan, and V. Srivastava. Expedited Multi-Target Search with Guaranteed Performance via Multi-fidelity Gaussian Processes. In IEEE/RSJ International Conference on Intelligent Robots and Systems, Las Vegas, NV (Virtual), pages 7095-7100, October 2020. Keyword(s): Decision Making, Search and Surveillance, Multiarmed Bandits.


  4. L. Wei and V. Srivastava. On Abruptly-Changing and Slowly-Varying Multiarmed Bandit Problems. In American Control Conference, Milwaukee, WI, pages 6291-6296, June 2018. Keyword(s): Decision Making, Multiarmed Bandits.


  5. L. Wei and V. Srivastava. On Distributed Multi-player Multiarmed Bandit Problems in Abruptly Changing Environment. In IEEE Conference on Decision and Control, Miami Beach, FL, pages 5783-5788, December 2018. Keyword(s): Decision Making, Multiarmed Bandits.

Last modified: Fri Apr 5 13:52:01 2024
Author: vaibhav.