
  Bokun Wang

I am a Ph.D. student under the supervision of Tianbao Yang. I received my bachelor's degree in Computer Science from the University of Electronic Science and Technology of China in 2018. My current research interests are stochastic optimization and distributed optimization for machine learning.

Recent Papers [Full List]

Bokun Wang and Tianbao Yang. Finite-Sum Coupled Compositional Stochastic Optimization: Theory and Applications. International Conference on Machine Learning (ICML), 2022. [Paper, Code]

Bokun Wang, Shiqian Ma, and Lingzhou Xue. Riemannian Stochastic Proximal Gradient Methods for Nonsmooth Optimization over the Stiefel Manifold. Journal of Machine Learning Research (JMLR), 2022. [Paper, Code]

Konstantin Mishchenko, Bokun Wang, Dmitry Kovalev, and Peter Richtárik. IntSGD: Adaptive Floatless Compression of Stochastic Gradients. International Conference on Learning Representations (ICLR), 2022 (Spotlight). [Paper, Code, Poster, Slides]


Experience
King Abdullah University of Science and Technology (KAUST), Research Intern, advised by Peter Richtárik, September 2020 - August 2021.

Shenzhen Research Institute of Big Data (SRIBD), Research Intern, advised by Tong Zhang, May 2020 - August 2020.

University of California, Davis (UC Davis), Research Intern, advised by Shiqian Ma, July 2019 - September 2019.

University of California, Davis (UC Davis), Teaching Assistant for ECS 32B: Introduction to Data Structures, ECS 154A: Computer Architecture, ECS 170: Introduction to Artificial Intelligence, and ECS 271: Machine Learning & Discovery, January 2018 - March 2020.


Reviewer for NeurIPS 2021 and ICML 2022.