Siqi Zhang
Title · Cited by · Year
Biased stochastic first-order methods for conditional stochastic optimization and applications in meta learning
Y Hu, S Zhang, X Chen, N He
Advances in Neural Information Processing Systems 33, 2759-2770, 2020
Cited by 68* · 2020
On the convergence rate of stochastic mirror descent for nonsmooth nonconvex optimization
S Zhang, N He
arXiv preprint arXiv:1806.04781, 2018
Cited by 59 · 2018
The complexity of nonconvex-strongly-concave minimax optimization
S Zhang, J Yang, C Guzmán, N Kiyavash, N He
Uncertainty in Artificial Intelligence, 482-492, 2021
Cited by 55 · 2021
A catalyst framework for minimax optimization
J Yang, S Zhang, N Kiyavash, N He
Advances in Neural Information Processing Systems 33, 5667-5678, 2020
Cited by 51 · 2020
First-Order Optimization Inspired from Finite-Time Convergent Flows
S Zhang, M Benosman, O Romero, A Cherian
arXiv preprint arXiv:2010.02990, 2020
Cited by 4* · 2020
Uniform convergence and generalization for nonconvex stochastic minimax problems
S Zhang, Y Hu, L Zhang, N He
OPT 2022: Optimization for Machine Learning (NeurIPS 2022 Workshop), 2022
Cited by 2 · 2022
Communication-Efficient Gradient Descent-Ascent Methods for Distributed Variational Inequalities: Unified Analysis and Local Updates
S Zhang, S Choudhury, SU Stich, N Loizou
arXiv preprint arXiv:2306.05100, 2023
Cited by 1 · 2023
ProxSkip for Stochastic Variational Inequalities: A Federated Learning Algorithm for Provable Communication Acceleration
S Zhang, N Loizou
Cited by 1 · 2022