Publications

Preprints

  • Liu, Y., & Davanloo Tajbakhsh, S. (2023). Adaptive Stochastic Optimization Algorithms for Problems with Biased Oracles. (preprint) arXiv:2306.07810. [arXiv]
  • Zhang, D., & Davanloo Tajbakhsh, S. (2022). Riemannian Stochastic Gradient Method for Nested Composition Optimization. (preprint) arXiv:2207.09350. [arXiv]

Selected Publications

  • Liu, Y., & Davanloo Tajbakhsh, S. (2023). Stochastic Composition Optimization of Functions Without Lipschitz Continuous Gradient. Journal of Optimization Theory and Applications, 1-51. [journal, arXiv, code]
  • Zhang, D., & Davanloo Tajbakhsh, S. (2023). Riemannian Stochastic Variance-Reduced Cubic Regularized Newton Method for Submanifold Optimization. Journal of Optimization Theory and Applications, 196(1), 324-361. [journal, arXiv, code]
  • Zhang, D., Liu, Y., & Davanloo Tajbakhsh, S. (2022). A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure. INFORMS Journal on Computing, 34(2), 1126-1140. [journal, arXiv, code]
  • Davanloo Tajbakhsh, S., Aybat, N. S., & Del Castillo, E. (2020). On the Theoretical Guarantees for Parameter Estimation of Gaussian Random Field Models: A Sparse Precision Matrix Approach. Journal of Machine Learning Research, 21(1), 8962-9002. [journal, arXiv, code]
  • Davanloo Tajbakhsh, S., Aybat, N. S., & Del Castillo, E. (2018). Generalized Sparse Precision Matrix Selection for Fitting Multivariate Gaussian Random Fields to Large Datasets. Statistica Sinica, 941-962. [journal, arXiv, code]