Advances in Applied Mathematics (应用数学进展), Vol. 11 No. 7 (July 2022)
A Proximal Random Newton Algorithm Based on Variance Reduction (一种基于方差缩减的临近随机牛顿算法)
DOI: 10.12677/AAM.2022.117497
Author: 杜康乐 (Du Kangle), College of Mathematics and Computer Science, Zhejiang Normal University, Jinhua, Zhejiang
Keywords: Compound Optimization Problems; Machine Learning; Variance Reduction; Proximal Stochastic Method; Newton-Type Method
Abstract: This paper studies a class of composite optimization problems. For convex, nonsmooth objective functions, a new stochastic Newton algorithm based on variance reduction (SNVR) is proposed by introducing a variance-reduction technique into the proximal Newton algorithm, and its convergence is analyzed. Compared with ProxSGD, ProxGD, and ProxSVRG, SNVR has better robustness and scalability, and it achieves a faster convergence rate.
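The SNVR update itself is not given on this page, but the combination the abstract describes (an SVRG-style variance-reduced gradient used inside a proximal Newton-type step) can be sketched as follows. This is a minimal illustration only, assuming an L1-regularized logistic-regression objective and a diagonal Hessian approximation so that the scaled proximal step stays closed-form; the function names, parameters, and hyperparameter values are hypothetical and are not taken from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1, applied coordinate-wise."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def snvr_sketch(X, y, lam=0.01, eta=0.1, epochs=10, inner=None, seed=0):
    """Variance-reduced proximal Newton-type sketch for
        min_w (1/n) * sum_i log(1 + exp(-y_i * x_i^T w)) + lam * ||w||_1,
    with labels y_i in {-1, +1}.

    Each epoch stores a snapshot w_tilde and its full gradient; every inner
    step uses the SVRG estimator  v = g_i(w) - g_i(w_tilde) + full_grad
    and takes a diagonally scaled proximal step.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    inner = inner or n
    w = np.zeros(d)

    def grad_i(w, i):
        # Gradient of the i-th logistic loss term.
        return (-y[i] / (1.0 + np.exp(y[i] * (X[i] @ w)))) * X[i]

    def full_grad(w):
        s = -y / (1.0 + np.exp(y * (X @ w)))
        return X.T @ s / n

    for _ in range(epochs):
        w_tilde = w.copy()
        g_tilde = full_grad(w_tilde)
        # Diagonal Hessian approximation of the smooth part at the snapshot
        # (a stand-in for the paper's Newton-type metric); the small damping
        # term keeps the metric well-conditioned.
        p = 1.0 / (1.0 + np.exp(-(X @ w_tilde)))
        h = (X ** 2).T @ (p * (1.0 - p)) / n + 1e-2
        step = eta / h  # per-coordinate step size
        for _ in range(inner):
            i = rng.integers(n)
            v = grad_i(w, i) - grad_i(w_tilde, i) + g_tilde  # variance-reduced gradient
            # Scaled proximal step: with a diagonal metric the prox of the
            # L1 term is still coordinate-wise soft-thresholding.
            w = soft_threshold(w - step * v, lam * step)
    return w

# Tiny usage example on synthetic data.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 20))
    w_true = np.zeros(20)
    w_true[:3] = [2.0, -1.5, 1.0]
    y = np.sign(X @ w_true + 0.1 * rng.standard_normal(200))
    print(snvr_sketch(X, y)[:5])
```

A full proximal Newton method would use the complete Hessian (or a quasi-Newton approximation) and solve the scaled proximal subproblem iteratively; the diagonal metric here is used only to keep the sketch short and self-contained.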
Article citation: Du, K. (2022) A Proximal Random Newton Algorithm Based on Variance Reduction. Advances in Applied Mathematics, 11(7), 4708-4717. https://doi.org/10.12677/AAM.2022.117497