Improving the Adaptive Moment Estimation Optimization Methods for Modern Machine Learning.
Record type: Bibliographic - electronic resource : Monograph/item
Title/Author: Improving the Adaptive Moment Estimation Optimization Methods for Modern Machine Learning. / Alblwi, Abdalrahman.
Author: Alblwi, Abdalrahman.
Publisher: Ann Arbor : ProQuest Dissertations & Theses, 2020
Pagination: 43 p.
Note: Source: Masters Abstracts International, Volume: 82-04.
Contained by: Masters Abstracts International 82-04.
Subject: Electrical engineering.
Electronic resource: https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=27994120
ISBN: 9798678106827
Thesis (M.S.)--University of Delaware, 2020.
This item must not be sold to any third party vendors.
Optimization algorithms for Neural Networks (NNs) have become a crucial component of Artificial Intelligence. Within neural network architectures, adapting learning rates via an optimizer is a fundamental step in training, and the choice of optimizer can substantially affect a network's performance. The Adaptive Moment Estimation method (ADAM) is one of the most widely used and powerful algorithms for training neural networks. However, ADAM has shown substandard convergence in certain cases due to fluctuating and unstable learning rates. In this thesis, we integrate two state-of-the-art techniques: the Adaptive Moment Estimation method and a normalized momentum. We apply the normalized momentum of the previous gradients to the current ADAM update to preserve and stabilize the direction of the learning rate. We test our proposed method, ADAM Plus, on the MNIST digit recognition database, the CIFAR10 database, a Bayesian Neural Network, and U-net models. The empirical results show that the proposed method converges faster and achieves higher accuracy, and ADAM Plus outperforms various adaptive optimization algorithms across multiple convolutional neural network applications.
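The abstract describes augmenting the standard ADAM update with a normalized momentum of past gradients to stabilize the update direction. The thesis text itself is not part of this record, so the sketch below is only one plausible reading of that idea, not the author's actual update rule: a standard Adam step plus a unit-normalized first-moment term (weighted by a hypothetical coefficient `mu`).

```python
import numpy as np

def adam_plus_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
                   mu=0.9, eps=1e-8):
    """One parameter update combining standard Adam moment estimates with a
    normalized momentum term. This is a hypothetical reconstruction of
    'ADAM Plus'; the exact rule is not given in this catalog record."""
    # Standard Adam exponential moving averages of the gradient and its square.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    # Bias correction, as in standard Adam (t is the 1-based step count).
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    # Normalized momentum: a unit-length direction built from the first
    # moment, intended to damp fluctuations in the update direction.
    norm = np.linalg.norm(m_hat)
    direction = m_hat / norm if norm > 0 else m_hat
    theta = theta - lr * (m_hat / (np.sqrt(v_hat) + eps) + mu * direction)
    return theta, m, v

# Toy usage: minimize f(x) = x^2 starting from x = 5.
theta = np.array([5.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2 * theta
    theta, m, v = adam_plus_step(theta, grad, m, v, t, lr=0.05)
print(float(theta[0]))
```

Because the normalized term has constant magnitude, the iterate hovers near the minimum at a scale set by `lr * mu` rather than converging exactly; a real implementation would presumably decay the learning rate or the momentum weight.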
Subjects--Topical Terms: Electrical engineering.
Subjects--Index Terms: Machine learning
LDR    02378nmm a2200349 4500
001    2276702
005    20210510091852.5
008    220723s2020 ||||||||||||||||| ||eng d
020    $a 9798678106827
035    $a (MiAaPQ)AAI27994120
035    $a AAI27994120
040    $a MiAaPQ $c MiAaPQ
100 1  $a Alblwi, Abdalrahman. $3 3554997
245 10 $a Improving the Adaptive Moment Estimation Optimization Methods for Modern Machine Learning.
260 1  $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2020
300    $a 43 p.
500    $a Source: Masters Abstracts International, Volume: 82-04.
500    $a Advisor: Barner, Kenneth E.
502    $a Thesis (M.S.)--University of Delaware, 2020.
506    $a This item must not be sold to any third party vendors.
520    $a Optimization Algorithms for Neural Networks (NNs) have become a crucial and fundamental key in Artificial Intelligence. Inside Neural Networks architectures, adapting learning rates via an optimizer is a fundamental step in training neural networks. Correspondingly, choosing any optimizer can differentiate the NNs implementations performance. The Adaptive Moment Estimation method (ADAM) is one of the most common and powerful used algorithms in training Neural Networks. However, the ADAM method has shown substandard convergence in certain cases due to fluctuating and unstable learning rates. In this thesis, we integrate two state-of-the-art algorithms: the Adaptive Moment Estimation method and a normalized Momentum. We apply the normalized Momentum of the previous gradients into the current ADAM method update to preserve and stabilize the direction of the learning rate. In this work, we test our proposed method, ADAM Plus, on the MNIST digit recognition database, CIFAR10 database, Bayesian Neural Network and U-net models. The empirical results show our proposed method has better convergence and accuracy performance. ADAM Plus demonstrates improved performance in multiple convolution neural networks applications compared to various adaptive optimization algorithms.
590    $a School code: 0060.
650  4 $a Electrical engineering. $3 649834
650  4 $a Artificial intelligence. $3 516317
650  4 $a Applied mathematics. $3 2122814
653    $a Machine learning
653    $a Optimization Algorithms
653    $a Neural Networks
690    $a 0544
690    $a 0800
690    $a 0364
710 2  $a University of Delaware. $b Electrical and Computer Engineering. $3 3183620
773 0  $t Masters Abstracts International $g 82-04.
790    $a 0060
791    $a M.S.
792    $a 2020
793    $a English
856 40 $u https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=27994120
Holdings:
Barcode: W9428436
Location: Electronic Resources
Circulation category: 11.線上閱覽_V (online reading)
Material type: E-book
Call number: EB
Usage type: Normal
Loan status: On shelf
Holds: 0