Architecture optimization, training convergence and network estimation robustness of a fully connected recurrent neural network.
Record type: Bibliographic - Language material, printed : Monograph/item
Title/Author: Architecture optimization, training convergence and network estimation robustness of a fully connected recurrent neural network.
Author: Wang, Xiaoyu.
Description: 199 p.
Notes: Source: Dissertation Abstracts International, Volume: 71-05, Section: B, page: 3131.
Contained by: Dissertation Abstracts International, 71-05B.
Subject: Engineering, Mechanical.
Electronic resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3402564
ISBN: 9781109748215
LDR    03543nam 2200373 4500
001    1392738
005    20110218131343.5
008    130515s2010 ||||||||||||||||| ||eng d
020    $a 9781109748215
035    $a (UMI)AAI3402564
035    $a AAI3402564
040    $a UMI $c UMI
100 1  $a Wang, Xiaoyu. $3 1280728
245 10 $a Architecture optimization, training convergence and network estimation robustness of a fully connected recurrent neural network.
300    $a 199 p.
500    $a Source: Dissertation Abstracts International, Volume: 71-05, Section: B, page: 3131.
500    $a Adviser: Yong Huang.
502    $a Thesis (Ph.D.)--Clemson University, 2010.
520    $a Recurrent neural networks (RNN) have developed rapidly in recent years. Applications of RNN can be found in system identification, optimization, image processing, pattern recognition, classification, clustering, memory association, etc.
520    $a In this study, an optimized RNN is proposed to model nonlinear dynamical systems. A fully connected RNN is developed first, modified from a fully forward connected neural network (FFCNN) by accommodating recurrent connections among its hidden neurons. In addition, a destructive structure optimization algorithm is applied, and the extended Kalman filter (EKF) is adopted as the network's training algorithm. These two algorithms work together seamlessly to generate the optimized RNN. The enhanced modeling performance of the optimized network comes from three parts: (1) its prototype, the FFCNN, has advantages over the multilayer perceptron (MLP), the most widely used network, in terms of modeling accuracy and generalization ability; (2) the recurrency in the RNN makes it more capable of modeling nonlinear dynamical systems; and (3) the structure optimization algorithm further improves the RNN's modeling performance in terms of generalization ability and robustness.
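The "recurrent connections among its hidden neurons" described above can be pictured with a minimal sketch, shown below. This is an illustrative example only, not code from the dissertation; the class name, layer sizes, tanh activation, and random initialization are all assumptions made for the example.

# Minimal sketch (assumptions only) of a fully connected recurrent layer:
# every hidden neuron sees the current input and the previous activations
# of all hidden neurons, and a linear layer reads out the output.
import numpy as np

class FullyConnectedRNN:
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(scale=0.1, size=(n_hidden, n_in))       # input -> hidden
        self.W_rec = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden -> hidden (recurrent)
        self.W_out = rng.normal(scale=0.1, size=(n_out, n_hidden))     # hidden -> output
        self.b_h = np.zeros(n_hidden)
        self.b_o = np.zeros(n_out)
        self.h = np.zeros(n_hidden)                                    # recurrent state

    def step(self, x):
        # One time step: the new hidden state depends on the current input
        # and on every hidden neuron's previous activation.
        self.h = np.tanh(self.W_in @ x + self.W_rec @ self.h + self.b_h)
        return self.W_out @ self.h + self.b_o

# Usage: run a short random input sequence through the network.
net = FullyConnectedRNN(n_in=2, n_hidden=5, n_out=1)
outputs = [net.step(x) for x in np.random.default_rng(1).normal(size=(10, 2))]

In the dissertation's setting the weights would be trained with the EKF and pruned by the destructive structure optimization; neither of those steps is reproduced here.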
520    $a Performance studies of the proposed network focus on training convergence and robustness. For the training convergence study, the Lyapunov method is used to adapt some training parameters to guarantee training convergence, while the maximum likelihood method is used to estimate other parameters to accelerate the training process. In addition, a robustness analysis is conducted to develop a robustness measure that accounts for uncertainty propagation through the RNN via the unscented transform.
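The robustness measure itself is not spelled out in the abstract. As a generic illustration of uncertainty propagation via the unscented transform, the sketch below draws sigma points from an input mean and covariance, pushes them through a nonlinear mapping (which could be a trained network's input-output map), and recovers the output mean and covariance; the function name and parameter defaults are standard unscented-transform conventions assumed for the example, not taken from the dissertation.

# Illustrative sketch (assumptions only): propagate a Gaussian input
# through a nonlinear mapping f with the unscented transform.
import numpy as np

def unscented_propagate(f, mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    # Sigma points: the mean plus/minus columns of a scaled covariance square root.
    sqrt_cov = np.linalg.cholesky((n + lam) * cov)
    sigma = np.vstack([mean, mean + sqrt_cov.T, mean - sqrt_cov.T])
    # Weights for reconstructing the output mean and covariance.
    w_m = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    w_c = w_m.copy()
    w_m[0] = lam / (n + lam)
    w_c[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    # Push each sigma point through the nonlinear mapping.
    y = np.array([f(s) for s in sigma])
    y_mean = w_m @ y
    diff = y - y_mean
    y_cov = (w_c[:, None] * diff).T @ diff
    return y_mean, y_cov

# Usage: a 2-D Gaussian input pushed through a simple nonlinear map.
f = lambda x: np.array([np.tanh(x[0] + 0.5 * x[1])])
y_mean, y_cov = unscented_propagate(f, np.zeros(2), 0.1 * np.eye(2))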
520    $a Two case studies, the modeling of a benchmark nonlinear dynamical system and of tool wear progression in hard turning, are carried out to validate the developments in this dissertation.
520    $a The work detailed in this dissertation focuses on the creation of (1) a new method to prove and guarantee the training convergence of the RNN, and (2) a new method to quantify the robustness of the RNN using uncertainty propagation analysis. With the proposed study, the RNN and related algorithms are developed to model nonlinear dynamical systems, which can benefit future modeling applications, such as condition monitoring studies, in terms of robustness and accuracy.
590    $a School code: 0050.
650  4 $a Engineering, Mechanical. $3 783786
650  4 $a Artificial Intelligence. $3 769149
650  4 $a Computer Science. $3 626642
690    $a 0548
690    $a 0800
690    $a 0984
710 2  $a Clemson University. $b Mechanical Engineering. $3 1023734
773 0  $t Dissertation Abstracts International $g 71-05B.
790 10 $a Huang, Yong, $e advisor
790 10 $a Gowdy, John $e committee member
790 10 $a Jalili, Nader $e committee member
790 10 $a Vahidi, Ardalan $e committee member
790    $a 0050
791    $a Ph.D.
792    $a 2010
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3402564
Holdings (1 item)
Barcode: W9155877
Location: Electronic resources
Circulation category: 11.線上閱覽_V (online viewing)
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Holds: 0