Evolutionary learning = advances in theories and algorithms /
Record type:
Bibliographic - electronic resource : Monograph/item
Title / Author:
Evolutionary learning / by Zhi-Hua Zhou, Yang Yu, Chao Qian.
Other title:
advances in theories and algorithms /
Author:
Zhou, Zhi-Hua.
Other authors:
Yu, Yang.
Qian, Chao.
Publisher:
Singapore : Springer Singapore, 2019.
Physical description:
xii, 361 p. : ill., digital ; 24 cm.
Contents note:
1. Introduction -- 2. Preliminaries -- 3. Running Time Analysis: Convergence-based Analysis -- 4. Running Time Analysis: Switch Analysis -- 5. Running Time Analysis: Comparison and Unification -- 6. Approximation Analysis: SEIP -- 7. Boundary Problems of EAs -- 8. Recombination -- 9. Representation -- 10. Inaccurate Fitness Evaluation -- 11. Population -- 12. Constrained Optimization -- 13. Selective Ensemble -- 14. Subset Selection -- 15. Subset Selection: k-Submodular Maximization -- 16. Subset Selection: Ratio Minimization -- 17. Subset Selection: Noise -- 18. Subset Selection: Acceleration.
Contained By:
Springer eBooks
Subject:
Machine learning.
Electronic resource:
https://doi.org/10.1007/978-981-13-5956-9
ISBN:
9789811359569
MARC record:
LDR    03092nmm a2200325 a 4500
001    2191191
003    DE-He213
005    20190523151322.0
006    m d
007    cr nn 008maaau
008    200504s2019 si s 0 eng d
020    $a 9789811359569 $q (electronic bk.)
020    $a 9789811359552 $q (paper)
024 7  $a 10.1007/978-981-13-5956-9 $2 doi
035    $a 978-981-13-5956-9
040    $a GP $c GP
041 0  $a eng
050 4  $a Q325.5 $b .Z46 2019
072 7  $a UYQ $2 bicssc
072 7  $a COM004000 $2 bisacsh
072 7  $a UYQ $2 thema
082 04 $a 006.31 $2 23
090    $a Q325.5 $b .Z63 2019
100 1  $a Zhou, Zhi-Hua. $3 927817
245 10 $a Evolutionary learning $h [electronic resource] : $b advances in theories and algorithms / $c by Zhi-Hua Zhou, Yang Yu, Chao Qian.
260    $a Singapore : $b Springer Singapore : $b Imprint: Springer, $c 2019.
300    $a xii, 361 p. : $b ill., digital ; $c 24 cm.
505 0  $a 1. Introduction -- 2. Preliminaries -- 3. Running Time Analysis: Convergence-based Analysis -- 4. Running Time Analysis: Switch Analysis -- 5. Running Time Analysis: Comparison and Unification -- 6. Approximation Analysis: SEIP -- 7. Boundary Problems of EAs -- 8. Recombination -- 9. Representation -- 10. Inaccurate Fitness Evaluation -- 11. Population -- 12. Constrained Optimization -- 13. Selective Ensemble -- 14. Subset Selection -- 15. Subset Selection: k-Submodular Maximization -- 16. Subset Selection: Ratio Minimization -- 17. Subset Selection: Noise -- 18. Subset Selection: Acceleration.
520    $a Many machine learning tasks involve solving complex optimization problems, such as working on non-differentiable, non-continuous, and non-unique objective functions; in some cases it can prove difficult to even define an explicit objective function. Evolutionary learning applies evolutionary algorithms to address optimization problems in machine learning, and has yielded encouraging outcomes in many applications. However, due to the heuristic nature of evolutionary optimization, most outcomes to date have been empirical and lack theoretical support. This shortcoming has kept evolutionary learning from being well received in the machine learning community, which favors solid theoretical approaches. Recently there have been considerable efforts to address this issue. This book presents a range of those efforts, divided into four parts. Part I briefly introduces readers to evolutionary learning and provides some preliminaries, while Part II presents general theoretical tools for the analysis of running time and approximation performance in evolutionary algorithms. Based on these general tools, Part III presents a number of theoretical findings on major factors in evolutionary optimization, such as recombination, representation, inaccurate fitness evaluation, and population. In closing, Part IV addresses the development of evolutionary learning algorithms with provable theoretical guarantees for several representative tasks, in which evolutionary learning offers excellent performance.
650 0  $a Machine learning. $3 533906
650 14 $a Artificial Intelligence. $3 769149
650 24 $a Algorithm Analysis and Problem Complexity. $3 891007
650 24 $a Math Applications in Computer Science. $3 891004
700 1  $a Yu, Yang. $3 1672361
700 1  $a Qian, Chao. $3 3410317
710 2  $a SpringerLink (Online service) $3 836513
773 0  $t Springer eBooks
856 40 $u https://doi.org/10.1007/978-981-13-5956-9
950    $a Computer Science (Springer-11645)
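The abstract in the 520 field above describes evolutionary learning as the use of evolutionary algorithms to solve optimization problems in machine learning, with the book's theory parts centred on running-time and approximation analysis. As a hedged illustration only, not taken from the book, the following minimal Python sketch shows a (1+1) evolutionary algorithm on the OneMax toy problem, the kind of simple setting commonly used in running-time analyses of evolutionary algorithms; the function names and parameters here are illustrative assumptions.

import random

def one_max(bits):
    # OneMax fitness: the number of 1-bits; the optimum is the all-ones string.
    return sum(bits)

def one_plus_one_ea(n=20, max_evals=10_000, seed=0):
    # (1+1)-EA: keep a single parent bit-string, flip each bit independently
    # with probability 1/n, and accept the offspring if it is no worse.
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n)]
    fitness = one_max(parent)
    for evals in range(1, max_evals + 1):
        child = [bit ^ (rng.random() < 1.0 / n) for bit in parent]
        child_fitness = one_max(child)
        if child_fitness >= fitness:
            parent, fitness = child, child_fitness
        if fitness == n:
            return parent, evals
    return parent, max_evals

solution, evaluations = one_plus_one_ea()
print(f"OneMax optimum reached after {evaluations} evaluations")

For reference, the expected number of evaluations of the (1+1)-EA on OneMax is known to grow on the order of n log n; results of this flavour are what running-time analyses of evolutionary algorithms formalize.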
Holdings
Barcode: W9373835
Location: Electronic resources
Circulation category: 11. Online reading_V
Material type: E-book
Call number: EB Q325.5 .Z46 2019
Use type: Normal
Loan status: On shelf
Hold status: 0