Hyperparameter optimization in machine learning = make your machine learning and deep learning models more efficient /
Record type:
Bibliographic - electronic resource : Monograph/item
Title / Author:
Hyperparameter optimization in machine learning / by Tanay Agrawal.
Other title:
make your machine learning and deep learning models more efficient
Author:
Agrawal, Tanay.
Publisher:
Berkeley, CA : Apress, 2021.
Description:
xix, 166 p. : ill., digital ; 24 cm.
Contents note:
Chapter 1: Hyperparameters -- Chapter 2: Brute Force Hyperparameter Tuning -- Chapter 3: Distributed Hyperparameter Optimization -- Chapter 4: Sequential Model-Based Global Optimization and Its Hierarchical -- Chapter 5: Using HyperOpt -- Chapter 6: Hyperparameter Generating Condition Generative Adversarial Neural.
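Chapter 2 covers brute-force tuning. As an illustrative sketch (not code from the book), an exhaustive grid search can be written in plain Python; the objective function and the two hyperparameters here are hypothetical stand-ins for a real train-and-validate step:

```python
from itertools import product

def validation_error(learning_rate, num_trees):
    """Hypothetical objective: stands in for training a model and
    measuring its validation error at one hyperparameter setting."""
    return (learning_rate - 0.1) ** 2 + (num_trees - 200) ** 2 / 1e4

# Brute-force (grid) search: evaluate every combination in the grid.
grid = {
    "learning_rate": [0.01, 0.1, 0.5],
    "num_trees": [100, 200, 400],
}
best = min(
    (dict(zip(grid, values)) for values in product(*grid.values())),
    key=lambda params: validation_error(**params),
)
print(best)  # the grid point with the lowest validation error
```

The cost of this approach grows multiplicatively with each added hyperparameter, which is what motivates the distributed and model-based methods of the later chapters.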
Contained By:
Springer Nature eBook
Subject:
Machine learning.
Electronic resource:
https://doi.org/10.1007/978-1-4842-6579-6
ISBN:
9781484265796
Hyperparameter optimization in machine learning [electronic resource] : make your machine learning and deep learning models more efficient / by Tanay Agrawal. - Berkeley, CA : Apress, 2021. - xix, 166 p. : ill., digital ; 24 cm.
Dive into hyperparameter tuning of machine learning models and focus on what hyperparameters are and how they work. This book discusses different techniques of hyperparameter tuning, from the basics to advanced methods. It is a step-by-step guide to hyperparameter optimization, starting with what hyperparameters are and how they affect different aspects of machine learning models. It then goes through some basic (brute-force) algorithms of hyperparameter optimization. Further, the author addresses the problem of time and memory constraints using distributed optimization methods. Next, you'll learn about Bayesian optimization for hyperparameter search, which learns from its previous history. The book discusses frameworks such as Hyperopt and Optuna, which implement sequential model-based global optimization (SMBO) algorithms, focusing on aspects such as the creation of search spaces and the distributed optimization features of these libraries. Hyperparameter Optimization in Machine Learning builds an understanding of how these algorithms work and how you can use them in real-life data science problems. The final chapter summarizes the role of hyperparameter optimization in automated machine learning and ends with a tutorial on creating your own AutoML script. Hyperparameter optimization is a tedious task, so sit back and let these algorithms do your work. You will:
- Discover how changes in hyperparameters affect the model's performance
- Apply different hyperparameter tuning algorithms to data science problems
- Work with Bayesian optimization methods to create efficient machine learning and deep learning models
- Distribute hyperparameter optimization using a cluster of machines
- Approach automated machine learning using hyperparameter optimization
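The abstract contrasts brute-force search with history-aware (Bayesian/SMBO) search. A simple baseline between the two is random search, sketched below in plain Python; the objective function and parameter ranges are hypothetical, standing in for a real model's validation error:

```python
import random

def validation_error(learning_rate, dropout):
    """Hypothetical stand-in for a real train-and-evaluate step."""
    return (learning_rate - 0.05) ** 2 + (dropout - 0.3) ** 2

random.seed(0)  # fixed seed so the run is reproducible
best_params, best_error = None, float("inf")
for _ in range(200):  # each trial samples an independent configuration
    params = {
        "learning_rate": random.uniform(0.001, 0.1),
        "dropout": random.uniform(0.0, 0.5),
    }
    error = validation_error(**params)
    if error < best_error:
        best_params, best_error = params, error

print(best_params, best_error)
```

Unlike this baseline, the SMBO methods implemented by Hyperopt and Optuna use the history of past trials to choose where to sample next, which is the improvement the book's middle chapters develop.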
ISBN: 9781484265796
Standard No.: 10.1007/978-1-4842-6579-6 (doi)
Subjects--Topical Terms:
Machine learning.
LC Class. No.: Q325.5 / .A47 2021
Dewey Class. No.: 006.31
LDR  03184nmm a2200325 a 4500
001  2236639
003  DE-He213
005  20201128115901.0
006  m     d
007  cr nn 008maaau
008  211111s2021    cau     s         0 eng d
020  $a 9781484265796 $q (electronic bk.)
020  $a 9781484265789 $q (paper)
024 7  $a 10.1007/978-1-4842-6579-6 $2 doi
035  $a 978-1-4842-6579-6
040  $a GP $c GP $e rda
041 0  $a eng
050  4 $a Q325.5 $b .A47 2021
072 7  $a UYQM $2 bicssc
072 7  $a COM004000 $2 bisacsh
072 7  $a UYQM $2 thema
082 04 $a 006.31 $2 23
090  $a Q325.5 $b .A277 2021
100 1  $a Agrawal, Tanay. $3 3488202
245 10 $a Hyperparameter optimization in machine learning $h [electronic resource] : $b make your machine learning and deep learning models more efficient / $c by Tanay Agrawal.
260  $a Berkeley, CA : $b Apress : $b Imprint: Apress, $c 2021.
300  $a xix, 166 p. : $b ill., digital ; $c 24 cm.
505 0  $a Chapter 1: Hyperparameters -- Chapter 2: Brute Force Hyperparameter Tuning -- Chapter 3: Distributed Hyperparameter Optimization -- Chapter 4: Sequential Model-Based Global Optimization and Its Hierarchical -- Chapter 5: Using HyperOpt -- Chapter 6: Hyperparameter Generating Condition Generative Adversarial Neural.
520  $a Dive into hyperparameter tuning of machine learning models and focus on what hyperparameters are and how they work. This book discusses different techniques of hyperparameter tuning, from the basics to advanced methods. It is a step-by-step guide to hyperparameter optimization, starting with what hyperparameters are and how they affect different aspects of machine learning models. It then goes through some basic (brute-force) algorithms of hyperparameter optimization. Further, the author addresses the problem of time and memory constraints using distributed optimization methods. Next, you'll learn about Bayesian optimization for hyperparameter search, which learns from its previous history. The book discusses frameworks such as Hyperopt and Optuna, which implement sequential model-based global optimization (SMBO) algorithms, focusing on aspects such as the creation of search spaces and the distributed optimization features of these libraries. Hyperparameter Optimization in Machine Learning builds an understanding of how these algorithms work and how you can use them in real-life data science problems. The final chapter summarizes the role of hyperparameter optimization in automated machine learning and ends with a tutorial on creating your own AutoML script. Hyperparameter optimization is a tedious task, so sit back and let these algorithms do your work. You will: Discover how changes in hyperparameters affect the model's performance; apply different hyperparameter tuning algorithms to data science problems; work with Bayesian optimization methods to create efficient machine learning and deep learning models; distribute hyperparameter optimization using a cluster of machines; approach automated machine learning using hyperparameter optimization.
650  0 $a Machine learning. $3 533906
650  0 $a Mathematical optimization $x Computer programs. $3 2111783
650  0 $a Open source software. $3 581998
650  0 $a Computer programming. $3 527209
650 14 $a Machine Learning. $3 3382522
650 24 $a Python. $3 3201289
650 24 $a Open Source. $3 2210577
710 2  $a SpringerLink (Online service) $3 836513
773 0  $t Springer Nature eBook
856 40 $u https://doi.org/10.1007/978-1-4842-6579-6
950  $a Professional and Applied Computing (SpringerNature-12059)
Holdings (1 item):
Barcode: W9398524
Location: Electronic resource
Circulation category: 11. Online reading_V
Material type: e-book
Call number: EB Q325.5 .A47 2021
Usage type: Normal
Loan status: On shelf
Holds: 0