Advancing Optimization for Modern Machine Learning.
Record type:
Bibliographic - Electronic resource : Monograph/item
Title/Author:
Advancing Optimization for Modern Machine Learning.
Author:
Levy, Daniel.
Publisher:
Ann Arbor : ProQuest Dissertations & Theses, 2021
Physical description:
301 p.
Notes:
Source: Dissertations Abstracts International, Volume: 83-09, Section: B.
Contained by:
Dissertations Abstracts International, 83-09B.
Subject:
Convex analysis.
Electronic resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=29003828
ISBN:
9798209787136
Thesis (Ph.D.)--Stanford University, 2021.
This item must not be sold to any third party vendors.
Machine learning is a transformative computational tool on its way to revolutionizing a number of technologies and scientific applications. However, recent successes in artificial intelligence and machine learning, and the resulting imminent widespread deployment of models, have transformed the classical machine learning pipeline. First, the sheer scale of available data---in both quantity and dimensionality---has exploded. Furthermore, modern machine learning architectures come with an exponential number of design choices and hyperparameters, yet they are all optimized using generic stochastic gradient methods. This highlights the need for adaptive gradient methods that perform adequately without prior knowledge of the instance they will be given. Institutions then deploy these models in the wild and expect them to provide good predictions even on out-of-distribution inputs---this emphasizes the need for robust models. Finally, as we collect ever more user data, we wish that models trained on this data do not compromise the privacy of individuals present in the training set as we release these models to the public. In this thesis, we show that solving these emerging problems requires fundamental advances in optimization. More specifically, we first present new theoretical results on understanding the optimality of adaptive gradient algorithms and show a practical use case of adaptive methods in the context of gradient-based samplers. We then present scalable methods for min-max optimization with the goal of efficiently solving robust objectives. We conclude by developing private optimization methods that optimally learn under more stringent privacy requirements, as well as adaptive methods that add the "right amount of noise" and significantly decrease the price of privacy on easy instances.
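The abstract's closing idea, private optimization that perturbs each gradient with noise, can be illustrated with a toy sketch. Everything below (the 1-D quadratic objective, step size, noise scale, and function names) is a hypothetical example for intuition only, not the thesis's actual method or noise calibration.

```python
import random

def grad(w):
    # Gradient of the toy objective f(w) = (w - 3)^2, minimized at w* = 3.
    return 2.0 * (w - 3.0)

def noisy_gd(steps=500, lr=0.05, noise_scale=0.1, seed=0):
    # Gradient descent where each gradient is perturbed with Gaussian
    # noise, in the spirit of noise-adding private optimizers.
    rng = random.Random(seed)
    w = 0.0
    for _ in range(steps):
        g = grad(w) + rng.gauss(0.0, noise_scale)  # perturbed gradient
        w -= lr * g
    return w

w_final = noisy_gd()
print(w_final)  # settles close to the minimizer w* = 3
```

The noise scale governs the trade-off the abstract alludes to: larger noise gives stronger privacy but slower, noisier convergence, which is why calibrating the "right amount of noise" to the instance matters.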
LDR 02888nmm a2200349 4500
001 2345733
005 20220613063807.5
008 241004s2021 ||||||||||||||||| ||eng d
020 $a 9798209787136
035 $a (MiAaPQ)AAI29003828
035 $a (MiAaPQ)STANFORDnv133bt5905
035 $a AAI29003828
040 $a MiAaPQ $c MiAaPQ
100 1 $a Levy, Daniel. $3 3684727
245 1 0 $a Advancing Optimization for Modern Machine Learning.
260 1 $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2021
300 $a 301 p.
500 $a Source: Dissertations Abstracts International, Volume: 83-09, Section: B.
500 $a Advisor: Duchi, John; Liang, Percy.
502 $a Thesis (Ph.D.)--Stanford University, 2021.
506 $a This item must not be sold to any third party vendors.
520 $a Machine learning is a transformative computational tool on its way to revolutionizing a number of technologies and scientific applications. However, recent successes in artificial intelligence and machine learning, and the resulting imminent widespread deployment of models, have transformed the classical machine learning pipeline. First, the sheer scale of available data---in both quantity and dimensionality---has exploded. Furthermore, modern machine learning architectures come with an exponential number of design choices and hyperparameters, yet they are all optimized using generic stochastic gradient methods. This highlights the need for adaptive gradient methods that perform adequately without prior knowledge of the instance they will be given. Institutions then deploy these models in the wild and expect them to provide good predictions even on out-of-distribution inputs---this emphasizes the need for robust models. Finally, as we collect ever more user data, we wish that models trained on this data do not compromise the privacy of individuals present in the training set as we release these models to the public. In this thesis, we show that solving these emerging problems requires fundamental advances in optimization. More specifically, we first present new theoretical results on understanding the optimality of adaptive gradient algorithms and show a practical use case of adaptive methods in the context of gradient-based samplers. We then present scalable methods for min-max optimization with the goal of efficiently solving robust objectives. We conclude by developing private optimization methods that optimally learn under more stringent privacy requirements, as well as adaptive methods that add the "right amount of noise" and significantly decrease the price of privacy on easy instances.
590 $a School code: 0212.
650 4 $a Convex analysis. $3 3681761
650 4 $a Multilingualism. $3 598147
650 4 $a Algorithms. $3 536374
650 4 $a Privacy. $3 528582
650 4 $a Optimization. $3 891104
650 4 $a Bilingual education. $3 2122778
650 4 $a Computer science. $3 523869
650 4 $a Education. $3 516579
650 4 $a Language. $3 643551
650 4 $a Mathematics. $3 515831
690 $a 0282
690 $a 0984
690 $a 0515
690 $a 0679
690 $a 0405
710 2 $a Stanford University. $3 754827
773 0 $t Dissertations Abstracts International $g 83-09B.
790 $a 0212
791 $a Ph.D.
792 $a 2021
793 $a English
856 4 0 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=29003828
Holdings (1 record, page 1):
Barcode: W9468171
Location: Electronic resources
Circulation category: 11. Online reading_V
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Holds: 0