Recent Advances in Randomized Methods for Big Data Optimization.
Record type: Bibliographic - electronic resource : Monograph/item
Title/Author: Recent Advances in Randomized Methods for Big Data Optimization. / Liu, Jie.
Publisher: Ann Arbor : ProQuest Dissertations & Theses, 2018
Pages: 187 p.
Note: Source: Dissertation Abstracts International, Volume: 80-07(E), Section: B.
Contained by: Dissertation Abstracts International 80-07B(E).
Subject: Artificial intelligence.
Electronic resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10973840
ISBN: 9780438888036
Liu, Jie. Recent Advances in Randomized Methods for Big Data Optimization. - Ann Arbor : ProQuest Dissertations & Theses, 2018 - 187 p.
Source: Dissertation Abstracts International, Volume: 80-07(E), Section: B.
Thesis (Ph.D.)--Lehigh University, 2018.
In this thesis, we discuss and develop randomized algorithms for big data problems. In particular, we study finite-sum optimization with newly emerged variance-reduction methods (Chapter 2), explore the efficiency of second-order information applied to both convex and non-convex finite-sum objectives (Chapter 3), and employ fast first-order methods in power system problems (Chapter 4).
ISBN: 9780438888036
Subjects--Topical Terms: Artificial intelligence.
LDR 03923nmm a2200349 4500
001 2201359
005 20190429062347.5
008 201008s2018 ||||||||||||||||| ||eng d
020    $a 9780438888036
035    $a (MiAaPQ)AAI10973840
035    $a (MiAaPQ)lehigh:12024
035    $a AAI10973840
040    $a MiAaPQ $c MiAaPQ
100 1  $a Liu, Jie. $3 1026842
245 10 $a Recent Advances in Randomized Methods for Big Data Optimization.
260 1  $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2018
300    $a 187 p.
500    $a Source: Dissertation Abstracts International, Volume: 80-07(E), Section: B.
500    $a Adviser: Martin Takac.
502    $a Thesis (Ph.D.)--Lehigh University, 2018.
520    $a In this thesis, we discuss and develop randomized algorithms for big data problems. In particular, we study finite-sum optimization with newly emerged variance-reduction methods (Chapter 2), explore the efficiency of second-order information applied to both convex and non-convex finite-sum objectives (Chapter 3), and employ fast first-order methods in power system problems (Chapter 4).
520    $a In Chapter 2, we propose two variance-reduced gradient algorithms: mS2GD and SARAH. mS2GD incorporates a mini-batching scheme to improve the theoretical complexity and practical performance of SVRG/S2GD, aiming to minimize a strongly convex function represented as the sum of the average of a large number of smooth convex functions and a simple non-smooth convex regularizer. SARAH, short for StochAstic Recursive grAdient algoritHm, uses a stochastic recursive gradient and targets minimizing the average of a large number of smooth functions in both convex and non-convex cases. Both methods fall into the category of variance-reduced optimization and obtain a total complexity of O((n + kappa) log(1/epsilon)) to achieve an epsilon-accurate solution for strongly convex objectives, while SARAH also maintains sub-linear convergence for non-convex problems. In addition, SARAH admits a practical variant, SARAH+, motivated by the linear convergence of the expected stochastic gradients in the inner loops.
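The recursive-gradient idea behind SARAH (an estimator updated from the previous iterate and refreshed by a periodic full gradient) can be sketched in a few lines. This is a minimal illustration on a toy least-squares finite sum, not code from the thesis; the function names `sarah` and `grad_i` and the step size are our assumptions.

```python
import numpy as np

def sarah(grad_i, n, w0, eta, outer_iters, inner_iters, rng):
    """Minimal SARAH sketch for min_w (1/n) * sum_i f_i(w).

    grad_i(w, i): gradient of the i-th component f_i at w.
    Defining step: v_t = grad_i(w_t, i) - grad_i(w_{t-1}, i) + v_{t-1},
    i.e. the estimator is recursive, unlike SVRG's fixed-snapshot one.
    """
    w = w0.copy()
    for _ in range(outer_iters):
        # Outer step: refresh the estimator with a full gradient.
        v = np.mean([grad_i(w, i) for i in range(n)], axis=0)
        w_prev, w = w, w - eta * v
        for _ in range(inner_iters - 1):
            i = rng.integers(n)
            v = grad_i(w, i) - grad_i(w_prev, i) + v  # recursive update
            w_prev, w = w, w - eta * v
    return w

# Toy strongly convex finite sum: f_i(w) = 0.5 * (a_i @ w - b_i)**2
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 5))
x_true = rng.normal(size=5)
b = A @ x_true                      # consistent system, minimizer = x_true
g = lambda w, i: (A[i] @ w - b[i]) * A[i]
w_hat = sarah(g, n=50, w0=np.zeros(5), eta=0.02, outer_iters=30,
              inner_iters=50, rng=rng)
```

Because the toy system is consistent, every component gradient vanishes at the minimizer, so the iterates converge to `x_true`.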
520    $a In Chapter 3, we show that randomized batches can be combined with second-order information to improve convergence in both theory and practice, within an L-BFGS framework that offers a novel approach to finite-sum optimization problems. We provide theoretical analyses for both convex and non-convex objectives. We also propose LBFGS-F, a variant in which the Fisher information matrix is used in place of Hessian information, and prove that it is applicable in a distributed environment for the popular least-squares and cross-entropy losses.
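As an illustration of the general framework (not the thesis's LBFGS-F), here is a sketch of L-BFGS driven by randomized batches: the gradient and the curvature pairs are both evaluated on the same random batch, and the standard two-loop recursion applies the implicit inverse-Hessian estimate. All names, step sizes, and batch sizes are our assumptions.

```python
import numpy as np
from collections import deque

def stochastic_lbfgs(batch_grad, n, w0, eta, iters, batch, memory, rng):
    """Sketch of L-BFGS with randomized batches.

    batch_grad(w, idx): gradient of the average loss over samples idx.
    Curvature pairs (s, y) come from gradient differences on the SAME
    batch, so y approximates a batch Hessian acting on s.
    """
    pairs = deque(maxlen=memory)        # limited-memory (s, y) history
    w = w0.copy()
    for _ in range(iters):
        idx = rng.choice(n, size=batch, replace=False)
        g = batch_grad(w, idx)
        # Two-loop recursion: q = H_k @ g without ever forming H_k.
        q, alphas = g.copy(), []
        for s, y in reversed(pairs):    # newest pair first
            a = (s @ q) / (y @ s)
            alphas.append(a)
            q -= a * y
        if pairs:
            s, y = pairs[-1]
            q *= (y @ s) / (y @ y)      # initial scaling H0 = gamma * I
        for (s, y), a in zip(pairs, reversed(alphas)):  # oldest first
            q += (a - (y @ q) / (y @ s)) * s
        w_new = w - eta * q
        y_vec = batch_grad(w_new, idx) - g   # same-batch difference
        s_vec = w_new - w
        if s_vec @ y_vec > 1e-10:            # keep only valid curvature
            pairs.append((s_vec, y_vec))
        w = w_new
    return w

# Toy example: least squares, f_i(w) = 0.5 * (a_i @ w - b_i)**2
rng = np.random.default_rng(2)
A = rng.normal(size=(50, 5))
x_true = rng.normal(size=5)
b = A @ x_true
bg = lambda w, idx: ((A[idx] @ w - b[idx]) @ A[idx]) / len(idx)
w_hat = stochastic_lbfgs(bg, n=50, w0=np.zeros(5), eta=0.5,
                         iters=200, batch=25, memory=5, rng=rng)
```

The curvature (secant) check before storing a pair is what keeps the inverse-Hessian estimate positive definite even though each batch sees a different quadratic.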
520    $a In Chapter 4, we develop fast randomized algorithms for solving polynomial optimization problems arising from alternating-current optimal power flow (ACOPF) in the power systems field. Traditional research on power system problems focuses on solvers using second-order methods, and no randomized algorithms had been developed. First, we propose a coordinate-descent algorithm as an online solver for time-varying optimization problems in power systems. We bound from above the difference between the current approximate optimal cost generated by our algorithm and the optimal cost of a relaxation using the most recent data, by a function of the properties of the instance and the rate of change of the instance over time. Second, we focus on a steady-state problem in power systems and study means of switching from solving a convex relaxation to a Newton method operating on a non-convex (augmented) Lagrangian of the problem.
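The online coordinate-descent idea rests on updates cheap enough to run between data refreshes while tracking a drifting optimum. The core update can be illustrated on a static convex quadratic; this toy sketch is ours and is not the ACOPF formulation from the thesis.

```python
import numpy as np

def coordinate_descent(Q, c, x0, steps, rng):
    """Randomized coordinate descent for min_x 0.5*x@Q@x + c@x (Q symmetric PD).

    Each step exactly minimizes over one randomly chosen coordinate --
    the kind of cheap update an online solver can afford between data
    refreshes in a time-varying setting.
    """
    x = x0.copy()
    for _ in range(steps):
        j = rng.integers(len(x))
        g_j = Q[j] @ x + c[j]      # partial derivative along coordinate j
        x[j] -= g_j / Q[j, j]      # exact one-dimensional minimization
    return x

# Toy instance with a known minimizer x_true (so that Q @ x_true + c = 0).
rng = np.random.default_rng(3)
A = rng.normal(size=(50, 5))
Q = A.T @ A / 50 + np.eye(5)       # well-conditioned PD matrix
x_true = rng.normal(size=5)
c = -Q @ x_true
x_hat = coordinate_descent(Q, c, np.zeros(5), steps=500, rng=rng)
```

In the time-varying setting of the thesis, `Q` and `c` would change between steps, and the tracking-error bound controls how far the iterate can lag behind the moving optimum.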
590    $a School code: 0105.
650  4 $a Artificial intelligence. $3 516317
650  4 $a Statistics. $3 517247
650  4 $a Industrial engineering. $3 526216
690    $a 0800
690    $a 0463
690    $a 0546
710 2  $a Lehigh University. $b Information and Systems Engineering. $3 3175565
773 0  $t Dissertation Abstracts International $g 80-07B(E).
790    $a 0105
791    $a Ph.D.
792    $a 2018
793    $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10973840
Holdings:
Barcode | Location | Circulation category | Material type | Call number | Use type | Loan status | Holds | Notes | Attachments
W9377908 | Electronic resources | 11. Online reading_V | E-book | EB | Normal use | On shelf | 0 | | 