Convex optimization with computational errors
Record type:
Bibliographic - electronic resource : Monograph/item
Title/Author:
Convex optimization with computational errors / by Alexander J. Zaslavski.
Author:
Zaslavski, Alexander J.
Publisher:
Cham : Springer International Publishing, 2020.
Description:
xi, 360 p. : ill., digital ; 24 cm.
Contents note:
Preface -- 1. Introduction -- 2. Subgradient Projection Algorithm -- 3. The Mirror Descent Algorithm -- 4. Gradient Algorithm with a Smooth Objective Function -- 5. An Extension of the Gradient Algorithm -- 6. Continuous Subgradient Method -- 7. An Optimization Problem with a Composite Objective Function -- 8. A Zero-Sum Game with Two Players -- 9. PDA-Based Method for Convex Optimization -- 10. Minimization of Quasiconvex Functions -- 11. Minimization of Sharp Weakly Convex Functions -- 12. A Projected Subgradient Method for Nonsmooth Problems -- References -- Index.
Contained By:
Springer eBooks
Subject:
Mathematical optimization.
Electronic resource:
https://doi.org/10.1007/978-3-030-37822-6
ISBN:
9783030378226
Convex optimization with computational errors [electronic resource] / by Alexander J. Zaslavski. - Cham : Springer International Publishing, 2020. - xi, 360 p. : ill., digital ; 24 cm. - (Springer optimization and its applications, ISSN 1931-6828 ; v.155)
This book studies approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are well known as important tools for solving optimization problems. The research presented continues from the author's 2016 book Numerical Optimization with Computational Errors. Both books study algorithms that take computational errors into account, since such errors are always present in practice. The main goal is, for a known computational error, to determine an approximate solution and the number of iterations needed to reach it. The discussion takes into consideration that each iteration of an algorithm consists of several steps, and the computational errors of the various steps are generally different. This fact, which was not accounted for in the previous book, is important in practice. For example, the subgradient projection algorithm consists of two steps: a calculation of a subgradient of the objective function and a calculation of a projection onto the feasible set. Each of these steps has its own computational error, and the two errors are generally different. The book is of interest to researchers and engineers working in optimization, as well as to experts in applications of optimization to engineering and economics, and can also be useful in preparatory courses for graduate students.
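The two-step structure described above lends itself to a short sketch. The following minimal Python example (not from the book) simulates one inexact subgradient evaluation and one inexact projection per iteration; the toy objective, the feasible set, and the error bounds eps_g and eps_p are illustrative assumptions, not the author's notation or results.

```python
import random

def projected_subgradient(subgrad, project, x0, steps, step_size, eps_g, eps_p):
    """Projected subgradient iteration with simulated computational errors.

    Each iteration has two inexact steps, mirroring the setting above:
      1) the subgradient is computed only up to an error of magnitude eps_g,
      2) the projection is computed only up to an error of magnitude eps_p.
    """
    x = x0
    for _ in range(steps):
        g = subgrad(x) + random.uniform(-eps_g, eps_g)   # inexact subgradient
        y = x - step_size * g
        x = project(y) + random.uniform(-eps_p, eps_p)   # inexact projection
    return x

# Toy problem: minimize f(x) = |x - 3| over the feasible set C = [0, 2].
# The exact constrained minimizer is x = 2.
subgrad = lambda x: 1.0 if x >= 3 else -1.0          # a subgradient of |x - 3|
project = lambda y: min(max(y, 0.0), 2.0)            # exact projection onto [0, 2]

x = projected_subgradient(subgrad, project, x0=0.0, steps=200,
                          step_size=0.05, eps_g=0.01, eps_p=0.001)
print(x)
```

With exact computations (eps_g = eps_p = 0) the iterates reach the minimizer x = 2; with nonzero errors they settle in a neighborhood of it whose size depends on the step size and the two error magnitudes, which is the kind of behavior the book quantifies.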
ISBN: 9783030378226
Standard No.: 10.1007/978-3-030-37822-6 (doi)
Subjects--Topical Terms:
Mathematical optimization.
LC Class. No.: QA402.5 / .Z375 2020
Dewey Class. No.: 519.6
MARC record
LDR 03174nmm a2200349 a 4500
001 2215810
003 DE-He213
005 20200624154947.0
006 m d
007 cr nn 008maaau
008 201120s2020 sz s 0 eng d
020 $a 9783030378226 $q (electronic bk.)
020 $a 9783030378219 $q (paper)
024 7 $a 10.1007/978-3-030-37822-6 $2 doi
035 $a 978-3-030-37822-6
040 $a GP $c GP
041 0 $a eng
050 4 $a QA402.5 $b .Z375 2020
072 7 $a PBKQ $2 bicssc
072 7 $a MAT005000 $2 bisacsh
072 7 $a PBKQ $2 thema
072 7 $a PBU $2 thema
082 04 $a 519.6 $2 23
090 $a QA402.5 $b .Z38 2020
100 1 $a Zaslavski, Alexander J. $3 814771
245 10 $a Convex optimization with computational errors $h [electronic resource] / $c by Alexander J. Zaslavski.
260 $a Cham : $b Springer International Publishing : $b Imprint: Springer, $c 2020.
300 $a xi, 360 p. : $b ill., digital ; $c 24 cm.
490 1 $a Springer optimization and its applications, $x 1931-6828 ; $v v.155
505 0 $a Preface -- 1. Introduction -- 2. Subgradient Projection Algorithm -- 3. The Mirror Descent Algorithm -- 4. Gradient Algorithm with a Smooth Objective Function -- 5. An Extension of the Gradient Algorithm -- 6. Continuous Subgradient Method -- 7. An Optimization Problem with a Composite Objective Function -- 8. A Zero-Sum Game with Two Players -- 9. PDA-Based Method for Convex Optimization -- 10. Minimization of Quasiconvex Functions -- 11. Minimization of Sharp Weakly Convex Functions -- 12. A Projected Subgradient Method for Nonsmooth Problems -- References -- Index.
520 $a This book studies approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are well known as important tools for solving optimization problems. The research presented continues from the author's 2016 book Numerical Optimization with Computational Errors. Both books study algorithms that take computational errors into account, since such errors are always present in practice. The main goal is, for a known computational error, to determine an approximate solution and the number of iterations needed to reach it. The discussion takes into consideration that each iteration of an algorithm consists of several steps, and the computational errors of the various steps are generally different. This fact, which was not accounted for in the previous book, is important in practice. For example, the subgradient projection algorithm consists of two steps: a calculation of a subgradient of the objective function and a calculation of a projection onto the feasible set. Each of these steps has its own computational error, and the two errors are generally different. The book is of interest to researchers and engineers working in optimization, as well as to experts in applications of optimization to engineering and economics, and can also be useful in preparatory courses for graduate students.
650 0 $a Mathematical optimization. $3 517763
650 14 $a Calculus of Variations and Optimal Control; Optimization. $3 898674
650 24 $a Computational Mathematics and Numerical Analysis. $3 891040
710 2 $a SpringerLink (Online service) $3 836513
773 0 $t Springer eBooks
830 0 $a Springer optimization and its applications ; $v v.155. $3 3447640
856 40 $u https://doi.org/10.1007/978-3-030-37822-6
950 $a Mathematics and Statistics (Springer-11649)
Holdings (1 item)
Barcode: W9390714
Location: Electronic resources
Circulation category: 11. Online reading_V
Material type: E-book
Call number: EB QA402.5 .Z375 2020
Use type: Normal
Loan status: On shelf
Holds: 0