Numerical optimization with computational errors
Record Type:
Electronic resource : Monograph/item
Title/Author:
Numerical optimization with computational errors/ by Alexander J. Zaslavski.
Author:
Zaslavski, Alexander J.
Published:
Cham : Springer International Publishing : Imprint: Springer, 2016.
Description:
ix, 304 p. : ill., digital ; 24 cm.
Contents:
1. Introduction -- 2. Subgradient Projection Algorithm -- 3. The Mirror Descent Algorithm -- 4. Gradient Algorithm with a Smooth Objective Function -- 5. An Extension of the Gradient Algorithm -- 6. Weiszfeld's Method -- 7. The Extragradient Method for Convex Optimization -- 8. A Projected Subgradient Method for Nonsmooth Problems -- 9. Proximal Point Method in Hilbert Spaces -- 10. Proximal Point Methods in Metric Spaces -- 11. Maximal Monotone Operators and the Proximal Point Algorithm -- 12. The Extragradient Method for Solving Variational Inequalities -- 13. A Common Solution of a Family of Variational Inequalities -- 14. Continuous Subgradient Method -- 15. Penalty Methods -- 16. Newton's method -- References -- Index.
Contained By:
Springer eBooks
Subject:
Mathematical optimization.
Online resource:
http://dx.doi.org/10.1007/978-3-319-30921-7
ISBN:
9783319309217
Zaslavski, Alexander J.
Numerical optimization with computational errors [electronic resource] / by Alexander J. Zaslavski. - Cham : Springer International Publishing, 2016. - ix, 304 p. : ill., digital ; 24 cm. - (Springer optimization and its applications, 1931-6828 ; v.108). - (Springer optimization and its applications ; v.52).
This book studies approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space; these algorithms are examined taking computational errors into account. The author shows that the algorithms generate a good approximate solution if the computational errors are bounded from above by a small positive constant. Known computational errors are also examined with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative. The monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods, and Newton's method.
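The abstract's central claim is that a descent algorithm still reaches a good approximate solution when every step is perturbed by a computational error bounded by a small positive constant. A minimal sketch of that idea (not taken from the book; the function, step size, and error model are illustrative assumptions) in Python:

```python
# Sketch of the bounded-computational-error theme: gradient descent
# where every gradient evaluation carries an error with |error| <= delta.
# For a strongly convex objective the iterates settle into a neighborhood
# of the true minimizer whose radius scales with delta.
import random

def noisy_gradient_descent(grad, x0, step, delta, iters, seed=0):
    """Gradient descent with a bounded random error added to each
    gradient evaluation (delta = 0 recovers the exact method)."""
    rng = random.Random(seed)
    x = x0
    for _ in range(iters):
        error = rng.uniform(-delta, delta)  # bounded computational error
        x = x - step * (grad(x) + error)
    return x

# Illustrative objective f(x) = x^2, minimizer x* = 0, gradient f'(x) = 2x
grad = lambda x: 2.0 * x

exact = noisy_gradient_descent(grad, x0=5.0, step=0.1, delta=0.0, iters=200)
noisy = noisy_gradient_descent(grad, x0=5.0, step=0.1, delta=0.01, iters=200)

print(abs(exact))  # essentially zero
print(abs(noisy))  # small, on the order of delta
```

Here the recursion is x ← 0.8·x − 0.1·error, so the error can push the iterate no further than delta/2 from the minimizer, matching the flavor of the convergence results the abstract describes.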
ISBN: 9783319309217
Standard No.: 10.1007/978-3-319-30921-7 (doi)
Subjects--Topical Terms: Mathematical optimization.
LC Class. No.: QA402.5
Dewey Class. No.: 519.6
LDR 02836nmm a2200349 a 4500
001 2036016
003 DE-He213
005 20161012172553.0
006 m d
007 cr nn 008maaau
008 161117s2016 gw s 0 eng d
020 $a 9783319309217 $q (electronic bk.)
020 $a 9783319309200 $q (paper)
024 7 $a 10.1007/978-3-319-30921-7 $2 doi
035 $a 978-3-319-30921-7
040 $a GP $c GP
041 0 $a eng
050 4 $a QA402.5
072 7 $a PBKQ $2 bicssc
072 7 $a PBU $2 bicssc
072 7 $a MAT005000 $2 bisacsh
072 7 $a MAT029020 $2 bisacsh
082 04 $a 519.6 $2 23
090 $a QA402.5 $b .Z38 2016
100 1 $a Zaslavski, Alexander J. $3 814771
245 10 $a Numerical optimization with computational errors $h [electronic resource] / $c by Alexander J. Zaslavski.
260 $a Cham : $b Springer International Publishing : $b Imprint: Springer, $c 2016.
300 $a ix, 304 p. : $b ill., digital ; $c 24 cm.
490 1 $a Springer optimization and its applications, $x 1931-6828 ; $v v.108
505 0 $a 1. Introduction -- 2. Subgradient Projection Algorithm -- 3. The Mirror Descent Algorithm -- 4. Gradient Algorithm with a Smooth Objective Function -- 5. An Extension of the Gradient Algorithm -- 6. Weiszfeld's Method -- 7. The Extragradient Method for Convex Optimization -- 8. A Projected Subgradient Method for Nonsmooth Problems -- 9. Proximal Point Method in Hilbert Spaces -- 10. Proximal Point Methods in Metric Spaces -- 11. Maximal Monotone Operators and the Proximal Point Algorithm -- 12. The Extragradient Method for Solving Variational Inequalities -- 13. A Common Solution of a Family of Variational Inequalities -- 14. Continuous Subgradient Method -- 15. Penalty Methods -- 16. Newton's method -- References -- Index.
520 $a This book studies the approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space; these algorithms are examined taking into account computational errors. The author illustrates that algorithms generate a good approximate solution, if computational errors are bounded from above by a small positive constant. Known computational errors are examined with the aim of determining an approximate solution. Researchers and students interested in the optimization theory and its applications will find this book instructive and informative. This monograph contains 16 chapters; including a chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, gradient projection algorithm, the Weiszfelds method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods and Newton's method.
650 0 $a Mathematical optimization. $3 517763
650 14 $a Mathematics. $3 515831
650 24 $a Calculus of Variations and Optimal Control; Optimization. $3 898674
650 24 $a Numerical Analysis. $3 892626
650 24 $a Operations Research, Management Science. $3 1532996
710 2 $a SpringerLink (Online service) $3 836513
773 0 $t Springer eBooks
830 0 $a Springer optimization and its applications ; $v v.52. $3 1565660
856 40 $u http://dx.doi.org/10.1007/978-3-319-30921-7
950 $a Mathematics and Statistics (Springer-11649)
Items
Inventory Number: W9279860
Location Name: Electronic resources (電子資源)
Item Class: 11.線上閱覽_V (online reading)
Material type: E-book (電子書)
Call number: EB QA402.5
Usage Class: Normal (一般使用)
Loan Status: On shelf
No. of reservations: 0