Nonlinear conjugate gradient methods for unconstrained optimization
Record Type:
Electronic resources : Monograph/item
Title/Author:
Nonlinear conjugate gradient methods for unconstrained optimization / by Neculai Andrei.
Author:
Andrei, Neculai.
Published:
Cham : Springer International Publishing, 2020.
Description:
xxviii, 498 p. : ill., digital ; 24 cm.
Contents:
1. Introduction -- 2. Linear Conjugate Gradient Algorithm -- 3. General Convergence Results for Nonlinear Conjugate Gradient Methods -- 4. Standard Conjugate Gradient Methods -- 5. Acceleration of Conjugate Gradient Algorithms -- 6. Hybrid and Parameterized Conjugate Gradient Methods -- 7. Conjugate Gradient Methods as Modifications of the Standard Schemes -- 8. Conjugate Gradient Methods Memoryless BFGS Preconditioned -- 9. Three-Term Conjugate Gradient Methods -- 10. Other Conjugate Gradient Methods -- 11. Discussion and Conclusions -- References -- Author Index -- Subject Index.
Contained By:
Springer eBooks
Subject:
Conjugate gradient methods.
Online resource:
https://doi.org/10.1007/978-3-030-42950-8
ISBN:
9783030429508
Nonlinear conjugate gradient methods for unconstrained optimization [electronic resource] / by Neculai Andrei. - Cham : Springer International Publishing, 2020. - xxviii, 498 p. : ill., digital ; 24 cm. - (Springer optimization and its applications, ISSN 1931-6828 ; v. 158).
Two approaches are known for solving large-scale unconstrained optimization problems-the limited-memory quasi-Newton method (truncated Newton method) and the conjugate gradient method. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications. Comparisons to the limited-memory and truncated Newton methods are also discussed. Topics studied in detail include: linear conjugate gradient methods, standard conjugate gradient methods, acceleration of conjugate gradient methods, hybrid, modifications of the standard scheme, memoryless BFGS preconditioned, and three-term. Other conjugate gradient methods with clustering the eigenvalues or with the minimization of the condition number of the iteration matrix, are also treated. For each method, the convergence analysis, the computational performances and the comparisons versus other conjugate gradient methods are given. The theory behind the conjugate gradient algorithms presented as a methodology is developed with a clear, rigorous, and friendly exposition; the reader will gain an understanding of their properties and their convergence and will learn to develop and prove the convergence of his/her own methods. Numerous numerical studies are supplied with comparisons and comments on the behavior of conjugate gradient algorithms for solving a collection of 800 unconstrained optimization problems of different structures and complexities with the number of variables in the range [1000,10000]. The book is addressed to all those interested in developing and using new advanced techniques for solving unconstrained optimization complex problems. Mathematical programming researchers, theoreticians and practitioners in operations research, practitioners in engineering and industry researchers, as well as graduate students in mathematics, Ph.D. and master students in mathematical programming, will find plenty of information and practical applications for solving large-scale unconstrained optimization problems and applications by conjugate gradient methods.
ISBN: 9783030429508
Standard No.: 10.1007/978-3-030-42950-8 (doi)
Subjects--Topical Terms: Conjugate gradient methods.
LC Class. No.: QA218 .A537 2020
Dewey Class. No.: 512.94
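The summary above surveys standard, hybrid, memoryless BFGS preconditioned, and three-term nonlinear conjugate gradient schemes. As a minimal illustrative sketch only (not code from the book), the following Python snippet shows the basic shape of one such method: a Polak-Ribière-Polyak (PR+) update combined with a simple backtracking Armijo line search. The function names, tolerances, and the Rosenbrock test function are assumptions chosen for illustration.

```python
# Illustrative sketch only (not from the book); names and parameters are assumptions.
# Nonlinear conjugate gradient with a PR+ beta and backtracking Armijo line search.
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) < tol:
            break
        # Backtracking Armijo line search along d
        alpha, fx, gTd = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * gTd:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PR+ update; clamping beta at zero acts as an automatic restart
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        if g_new @ d >= 0:                    # safeguard: keep d a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the 2-D Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(nonlinear_cg(f, grad, [-1.2, 1.0]))    # should approach the minimizer [1.0, 1.0]
```

The max(0, ·) clamp in the beta formula is the usual PR+ safeguard: whenever the raw beta would be negative, the iteration restarts along the steepest-descent direction, which is one of the restart strategies analyzed for PR-type methods.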
LDR    03837nmm a2200337 a 4500
001    2221268
003    DE-He213
005    20201029134333.0
006    m d
007    cr nn 008maaau
008    201216s2020 sz s 0 eng d
020    $a 9783030429508 $q (electronic bk.)
020    $a 9783030429492 $q (paper)
024 7  $a 10.1007/978-3-030-42950-8 $2 doi
035    $a 978-3-030-42950-8
040    $a GP $c GP
041 0  $a eng
050 4  $a QA218 $b .A537 2020
072 7  $a PBU $2 bicssc
072 7  $a MAT003000 $2 bisacsh
072 7  $a PBU $2 thema
082 04 $a 512.94 $2 23
090    $a QA218 $b .A559 2020
100 1  $a Andrei, Neculai. $3 3270881
245 10 $a Nonlinear conjugate gradient methods for unconstrained optimization $h [electronic resource] / $c by Neculai Andrei.
260    $a Cham : $b Springer International Publishing : $b Imprint: Springer, $c 2020.
300    $a xxviii, 498 p. : $b ill., digital ; $c 24 cm.
490 1  $a Springer optimization and its applications, $x 1931-6828 ; $v v.158
505 0  $a
1. Introduction -- 2. Linear Conjugate Gradient Algorithm -- 3. General Convergence Results for Nonlinear Conjugate Gradient Methods -- 4. Standard Conjugate Gradient Methods -- 5. Acceleration of Conjugate Gradient Algorithms -- 6. Hybrid and Parameterized Conjugate Gradient Methods -- 7. Conjugate Gradient Methods as Modifications of the Standard Schemes -- 8. Conjugate Gradient Methods Memoryless BFGS Preconditioned -- 9. Three-Term Conjugate Gradient Methods -- 10. Other Conjugate Gradient Methods -- 11. Discussion and Conclusions -- References -- Author Index -- Subject Index.
520    $a
Two approaches are known for solving large-scale unconstrained optimization problems-the limited-memory quasi-Newton method (truncated Newton method) and the conjugate gradient method. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications. Comparisons to the limited-memory and truncated Newton methods are also discussed. Topics studied in detail include: linear conjugate gradient methods, standard conjugate gradient methods, acceleration of conjugate gradient methods, hybrid, modifications of the standard scheme, memoryless BFGS preconditioned, and three-term. Other conjugate gradient methods with clustering the eigenvalues or with the minimization of the condition number of the iteration matrix, are also treated. For each method, the convergence analysis, the computational performances and the comparisons versus other conjugate gradient methods are given. The theory behind the conjugate gradient algorithms presented as a methodology is developed with a clear, rigorous, and friendly exposition; the reader will gain an understanding of their properties and their convergence and will learn to develop and prove the convergence of his/her own methods. Numerous numerical studies are supplied with comparisons and comments on the behavior of conjugate gradient algorithms for solving a collection of 800 unconstrained optimization problems of different structures and complexities with the number of variables in the range [1000,10000]. The book is addressed to all those interested in developing and using new advanced techniques for solving unconstrained optimization complex problems. Mathematical programming researchers, theoreticians and practitioners in operations research, practitioners in engineering and industry researchers, as well as graduate students in mathematics, Ph.D. and master students in mathematical programming, will find plenty of information and practical applications for solving large-scale unconstrained optimization problems and applications by conjugate gradient methods.
650  0 $a Conjugate gradient methods. $3 659632
650  0 $a Constrained optimization. $3 1066669
650 14 $a Optimization. $3 891104
650 24 $a Mathematical Modeling and Industrial Mathematics. $3 891089
710 2  $a SpringerLink (Online service) $3 836513
773 0  $t Springer eBooks
830  0 $a Springer optimization and its applications ; $v v.158. $3 3459381
856 40 $u https://doi.org/10.1007/978-3-030-42950-8
950    $a Mathematics and Statistics (Springer-11649)
Items
Inventory Number: W9394847
Location Name: Electronic resources (電子資源)
Item Class: 11.線上閱覽_V (online reading)
Material type: E-book (電子書)
Call number: EB QA218 .A537 2020
Usage Class: Normal (一般使用)
Loan Status: On shelf
No. of reservations: 0
Opac note:
Attachments: