Optimization in Banach spaces
Record type: Bibliographic - electronic resource : Monograph/item
Title/Author: Optimization in Banach spaces / by Alexander J. Zaslavski.
Author: Zaslavski, Alexander J.
Publisher: Cham : Springer International Publishing, 2022.
Description: viii, 126 p. : ill., digital ; 24 cm.
Contents: Preface -- Introduction -- Convex optimization -- Nonconvex optimization -- Continuous algorithms -- References.
Contained By: Springer Nature eBook
Subject: Mathematical optimization.
Electronic resource: https://doi.org/10.1007/978-3-031-12644-4
ISBN: 9783031126444
Optimization in Banach spaces [electronic resource] / by Alexander J. Zaslavski. - Cham : Springer International Publishing, 2022. - viii, 126 p. : ill., digital ; 24 cm. - (SpringerBriefs in optimization, 2191-575X).
Preface -- Introduction -- Convex optimization -- Nonconvex optimization -- Continuous algorithms -- References.
The book is devoted to the study of constrained minimization problems on closed and convex sets in Banach spaces with a Fréchet differentiable objective function. Such problems are well studied in finite-dimensional spaces and in infinite-dimensional Hilbert spaces. When the space is a Hilbert space, many algorithms are available for solving optimization problems, including the gradient projection algorithm, one of the most important tools in optimization theory, nonlinear analysis, and their applications. An optimization problem is described by an objective function and a set of feasible points. For the gradient projection algorithm, each iteration consists of two steps: the first is the calculation of a gradient of the objective function, while in the second we calculate a projection onto the feasible set. Each of these two steps introduces a computational error. In our recent research we showed that the gradient projection algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. It should be mentioned that the properties of a Hilbert space play an important role here. When we consider an optimization problem in a general Banach space, the situation becomes more difficult and less well understood; on the other hand, such problems arise in approximation theory. The book is of interest to mathematicians working in optimization and can also be useful in preparatory courses for graduate students; its main feature for this audience is the study of algorithms for convex and nonconvex minimization problems in a general Banach space. It is also of interest to experts in applications of optimization to approximation theory. The goal of the book is to obtain a good approximate solution of the constrained optimization problem in a general Banach space in the presence of computational errors.
It is shown that the algorithm generates a good approximate solution if the sequence of computational errors is bounded from above by a small constant. The book consists of four chapters. In the first we discuss the algorithms studied in the book and prove a convergence result for an unconstrained problem, which is a prototype of our results for the constrained problem. In Chapter 2 we analyze convex optimization problems; nonconvex optimization problems are studied in Chapter 3. In Chapter 4 we study continuous algorithms for minimization problems in the presence of computational errors.
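The two-step iteration described in the abstract (a gradient evaluation followed by a projection onto the feasible set, each subject to a bounded computational error) can be sketched for a simple finite-dimensional case. The box-shaped feasible set, the quadratic objective, and all names below are illustrative assumptions for this sketch, not the book's general Banach-space setting.

```python
import numpy as np

def gradient_projection(grad, project, x0, step=0.1, n_iter=200, err=1e-4, rng=None):
    """Projected-gradient iteration with simulated computational errors.

    Each iteration performs the two steps from the abstract:
      1) evaluate the gradient, perturbed by an error of norm on the order of err;
      2) project the trial point back onto the feasible set,
         so the iterate always stays feasible.
    """
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        g = grad(x) + err * rng.standard_normal(x.shape)   # inexact gradient
        x = project(x - step * g)                          # projection step
    return x

# Toy problem: minimize ||x - c||^2 over the box [0, 1]^2, with c outside the box.
c = np.array([1.5, -0.5])
grad = lambda x: 2.0 * (x - c)
project = lambda x: np.clip(x, 0.0, 1.0)   # Euclidean projection onto the box

x_star = gradient_projection(grad, project, x0=np.zeros(2))
# The exact minimizer is the projection of c onto the box, i.e. (1.0, 0.0);
# with small errors the iterates land within a small neighborhood of it.
```

This illustrates the abstract's claim in miniature: the iterates remain feasible throughout, and when the per-step errors are bounded by a small constant, the method returns a good approximate solution rather than an exact one.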
ISBN: 9783031126444
Standard No.: 10.1007/978-3-031-12644-4 (doi)
Subjects--Topical Terms: Mathematical optimization.
LC Class. No.: QA402.5 .Z37 2022
Dewey Class. No.: 519.6
LDR     04278nmm a2200337 a 4500
001     2304282
003     DE-He213
005     20220929102557.0
006     m d
007     cr nn 008maaau
008     230409s2022 sz s 0 eng d
020     $a 9783031126444 $q (electronic bk.)
020     $a 9783031126437 $q (paper)
024 7   $a 10.1007/978-3-031-12644-4 $2 doi
035     $a 978-3-031-12644-4
040     $a GP $c GP
041 0   $a eng
050  4  $a QA402.5 $b .Z37 2022
072 7   $a PBU $2 bicssc
072 7   $a MAT003000 $2 bisacsh
072 7   $a PBU $2 thema
082 04  $a 519.6 $2 23
090     $a QA402.5 $b .Z38 2022
100 1   $a Zaslavski, Alexander J. $3 814771
245 10  $a Optimization in Banach spaces $h [electronic resource] / $c by Alexander J. Zaslavski.
260     $a Cham : $b Springer International Publishing : $b Imprint: Springer, $c 2022.
300     $a viii, 126 p. : $b ill., digital ; $c 24 cm.
490 1   $a SpringerBriefs in optimization, $x 2191-575X
505 0   $a Preface -- Introduction -- Convex optimization -- Nonconvex optimization -- Continuous algorithms -- References.
520     $a The book is devoted to the study of constrained minimization problems on closed and convex sets in Banach spaces with a Fréchet differentiable objective function. Such problems are well studied in finite-dimensional spaces and in infinite-dimensional Hilbert spaces. When the space is a Hilbert space, many algorithms are available for solving optimization problems, including the gradient projection algorithm, one of the most important tools in optimization theory, nonlinear analysis, and their applications. An optimization problem is described by an objective function and a set of feasible points. For the gradient projection algorithm, each iteration consists of two steps: the first is the calculation of a gradient of the objective function, while in the second we calculate a projection onto the feasible set. Each of these two steps introduces a computational error. In our recent research we showed that the gradient projection algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. It should be mentioned that the properties of a Hilbert space play an important role here. When we consider an optimization problem in a general Banach space, the situation becomes more difficult and less well understood; on the other hand, such problems arise in approximation theory. The book is of interest to mathematicians working in optimization and can also be useful in preparatory courses for graduate students; its main feature for this audience is the study of algorithms for convex and nonconvex minimization problems in a general Banach space. It is also of interest to experts in applications of optimization to approximation theory. The goal of the book is to obtain a good approximate solution of the constrained optimization problem in a general Banach space in the presence of computational errors. It is shown that the algorithm generates a good approximate solution if the sequence of computational errors is bounded from above by a small constant. The book consists of four chapters. In the first we discuss the algorithms studied in the book and prove a convergence result for an unconstrained problem, which is a prototype of our results for the constrained problem. In Chapter 2 we analyze convex optimization problems; nonconvex optimization problems are studied in Chapter 3. In Chapter 4 we study continuous algorithms for minimization problems in the presence of computational errors.
650  0  $a Mathematical optimization. $3 517763
650  0  $a Banach spaces. $3 579190
650 14  $a Optimization. $3 891104
650 24  $a Numerical Analysis. $3 892626
710 2   $a SpringerLink (Online service) $3 836513
773 0   $t Springer Nature eBook
830  0  $a SpringerBriefs in optimization. $3 1566137
856 40  $u https://doi.org/10.1007/978-3-031-12644-4
950     $a Mathematics and Statistics (SpringerNature-11649)
Holdings (1 item):
Barcode: W9445831
Location: Electronic resources
Circulation category: 11. Online reading_V
Material type: E-book
Call number: EB QA402.5 .Z37 2022
Use type: Normal
Loan status: On shelf
Holds: 0