Exploring the Power of Rescaling.
Record type:
Bibliographic - Electronic resource : Monograph/item
Title/Author:
Exploring the Power of Rescaling.
Author:
Li, Dan.
Publisher:
Ann Arbor : ProQuest Dissertations & Theses, 2016
Description:
124 p.
Notes:
Source: Dissertations Abstracts International, Volume: 77-08, Section: B.
Contained By:
Dissertations Abstracts International, 77-08B.
Subject:
Applied Mathematics.
Electronic resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10014086
ISBN:
9781339483313
Li, Dan.
Exploring the Power of Rescaling. - Ann Arbor : ProQuest Dissertations & Theses, 2016 - 124 p.
Source: Dissertations Abstracts International, Volume: 77-08, Section: B.
Thesis (Ph.D.)--Lehigh University, 2016.
This item must not be sold to any third party vendors.
The goal of our research is a comprehensive exploration of the power of rescaling to improve the efficiency of various algorithms for linear optimization and related problems. Linear optimization and linear feasibility problems are arguably the fundamental problems of optimization. Advances in solving these problems impact the core of optimization theory and, consequently, its practical applications. The development and analysis of solution methods for linear optimization is one of the major topics in optimization research. Although the polynomial-time ellipsoid method has excellent theoretical properties, it turned out to be inefficient in practice. Even today, despite the dominance of interior point methods, various algorithms, such as perceptron algorithms, rescaling perceptron algorithms, von Neumann algorithms, and Chubanov's method, as well as related problems, such as the colorful feasibility problem, whose complexity status is still undecided, are actively studied. Motivated by the successful application of a rescaling principle to the perceptron algorithm, our research aims to explore the power of rescaling for other algorithms as well and to improve their computational complexity. We focus on algorithms for solving linear feasibility and related problems whose complexity depends on a quantity ρ, a condition number measuring the distance to feasibility or infeasibility of the problem. These algorithms include the von Neumann algorithm and the perceptron algorithm. First, we discuss the close duality relationship between the perceptron and the von Neumann algorithms. This observation allows us to interpret each algorithm as a variant of the other and to transfer their complexity results between them. The discovery of this duality not only provides profound insight into both algorithms, but also yields new variants of them. Based on this duality relationship, we propose a deterministic rescaling von Neumann algorithm, which computationally outperforms the original von Neumann algorithm. Although its complexity has not yet been established, we construct a von Neumann example showing that the rescaling steps cannot keep the quantity ρ increasing monotonically. Showing a monotonic increase of ρ is a common technique for proving the complexity of rescaling algorithms; this example therefore shows that a different proof method is needed to establish the complexity of the deterministic rescaling von Neumann algorithm. Furthermore, this von Neumann example serves as the foundation of a perceptron example, which verifies that ρ does not always increase after one rescaling step in the polynomial-time deterministic rescaling perceptron algorithm either. Next, we adapt the idea of Chubanov's method to our rescaling framework and develop a polynomial-time column-wise rescaling von Neumann algorithm. Chubanov recently proposed a simple polynomial-time algorithm for solving homogeneous linear systems with positive variables. The Basic Procedure of Chubanov's method either finds a feasible solution or identifies an upper bound for at least one coordinate of any feasible solution. The column-wise rescaling von Neumann algorithm combines the Basic Procedure with column-wise rescaling to identify zero coordinates in all feasible solutions and remove the corresponding columns from the coefficient matrix. This is the first variant of the von Neumann algorithm with polynomial-time complexity.
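The abstract refers to the perceptron and von Neumann algorithms and their duality without restating them. For orientation, the following minimal Python sketch shows the two textbook iterations; the function names, the unit-column normalization, the tolerance, and the iteration limits are illustrative assumptions rather than details taken from the dissertation.

# Minimal sketch of the textbook perceptron and von Neumann iterations (illustrative
# assumptions only). A is an m-by-n matrix whose columns a_j lie on the unit sphere.
# Perceptron: find y with A^T y > 0.  Von Neumann: find x >= 0, sum(x) = 1, ||Ax|| small.
import numpy as np

def perceptron(A, max_iter=100000):
    """Classical perceptron: add a violating column until all margins are positive."""
    y = A[:, 0].copy()
    for _ in range(max_iter):
        margins = A.T @ y
        j = int(np.argmin(margins))
        if margins[j] > 0:                 # every column strictly separated: y is feasible
            return y
        y = y + A[:, j]                    # perceptron update with the most violating column
    return None                            # iteration budget exhausted

def von_neumann(A, eps=1e-6, max_iter=100000):
    """Von Neumann iteration: keep a convex combination x of the columns and shrink
    the residual u = A x by an exact line search toward the worst column."""
    m, n = A.shape
    x = np.zeros(n)
    x[0] = 1.0
    u = A @ x
    for _ in range(max_iter):
        if np.linalg.norm(u) <= eps:       # approximate solution: A x is (nearly) zero
            return x
        margins = A.T @ u
        j = int(np.argmin(margins))
        if margins[j] > 0:                 # u separates 0 from conv{a_j}: problem infeasible
            return None
        d = u - A[:, j]
        lam = min(max(float(u @ d) / float(d @ d), 0.0), 1.0)  # exact line search step
        x *= (1.0 - lam)
        x[j] += lam
        u = (1.0 - lam) * u + lam * A[:, j]
    return x

Loosely speaking, the duality mentioned above pairs the perceptron applied to one system with the von Neumann iteration applied to its alternative system; the precise statement and the transfer of complexity results are developed in the dissertation itself.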
Furthermore, compared with the original von Neumann algorithm, which returns an approximate solution, this rescaling variant guarantees an exact solution for feasible problems. Finally, we develop the methodology of higher-order rescaling and propose a higher-order perceptron algorithm. We implement the perceptron improvement phase using parallel processors, so in a multi-core environment we may obtain several rescaling vectors without extra wall-clock time. By using these rescaling vectors in a single higher-order rescaling step, better rescaling rates may be expected, which improves computational efficiency.
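The higher-order rescaling step itself is not specified in the abstract. The sketch below only illustrates the general rank-one rescaling idea from the rescaled-perceptron literature, stretching the column space along one or several candidate directions and renormalizing the columns; the function name rescale_columns, the (I + v v^T) stretch, and the way several directions are combined are assumptions made for illustration, not the dissertation's method.

# Illustrative sketch of a rescaling step (assumed form): stretch space along each
# candidate direction v via (I + v v^T), then put the columns back on the unit sphere.
# Several directions, e.g. gathered from parallel perceptron improvement runs, can be
# applied before a single renormalization, mimicking a "higher-order" step.
import numpy as np

def rescale_columns(A, directions):
    """Apply (I + v v^T) for each direction v, then renormalize the columns of A."""
    B = A.astype(float).copy()
    for v in directions:
        v = v / np.linalg.norm(v)          # work with a unit direction
        B = B + np.outer(v, v @ B)         # (I + v v^T) B: double the component along v
    return B / np.linalg.norm(B, axis=0)   # columns back to unit length

# Tiny usage example with random data (purely for illustration).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 20))
    A = A / np.linalg.norm(A, axis=0)
    v1, v2 = rng.standard_normal(5), rng.standard_normal(5)
    A_rescaled = rescale_columns(A, [v1, v2])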
ISBN: 9781339483313
Subjects--Topical Terms:
Applied Mathematics.
Subjects--Index Terms:
Duality relationship
Exploring the Power of Rescaling.
LDR :05510nmm a2200421 4500
001 2348480
005 20220912135552.5
008 241004s2016 ||||||||||||||||| ||eng d
020 $a 9781339483313
035 $a (MiAaPQ)AAI10014086
035 $a (MiAaPQ)lehigh:11431
035 $a AAI10014086
040 $a MiAaPQ $c MiAaPQ
100 1 $a Li, Dan. $3 1279371
245 1 0 $a Exploring the Power of Rescaling.
260 1 $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2016
300 $a 124 p.
500 $a Source: Dissertations Abstracts International, Volume: 77-08, Section: B.
500 $a Publisher info.: Dissertation/Thesis.
500 $a Includes supplementary digital materials.
500 $a Advisor: Terlaky, Tamas.
502 $a Thesis (Ph.D.)--Lehigh University, 2016.
506 $a This item must not be sold to any third party vendors.
590 $a School code: 0105.
650 4 $a Applied Mathematics. $3 1669109
650 4 $a Industrial engineering. $3 526216
650 4 $a Operations research. $3 547123
653 $a Duality relationship
653 $a Higher-order rescaling
653 $a Linear optimization problems
653 $a Perceptron algorithm
653 $a Rescaling algorithm
653 $a Von Neumann algorithm
690 $a 0364
690 $a 0546
690 $a 0796
710 2 $a Lehigh University. $b Industrial Engineering. $3 2092122
773 0 $t Dissertations Abstracts International $g 77-08B.
790 $a 0105
791 $a Ph.D.
792 $a 2016
793 $a English
856 4 0 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10014086
Holdings (1 record)
Barcode: W9470918
Location: Electronic resources
Circulation category: 11. Online reading_V
Material type: E-book
Call number: EB
Usage type: Normal
Loan status: On shelf
Holds: 0
Remarks: (none)
Attachments: (none)