Geometric Methods in Statistics and Optimization.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Geometric Methods in Statistics and Optimization.
Author:
Wong, Sze Wai.
Published:
Ann Arbor : ProQuest Dissertations & Theses, 2018.
Description:
114 p.
Notes:
Source: Dissertations Abstracts International, Volume: 80-01, Section: B.
Contained By:
Dissertations Abstracts International, 80-01B.
Subject:
Applied Mathematics.
Online resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10808337
ISBN:
9780438084421
Dissertation Note:
Thesis (Ph.D.)--The University of Chicago, 2018.
Restrictions:
This item must not be sold to any third party vendors.
Abstract:
Statistical estimation problems in multivariate analysis and machine learning often seek linear relations among variables. This translates to finding an affine subspace that, in an appropriate sense, either best represents the sample data set or best separates it into components. In other words, such statistical estimation problems are optimization problems on the affine Grassmannian, a noncompact smooth manifold that parameterizes all affine subspaces of a fixed dimension. The affine Grassmannian is a natural generalization of Euclidean space, whose points are the 0-dimensional affine subspaces. The main objective of the first part of this work is to show that, like Euclidean space, the affine Grassmannian can serve as a concrete computational platform for data analytic problems: points on the affine Grassmannian can be concretely represented and readily manipulated; distances, metrics, probability densities, geodesics, exponential maps, parallel transports, etc., all have closed-form expressions that are easily calculated; and optimization algorithms, including steepest descent, Newton, and conjugate gradient, have efficient affine Grassmannian analogues that use only standard numerical linear algebra.

We then extend the framework to nests of linear subspaces, which represent the variables in different regimes. Exploring the multi-scale representation of the data revealed by these problems requires a systematic study of nests of linear subspaces, which form a compact Riemannian manifold called the flag manifold. The main goal of this part is to show that the flag manifold can be represented concisely by matrix groups and computed with ease, and that optimization on the flag manifold can be carried out with matrix operations, bridging the gap between algebra and geometry.

Lastly, we study Yates's algorithm, which was first proposed to exploit the structure of a full factorial designed experiment in order to obtain least squares estimates of the effects of all factors and their relevant interactions. In short, it is an organized way of performing iterated summation that avoids repeated computation. Many well-known algorithms, including the fast Fourier transform and the Walsh transform, turn out to be special cases of Yates's method. Here we show that Yates's algorithm is optimal in the sense of a contraction of tensors, but may be improved when considered from the perspective of bilinear complexity. We also show that it is a projection of a tensor network, and we point out its relationship with tensor trains and tree tensor networks.
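To make the closed-form computations mentioned in the abstract concrete, here is a minimal sketch, not taken from the dissertation, of one such formula: the geodesic distance between two k-dimensional linear subspaces of R^n on the ordinary Grassmannian, obtained from principal angles with one QR factorization and one SVD. The affine case additionally tracks a displacement from the origin; only the linear case is sketched here, and the function name is illustrative.

```python
import numpy as np

def grassmann_distance(A, B):
    """Geodesic distance between span(A) and span(B) on Gr(k, n).

    A, B: (n, k) matrices whose columns span the two subspaces.
    Uses only standard numerical linear algebra: QR + SVD.
    """
    QA, _ = np.linalg.qr(A)  # orthonormal basis for span(A)
    QB, _ = np.linalg.qr(B)  # orthonormal basis for span(B)
    # Singular values of QA^T QB are the cosines of the principal angles.
    cosines = np.linalg.svd(QA.T @ QB, compute_uv=False)
    theta = np.arccos(np.clip(cosines, -1.0, 1.0))
    return np.linalg.norm(theta)  # 2-norm of the principal angle vector

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3))
B = rng.standard_normal((10, 3))
print(grassmann_distance(A, B))  # two random 3-planes in R^10
print(grassmann_distance(A, A @ rng.standard_normal((3, 3))))  # ~0: same span
```

The "nest of linear subspaces" can likewise be illustrated with ordinary tools. In the hypothetical example below, PCA of a data matrix yields a flag: the span of the top principal direction sits inside the span of the top two, and so on, and the whole nest is encoded by a single orthogonal matrix together with a list of dimensions, which is the kind of concise matrix representation the abstract refers to.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5)) * np.array([5.0, 3.0, 1.0, 0.5, 0.1])

# Principal directions: right singular vectors of the centered data.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
V = Vt.T

# The flag span(v1) < span(v1, v2) < span(v1, v2, v3) is encoded by the
# single orthogonal matrix V plus the dimension list (1, 2, 3).
flag = [V[:, :k] for k in (1, 2, 3)]

# Nesting check: projecting a smaller basis onto the next subspace
# leaves it unchanged.
for Vk, Vnext in zip(flag, flag[1:]):
    P = Vnext @ Vnext.T
    assert np.allclose(P @ Vk, Vk)
```

Finally, Yates's "organized iterated summation" admits a short sketch. For a 2^k full factorial design with responses listed in standard order, each pass replaces the current column by all pairwise sums followed by all pairwise differences; after k passes the column holds the grand total and the contrast totals of every factor and interaction. Partial sums are reused, so the cost is on the order of n log n additions rather than the n^2 needed to form each signed sum independently (up to ordering and signs, this is essentially the same butterfly pattern as the fast Walsh-Hadamard transform). A sketch under these assumptions, with an illustrative function name:

```python
import numpy as np

def yates(y):
    """Yates's algorithm for a 2^k full factorial design.

    y: responses in standard (Yates) order, length 2^k.
    Returns the grand total followed by the contrast totals of all
    factors and interactions, in standard order.
    """
    y = np.asarray(y, dtype=float)
    k = y.size.bit_length() - 1
    assert y.size == 2 ** k, "length must be a power of two"
    for _ in range(k):
        pairs = y.reshape(-1, 2)
        # One pass: all pairwise sums, then all pairwise differences.
        y = np.concatenate([pairs.sum(axis=1), pairs[:, 1] - pairs[:, 0]])
    return y

# 2^2 design with treatments in standard order (1), a, b, ab:
print(yates([3.0, 5.0, 4.0, 8.0]))
# -> [20. 6. 4. 2.]: grand total, then A, B and AB contrast totals.
```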
MARC:
LDR 03643nmm a2200349 4500
001 2197527
005 20190923134340.5
008 200811s2018 ||||||||||||||||| ||eng d
020 $a 9780438084421
035 $a (MiAaPQ)AAI10808337
035 $a (MiAaPQ)uchicago:14319
035 $a AAI10808337
035 $a 2197527
040 $a MiAaPQ $c MiAaPQ
100 1 $a Wong, Sze Wai. $3 3422348
245 10 $a Geometric Methods in Statistics and Optimization.
260 1 $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2018
300 $a 114 p.
500 $a Source: Dissertations Abstracts International, Volume: 80-01, Section: B.
500 $a Publisher info.: Dissertation/Thesis.
500 $a Advisor: Lim, Lek-Heng.
502 $a Thesis (Ph.D.)--The University of Chicago, 2018.
506 $a This item must not be sold to any third party vendors.
590 $a School code: 0330.
650 4 $a Applied Mathematics. $3 1669109
650 4 $a Mathematics. $3 515831
650 4 $a Statistics. $3 517247
690 $a 0364
690 $a 0405
690 $a 0463
710 2 $a The University of Chicago. $b Statistics. $3 1673632
773 0 $t Dissertations Abstracts International $g 80-01B.
790 $a 0330
791 $a Ph.D.
792 $a 2018
793 $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10808337
Items
Inventory Number: W9375786
Location Name: Electronic Resources (電子資源)
Item Class: 01. Circulating (Book)_YB (01.外借(書)_YB)
Material type: E-book
Call number: EB
Usage Class: Normal
Loan Status: On shelf
No. of reservations: 0