Geometric Methods in Statistics and Optimization.
Record type: Bibliographic, electronic resource : Monograph/item
Title/Author: Geometric Methods in Statistics and Optimization / Wong, Sze Wai.
Author: Wong, Sze Wai.
Publisher: Ann Arbor : ProQuest Dissertations & Theses, 2018
Description: 114 p.
Note: Source: Dissertations Abstracts International, Volume: 80-01, Section: B.
Contained by: Dissertations Abstracts International, 80-01B.
Subject: Applied Mathematics.
Electronic resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10808337
ISBN: 9780438084421
Thesis (Ph.D.)--The University of Chicago, 2018.
This item must not be sold to any third party vendors.

Abstract:
Statistical estimation problems in multivariate analysis and machine learning often seek linear relations among variables. This translates to finding an affine subspace that, in an appropriate sense, either best represents the sample data set or best separates it into components. In other words, such statistical estimation problems are optimization problems on the affine Grassmannian, a noncompact smooth manifold that parameterizes all affine subspaces of a fixed dimension. The affine Grassmannian is a natural generalization of Euclidean space, whose points are 0-dimensional affine subspaces. The main objective of the first part of this work is to show that, like Euclidean space, the affine Grassmannian can serve as a concrete computational platform for data analytic problems: points on the affine Grassmannian can be concretely represented and readily manipulated; distances, metrics, probability densities, geodesics, exponential maps, parallel transports, etc., all have closed-form expressions that are easily calculated; and optimization algorithms, including steepest descent, Newton's method, and conjugate gradient, have efficient affine Grassmannian analogues that use only standard numerical linear algebra.

We then extend the framework to nests of linear subspaces, which represent the variables in different regimes. Exploiting the multi-scale representation of the data revealed by these problems requires a systematic study of nests of linear subspaces, which form a compact Riemannian manifold called the flag manifold. The main goal of the second part is to show that the flag manifold can be represented concisely by matrix groups and computed with ease, and that optimization on the flag manifold can be performed with matrix operations, bridging the gap between algebra and geometry.

Lastly, we study Yates's algorithm, which was first proposed to exploit the structure of a full factorial designed experiment to obtain least squares estimates of the effects of all factors and their relevant interactions. In short, it is an organized way of doing iterated summations that avoids repeated computation. Many well-known algorithms, including the fast Fourier transform and the Walsh transform, turn out to be special cases of Yates's method. Here we show that Yates's algorithm is optimal in the sense of a contraction of tensors, but may be improved when considered from the perspective of bilinear complexity. We also show that it is a projection of a tensor network and point out its relationship with tensor trains and tree tensor networks.
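The first part of the abstract asserts that points on the affine Grassmannian have concrete matrix representations and closed-form distances. As a minimal sketch of that kind of computation (the embedding used here, the helper names, and the toy data are illustrative assumptions, not the dissertation's exact construction), one can represent an affine subspace col(A) + b of R^n by a linear subspace of R^{n+1} and measure distance through principal angles:

```python
import numpy as np

def grassmann_distance(X, Y):
    """Geodesic distance between the column spans of X and Y,
    computed from principal angles via an SVD."""
    Qx, _ = np.linalg.qr(X)
    Qy, _ = np.linalg.qr(Y)
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    theta = np.arccos(np.clip(s, -1.0, 1.0))  # principal angles
    return np.linalg.norm(theta)

def embed_affine(A, b):
    """Illustrative embedding (an assumption, not the thesis's exact map):
    represent the affine subspace col(A) + b in R^n as the linear span
    of {(a_i, 0)} and (b, 1) in R^{n+1}."""
    top = np.hstack([A, b.reshape(-1, 1)])
    bottom = np.zeros((1, A.shape[1] + 1))
    bottom[0, -1] = 1.0
    return np.vstack([top, bottom])

# Two parallel lines in R^3: same direction, different offsets.
A = np.array([[1.0], [0.0], [0.0]])
d = grassmann_distance(embed_affine(A, np.zeros(3)),
                       embed_affine(A, np.array([0.0, 1.0, 0.0])))
print(d)  # nonzero: the embedding distinguishes parallel affine subspaces
```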
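For the second part, a hedged toy illustration of where a nest of linear subspaces arises in data analysis (the planted spectrum and all names below are assumptions made for the demo, not the dissertation's algorithms): the spans of the leading k principal directions of a data set are nested, V_1 ⊂ V_2 ⊂ ..., so PCA already yields a flag of subspaces, and the nest describes the data at different scales:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data with a planted multi-scale structure.
X = rng.standard_normal((200, 5)) @ np.diag([3.0, 2.0, 1.0, 0.3, 0.1])
Xc = X - X.mean(axis=0)

# The spans of the leading k right singular vectors are nested:
# V_1 ⊂ V_2 ⊂ ... ⊂ V_5 = R^5, i.e. a flag of subspaces.
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
for k in range(1, 6):
    P = Vt[:k].T @ Vt[:k]                      # orthogonal projector onto V_k
    err = np.linalg.norm(Xc - Xc @ P) / np.linalg.norm(Xc)
    print(f"k={k}: relative reconstruction error {err:.3f}")
```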
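For the last part, a sketch of the classical form of Yates's algorithm for a 2^k factorial design (the textbook pairwise sums-and-differences version, not the dissertation's tensor-contraction formulation): k passes over the responses, taken in standard order, produce the grand total and all effect contrasts with O(n log n) additions, the same recursive structure that underlies the fast Fourier and Walsh transforms:

```python
import numpy as np

def yates(y):
    """Classical Yates's algorithm for a 2^k factorial design:
    each pass maps consecutive pairs to their sums (first half of the
    output) and differences (second half); after k passes the entries
    are the grand total followed by the effect contrasts."""
    y = np.asarray(y, dtype=float)
    n = y.size
    k = n.bit_length() - 1
    assert n == 1 << k, "length must be a power of two"
    for _ in range(k):
        pairs = y.reshape(-1, 2)
        y = np.concatenate([pairs.sum(axis=1), pairs[:, 1] - pairs[:, 0]])
    return y

# 2^2 design in standard order (1), a, b, ab.
print(yates([28.0, 36.0, 18.0, 31.0]))  # [113. 21. -15. 5] = total, A, B, AB
```

Each response is read once per pass, so no partial sum is ever recomputed, which is the "organized iterated summation" the abstract refers to.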
MARC record:
LDR    03643nmm a2200349 4500
001    2197527
005    20190923134340.5
008    200811s2018 ||||||||||||||||| ||eng d
020    $a 9780438084421
035    $a (MiAaPQ)AAI10808337
035    $a (MiAaPQ)uchicago:14319
035    $a AAI10808337
035    $a 2197527
040    $a MiAaPQ $c MiAaPQ
100 1  $a Wong, Sze Wai. $3 3422348
245 10 $a Geometric Methods in Statistics and Optimization.
260 1  $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2018
300    $a 114 p.
500    $a Source: Dissertations Abstracts International, Volume: 80-01, Section: B.
500    $a Publisher info.: Dissertation/Thesis.
500    $a Advisor: Lim, Lek-Heng.
502    $a Thesis (Ph.D.)--The University of Chicago, 2018.
506    $a This item must not be sold to any third party vendors.
590    $a School code: 0330.
650  4 $a Applied Mathematics. $3 1669109
650  4 $a Mathematics. $3 515831
650  4 $a Statistics. $3 517247
690    $a 0364
690    $a 0405
690    $a 0463
710 2  $a The University of Chicago. $b Statistics. $3 1673632
773 0  $t Dissertations Abstracts International $g 80-01B.
790    $a 0330
791    $a Ph.D.
792    $a 2018
793    $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10808337
Holdings (1 item):
Barcode: W9375786
Location: Electronic resources
Circulation category: 01.外借(書)_YB (loanable book)
Material type: e-Book
Call number: EB
Use type: Normal
Loan status: On shelf
Holds: 0