New Algorithms for Supervised Dimension Reduction.

Record type: Bibliographic - electronic resource : Monograph/item
Title/Author: New Algorithms for Supervised Dimension Reduction.
Author: Zhang, Ning.
Publisher: Ann Arbor : ProQuest Dissertations & Theses, 2019
Pagination: 98 p.
Notes: Source: Dissertations Abstracts International, Volume: 80-11, Section: B.
Contained by: Dissertations Abstracts International, 80-11B.
Subject: Statistics.
Electronic resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=13813625
ISBN: 9781392146460
Zhang, Ning. New Algorithms for Supervised Dimension Reduction. Ann Arbor : ProQuest Dissertations & Theses, 2019. 98 p.
Source: Dissertations Abstracts International, Volume: 80-11, Section: B.
Thesis (Ph.D.)--Middle Tennessee State University, 2019.
This item must not be sold to any third party vendors.
Advances in data collection and storage capabilities during the past decades have led to information overload in most sciences and ushered in a big data era. Data of large volume and high dimensionality have become ubiquitous in many scientific domains. They present many mathematical challenges as well as some opportunities, and are bound to give rise to new theoretical developments.

Dimension reduction aims to find low-dimensional representations of high-dimensional data. It promotes understanding of the data structure through visualization and enhances the predictive performance of machine learning algorithms by mitigating the "curse of dimensionality." As high-dimensional data become ubiquitous in modern sciences, dimension reduction methods play increasingly important roles in data analysis. The contribution of this dissertation is a set of new algorithms for supervised dimension reduction that handle high-dimensional data more efficiently.

The first new algorithm is overlapping sliced inverse regression (OSIR). Sliced inverse regression (SIR) is a pioneering tool for supervised dimension reduction. It identifies the subspace of significant factors with intrinsically lower dimensionality, known as the effective dimension reduction (EDR) space. OSIR refines SIR through an overlapping slicing scheme and can estimate the EDR space and determine the number of effective factors more accurately. We show that the overlapping procedure has the potential to identify the information contained in the derivatives of the inverse regression curve, which helps to explain the superiority of OSIR. We prove that the OSIR algorithm is √n-consistent. We also propose the use of bagging and bootstrapping techniques to further improve the accuracy of OSIR.

Online learning has attracted great attention due to the increasing demand for systems that can learn and evolve. When the data to be processed are also high dimensional, and dimension reduction is necessary for visualization or prediction enhancement, online dimension reduction plays an essential role. We propose four new online learning approaches for supervised dimension reduction: incremental sliced inverse regression, covariance-free incremental sliced inverse regression, incremental overlapping sliced inverse regression, and covariance-free incremental overlapping sliced inverse regression. All four methods can update the EDR space quickly and efficiently as new observations arrive. The effectiveness and efficiency of all four algorithms are verified by simulations and real-data applications.
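The slicing-and-eigendecomposition procedure that OSIR refines can be sketched in a few lines. The following is a minimal illustration of classic SIR only (not the dissertation's OSIR or its incremental variants); the function name `sir` and all parameter defaults are illustrative assumptions, not code from the thesis.

```python
import numpy as np

def sir(X, y, n_slices=10, n_components=2):
    """Minimal sliced inverse regression: estimate EDR directions."""
    n, p = X.shape
    # Standardize X: z = Sigma^{-1/2} (x - mean)
    mean = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = (X - mean) @ inv_sqrt
    # Partition the observations into slices by sorted response value
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the slice means of Z (the inverse regression curve)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Top eigenvectors of M, mapped back to the original coordinates
    evals, evecs = np.linalg.eigh(M)
    top = evecs[:, ::-1][:, :n_components]
    directions = inv_sqrt @ top
    return directions, evals[::-1]
```

OSIR's key change, as the abstract describes, is to let consecutive slices overlap rather than partition the sorted responses, which captures information from the derivatives of the inverse regression curve.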
ISBN: 9781392146460
Subjects--Topical Terms: Statistics.
MARC record:

LDR    03730nmm a2200325 4500
001    2209217
005    20191025102857.5
008    201008s2019 ||||||||||||||||| ||eng d
020    $a 9781392146460
035    $a (MiAaPQ)AAI13813625
035    $a (MiAaPQ)mtsu:11130
035    $a AAI13813625
040    $a MiAaPQ $c MiAaPQ
100 1  $a Zhang, Ning. $3 1035476
245 10 $a New Algorithms for Supervised Dimension Reduction.
260 1  $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2019
300    $a 98 p.
500    $a Source: Dissertations Abstracts International, Volume: 80-11, Section: B.
500    $a Publisher info.: Dissertation/Thesis.
500    $a Advisor: Wu, Qiang.
502    $a Thesis (Ph.D.)--Middle Tennessee State University, 2019.
506    $a This item must not be sold to any third party vendors.
520    $a Advances in data collection and storage capabilities during the past decades have led to information overload in most sciences and ushered in a big data era. Data of large volume and high dimensionality have become ubiquitous in many scientific domains. They present many mathematical challenges as well as some opportunities, and are bound to give rise to new theoretical developments. Dimension reduction aims to find low-dimensional representations of high-dimensional data. It promotes understanding of the data structure through visualization and enhances the predictive performance of machine learning algorithms by mitigating the "curse of dimensionality." As high-dimensional data become ubiquitous in modern sciences, dimension reduction methods play increasingly important roles in data analysis. The contribution of this dissertation is a set of new algorithms for supervised dimension reduction that handle high-dimensional data more efficiently. The first new algorithm is overlapping sliced inverse regression (OSIR). Sliced inverse regression (SIR) is a pioneering tool for supervised dimension reduction. It identifies the subspace of significant factors with intrinsically lower dimensionality, known as the effective dimension reduction (EDR) space. OSIR refines SIR through an overlapping slicing scheme and can estimate the EDR space and determine the number of effective factors more accurately. We show that the overlapping procedure has the potential to identify the information contained in the derivatives of the inverse regression curve, which helps to explain the superiority of OSIR. We prove that the OSIR algorithm is √n-consistent. We also propose the use of bagging and bootstrapping techniques to further improve the accuracy of OSIR. Online learning has attracted great attention due to the increasing demand for systems that can learn and evolve. When the data to be processed are also high dimensional, and dimension reduction is necessary for visualization or prediction enhancement, online dimension reduction plays an essential role. We propose four new online learning approaches for supervised dimension reduction: incremental sliced inverse regression, covariance-free incremental sliced inverse regression, incremental overlapping sliced inverse regression, and covariance-free incremental overlapping sliced inverse regression. All four methods can update the EDR space quickly and efficiently as new observations arrive. The effectiveness and efficiency of all four algorithms are verified by simulations and real-data applications.
590    $a School code: 0170.
650  4 $a Statistics. $3 517247
650  4 $a Computer science. $3 523869
690    $a 0463
690    $a 0984
710 2  $a Middle Tennessee State University. $b College of Basic & Applied Sciences. $3 2098587
773 0  $t Dissertations Abstracts International $g 80-11B.
790    $a 0170
791    $a Ph.D.
792    $a 2019
793    $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=13813625
Holdings (1 item):
Barcode: W9385766
Location: Electronic resources
Circulation category: 11. Online reading_V
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Attachments: 0