Dimension reduction in regression analysis.
Record type:
Bibliographic - language material, printed : Monograph/item
Title / Author:
Dimension reduction in regression analysis.
Author:
Ye, Zhishen.
Extent:
121 p.
Note:
Chair: Robert E. Weiss.
Contained By:
Dissertation Abstracts International, 62-02B.
Subject:
Biology, Biostatistics.
Electronic resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3005944
ISBN:
049315115X
MARC record:
LDR    03260nam 2200313 a 45
001    929182
005    20110427
008    110427s2001 eng d
020    $a 049315115X
035    $a (UnM)AAI3005944
035    $a AAI3005944
040    $a UnM $c UnM
100 1  $a Ye, Zhishen. $3 1252667
245 10 $a Dimension reduction in regression analysis.
300    $a 121 p.
500    $a Chair: Robert E. Weiss.
500    $a Source: Dissertation Abstracts International, Volume: 62-02, Section: B, page: 0634.
502    $a Thesis (Ph.D.)--University of California, Los Angeles, 2001.
520    $a Dimension reduction in regression analysis reduces the dimension of the p-dimensional predictor vector x without specifying a parametric model and without loss of information on the regression of y on x. The problem is how to identify the low-dimensional projection of x. The projection can be represented by the k eigenvectors corresponding to the nonzero eigenvalues of a rank-k, p-by-p matrix (k < p).
520    $a In this dissertation, we introduce the concepts of a target matrix, which is a population p-by-p matrix to be estimated, and estimation methods, which estimate the target matrix from data. We present a new perspective on three existing methods: SIR (Li, 1991), SAVE (Cook and Weisberg, 1991), and pHd (Li, 1992). Their target matrices and estimation methods are identified and distinguished. A system is built to identify and construct more target matrices and therefore more potential methods. SIR, SAVE, and pHd are unified as special cases of a broad new class of methods. In particular, we propose methods based on linear combinations of known target matrices.
520    $a Because there are now so many methods of dimension reduction, we introduce methodology to select among different target matrices and estimation methods. A k-dimensional estimate is given by the first k eigenvectors of an estimated target matrix. The variability of the estimate is defined and assessed using a resampling plan.
520    $a In general, no target matrix is guaranteed to identify the entire k-dimensional projection of x. Assuming the response y is categorical and the predictors x given y are normally distributed, the SAVE target matrix is shown to recover the entire k-dimensional projection. We introduce Bayesian modeling techniques as a new estimation method for estimating a target matrix, and study the behavior of various posterior estimates of the SAVE matrix under different priors. Examples with small sample sizes illustrate that Bayesian techniques can be useful.
590    $a School code: 0031.
650  4 $a Biology, Biostatistics. $3 1018416
650  4 $a Statistics. $3 517247
690    $a 0308
690    $a 0463
710 20 $a University of California, Los Angeles. $3 626622
773 0  $t Dissertation Abstracts International $g 62-02B.
790    $a 0031
790 10 $a Weiss, Robert E., $e advisor
791    $a Ph.D.
792    $a 2001
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3005944
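The core computation described in the abstract (520 fields above) — estimating a target matrix from data and taking its leading k eigenvectors as the dimension-reduction directions — can be illustrated with sliced inverse regression (SIR; Li, 1991), one of the methods the abstract names. The following is a minimal sketch, not code from the dissertation; the function name, slice count, and toy data are assumptions made only for illustration.

# Illustrative SIR sketch (assumption: not the dissertation's own code).
import numpy as np

def sir_directions(x, y, k=1, n_slices=10):
    """Estimate k dimension-reduction directions for the regression of y on x."""
    n, p = x.shape
    # Standardize the predictors: z = (x - mean) Sigma^{-1/2}.
    mu = x.mean(axis=0)
    sigma = np.cov(x, rowvar=False)
    evals, evecs = np.linalg.eigh(sigma)
    sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    z = (x - mu) @ sigma_inv_sqrt

    # SIR target matrix: weighted covariance of the slice means of z,
    # where slices group observations with similar values of y.
    order = np.argsort(y)
    m = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        zbar = z[idx].mean(axis=0)
        m += (len(idx) / n) * np.outer(zbar, zbar)

    # The first k eigenvectors of the estimated target matrix give the
    # projection; map them back to the scale of the original x.
    w, v = np.linalg.eigh(m)
    top = np.argsort(w)[::-1][:k]
    return sigma_inv_sqrt @ v[:, top]

# Toy usage: y depends on x only through one linear combination b'x.
rng = np.random.default_rng(0)
b = np.array([1.0, -1.0, 0.0, 0.0, 0.0])
x = rng.normal(size=(500, 5))
y = x @ b + 0.25 * (x @ b) ** 3 + 0.1 * rng.normal(size=500)
print(sir_directions(x, y, k=1).ravel())  # roughly proportional to b

With the toy response depending on x only through a single linear combination, the printed direction should be close to b up to sign and scale; SAVE and pHd, also cited in the abstract, replace the slice-mean target matrix with other population matrices but use the same eigenvector step.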
Holdings:
Barcode: W9100486
Location: Electronic resources
Circulation category: 11. Online reading_V
Material type: E-book
Call number: EB W9100486
Usage type: Normal
Loan status: On shelf
Holds: 0