Information theory and learning: A physical approach.
Record type: Bibliographic - electronic resource : Monograph/item
Title/Author: Information theory and learning: A physical approach.
Author: Nemenman, Ilya Mark.
Physical description: 131 p.
Notes: Source: Dissertation Abstracts International, Volume: 61-08, Section: B, page: 4202.
Contained by: Dissertation Abstracts International, 61-08B.
Subjects: Physics, General.; Statistics.
Electronic resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=9981559
ISBN: 0599880384
LDR 02461nmm 2200277 4500
001 1857414
005 20041123145129.5
008 130614s2000 eng d
020 $a 0599880384
035 $a (UnM)AAI9981559
035 $a AAI9981559
040 $a UnM $c UnM
100 1 $a Nemenman, Ilya Mark. $3 1945133
245 1 0 $a Information theory and learning: A physical approach.
300 $a 131 p.
500 $a Source: Dissertation Abstracts International, Volume: 61-08, Section: B, page: 4202.
500 $a Adviser: William Bialek.
502 $a Thesis (Ph.D.)--Princeton University, 2000.
520 $a We try to establish a unified information theoretic approach to learning and to explore some of its applications. First, we define predictive information as the mutual information between the past and the future of a time series, discuss its behavior as a function of the length of the series, and explain how other quantities of interest studied previously in learning theory---as well as in dynamical systems and statistical mechanics---emerge from this universally definable concept. We then prove that predictive information provides the unique measure for the complexity of dynamics underlying the time series and show that there are classes of models characterized by power-law growth of the predictive information that are qualitatively more complex than any of the systems that have been investigated before. Further, we investigate numerically the learning of a nonparametric probability density, which is an example of a problem with power-law complexity, and show that the proper Bayesian formulation of this problem provides for the 'Occam' factors that punish overly complex models and thus allow one to learn not only a solution within a specific model class, but also the class itself using the data only and with very few a priori assumptions. We study a possible information theoretic method that regularizes the learning of an undersampled discrete variable, and show that learning in such a setup goes through stages of very different complexities. Finally, we discuss how all of these ideas may be useful in various problems in physics, statistics, and, most importantly, biology.
590 $a School code: 0181.
650 4 $a Physics, General. $3 1018488
650 4 $a Statistics. $3 517247
690 $a 0605
690 $a 0463
710 2 0 $a Princeton University. $3 645579
773 0 $t Dissertation Abstracts International $g 61-08B.
790 1 0 $a Bialek, William, $e advisor
790 $a 0181
791 $a Ph.D.
792 $a 2000
856 4 0 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=9981559
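The 520 abstract turns on a single definition: predictive information as the mutual information between a series' past and its future. As a reader's aid, the following is a minimal LaTeX sketch of that definition and of the Bayesian evidence integral behind the 'Occam' factors the abstract invokes; the notation (I_pred, T, theta, and the window convention) is assumed for illustration and is not taken from the record.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Predictive information, as described in the 520 abstract: the mutual
% information between the past and the future of a time series. The
% symbols below are illustrative assumptions, not the dissertation's
% own notation.
Let $x_{\mathrm{past}}$ be the signal over a window of duration $T$
before time $t$, and $x_{\mathrm{future}}$ the signal after $t$. The
predictive information is their mutual information,
\begin{equation}
  I_{\mathrm{pred}}(T) = I(x_{\mathrm{past}};\, x_{\mathrm{future}})
  = \left\langle \log_2
      \frac{P(x_{\mathrm{future}} \mid x_{\mathrm{past}})}
           {P(x_{\mathrm{future}})} \right\rangle ,
\end{equation}
and the model classes the abstract calls qualitatively more complex are
those for which this quantity grows as a power law,
$I_{\mathrm{pred}}(T) \sim T^{\alpha}$ with $0 < \alpha < 1$, rather
than staying bounded or growing logarithmically.
% The `Occam' factors arise from the Bayesian evidence for a model
% class: integrating out the parameters $\theta$ penalizes classes
% with more flexibility than the data can support.
\begin{equation}
  P(\mathrm{data} \mid \mathrm{class})
  = \int \! d\theta \;
    P(\mathrm{data} \mid \theta, \mathrm{class})\,
    P(\theta \mid \mathrm{class}).
\end{equation}
\end{document}

The second equation is the standard Bayesian evidence; its saddle-point evaluation yields the volume ('Occam') factors that suppress overly complex model classes, which is the mechanism the abstract credits for letting the data select not just a solution within a class but the class itself.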
Holdings:
Barcode: W9176114
Location: Electronic resources
Circulation category: 11. Online reading
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Holds: 0