Time series modeling with hidden variables and gradient-based algorithms.
Record type: Bibliographic - language material, printed : Monograph/item
Title/Author: Time series modeling with hidden variables and gradient-based algorithms.
Author: Mirowski, Piotr.
Pagination: 215 p.
Notes: Source: Dissertation Abstracts International, Volume: 72-06, Section: B, page: .
Contained by: Dissertation Abstracts International, 72-06B.
Subject: Applied Mathematics.
Electronic resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3445313
ISBN: 9781124544687
LDR  03614nam 2200397 4500
001  1401649
005  20111017084418.5
008  130515s2011 ||||||||||||||||| ||eng d
020      $a 9781124544687
035      $a (UMI)AAI3445313
035      $a AAI3445313
040      $a UMI $c UMI
100  1   $a Mirowski, Piotr. $3 1680795
245  10  $a Time series modeling with hidden variables and gradient-based algorithms.
300      $a 215 p.
500      $a Source: Dissertation Abstracts International, Volume: 72-06, Section: B, page: .
500      $a Adviser: Yann LeCun.
502      $a Thesis (Ph.D.)--New York University, 2011.
520      $a We collect time series from real-world phenomena, such as gene interactions in biology or word frequencies in consecutive news articles. However, these data present us with an incomplete picture, as they result from complex dynamical processes involving unobserved state variables. Research on state-space models is motivated by simultaneously trying to infer hidden state variables from observations, as well as learning the associated dynamic and generative models.
520      $a To address this problem, I have developed tractable, gradient-based methods for training Dynamic Factor Graphs (DFG) with continuous latent variables. DFGs consist of (potentially highly nonlinear) factors modeling joint probabilities between hidden and observed variables. My hypothesis is that a principled inference of hidden variables is achievable in the energy-based framework, through gradient-based optimization to find the minimum-energy state sequence given observations. This enables higher-order nonlinearities than graphical models. Maximum likelihood learning is done by minimizing the expected energy over training sequences with respect to the factors' parameters. These alternated inference and parameter updates constitute a deterministic EM-like procedure.
520      $a Using nonlinear factors such as deep, convolutional networks, DFGs were shown to reconstruct chaotic attractors, to outperform a time series prediction benchmark, and to successfully impute motion capture data in the presence of occlusions. In joint work with the NYU Plant Systems Biology Lab, DFGs have subsequently been applied to the discovery of gene regulation networks by learning the dynamics of mRNA expression levels.
520      $a DFGs have also been extended into a deep auto-encoder architecture for time-stamped text documents, with word frequencies as inputs. I focused on collections of documents exhibiting temporal structure. Working as dynamic topic models, DFGs could extract latent trajectories from consecutive political speeches; applied to news articles, they achieved state-of-the-art text categorization and retrieval performance.
520      $a Finally, I used DFGs to evaluate the likelihood of discrete sequences of words in text corpora, relying on dynamics on word embeddings. Collaborating with AT&T Labs Research on a project in speech recognition, we have improved on existing continuous statistical language models by enriching them with word features and long-range topic dependencies.
590      $a School code: 0146.
650   4  $a Applied Mathematics. $3 1669109
650   4  $a Biology, Bioinformatics. $3 1018415
650   4  $a Artificial Intelligence. $3 769149
650   4  $a Computer Science. $3 626642
690      $a 0364
690      $a 0715
690      $a 0800
690      $a 0984
710  2   $a New York University. $b Computer Science. $3 1065424
773  0   $t Dissertation Abstracts International $g 72-06B.
790  10  $a LeCun, Yann, $e advisor
790  10  $a Shasha, Dennis $e committee member
790  10  $a Bangalore, Srinivas $e committee member
790  10  $a Bregler, Chris $e committee member
790  10  $a Pavlovic, Vladimir $e committee member
790      $a 0146
791      $a Ph.D.
792      $a 2011
856  40  $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3445313
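The 520 abstract fields above describe the core training loop of the thesis: gradient-based inference of the minimum-energy hidden-state sequence given the observations, alternated with gradient updates of the factor parameters, forming a deterministic EM-like procedure. The sketch below is only a rough illustration of that alternation on a toy state-space model; the linear factors, variable names, dimensions, and step sizes are assumptions made for brevity and are not taken from the dissertation, which uses nonlinear (e.g. deep, convolutional) factors.

# Hypothetical minimal sketch (not code from the dissertation): alternating
# gradient-based inference of hidden states and gradient updates of the
# factor parameters, in the spirit of a deterministic EM-like procedure.
# Linear dynamics and observation factors are used only to keep it short.
import numpy as np

rng = np.random.default_rng(0)

T, D, K = 200, 5, 3                     # time steps, observed dim, latent dim
Y = rng.standard_normal((T, D))         # toy observation sequence (stand-in data)

A = 0.1 * rng.standard_normal((K, K))   # dynamics factor parameters: z_t ~ A z_{t-1}
C = 0.1 * rng.standard_normal((D, K))   # observation factor parameters: y_t ~ C z_t
Z = np.zeros((T, K))                    # hidden state sequence to be inferred
lam = 1.0                               # weight of the observation energy term

def energy(Z, A, C):
    """Total energy: dynamics residuals plus weighted observation residuals."""
    e_dyn = np.sum((Z[1:] - Z[:-1] @ A.T) ** 2)
    e_obs = lam * np.sum((Y - Z @ C.T) ** 2)
    return e_dyn + e_obs

def grad_Z(Z, A, C):
    """Gradient of the energy with respect to the hidden state sequence."""
    g = np.zeros_like(Z)
    r_dyn = Z[1:] - Z[:-1] @ A.T        # residuals z_t - A z_{t-1}
    g[1:] += 2 * r_dyn                  # z_t as the "current" state of its factor
    g[:-1] += -2 * r_dyn @ A            # z_{t-1} as the "previous" state
    g += -2 * lam * (Y - Z @ C.T) @ C   # observation factor
    return g

for it in range(201):
    # Inference (E-like) step: relax the hidden states with parameters fixed.
    for _ in range(5):
        Z -= 1e-2 * grad_Z(Z, A, C)
    # Learning (M-like) step: update the factor parameters with states fixed.
    r_dyn = Z[1:] - Z[:-1] @ A.T
    r_obs = Y - Z @ C.T
    A -= 1e-3 * (-2 * r_dyn.T @ Z[:-1])
    C -= 1e-3 * (-2 * lam * r_obs.T @ Z)
    if it % 50 == 0:
        print(f"iter {it:3d}  energy {energy(Z, A, C):.3f}")

The alternation amounts to coordinate descent on a single energy: relaxing the states with parameters held fixed plays the role of the E-step, and updating the parameters with states held fixed plays the role of the M-step.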
Holdings (1 item):
Barcode: W9164788
Location: Electronic Resources
Circulation category: 11. Online Reading_V
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Hold requests: 0