Time series modeling with hidden variables and gradient-based algorithms.
Record Type: Language materials, printed : Monograph/item
Title: Time series modeling with hidden variables and gradient-based algorithms.
Author: Mirowski, Piotr.
Description: 215 p.
Notes: Source: Dissertation Abstracts International, Volume: 72-06, Section: B, page: .
Contained By: Dissertation Abstracts International, 72-06B.
Subject: Applied Mathematics.
Online resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3445313
ISBN: 9781124544687
LDR    03614nam 2200397 4500
001    1401649
005    20111017084418.5
008    130515s2011 ||||||||||||||||| ||eng d
020    $a 9781124544687
035    $a (UMI)AAI3445313
035    $a AAI3445313
040    $a UMI $c UMI
100 1  $a Mirowski, Piotr. $3 1680795
245 10 $a Time series modeling with hidden variables and gradient-based algorithms.
300    $a 215 p.
500    $a Source: Dissertation Abstracts International, Volume: 72-06, Section: B, page: .
500    $a Adviser: Yann LeCun.
502    $a Thesis (Ph.D.)--New York University, 2011.
520    $a We collect time series from real-world phenomena, such as gene interactions in biology or word frequencies in consecutive news articles. However, these data present us with an incomplete picture, as they result from complex dynamical processes involving unobserved state variables. Research on state-space models is motivated by the need to simultaneously infer hidden state variables from observations and learn the associated dynamic and generative models.
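A state-space model of the kind the abstract refers to couples a hidden state that evolves through a dynamic model with observations produced by a generative model. A minimal sketch of that setup follows; the specific functions f and g, the dimensions and the noise levels are illustrative assumptions only, not details taken from the thesis.

    import numpy as np

    rng = np.random.default_rng(0)
    f = lambda z: np.tanh(0.9 * z)                 # assumed dynamic model on the hidden state
    g = lambda z: 2.0 * z                          # assumed generative (observation) model

    z = rng.normal(size=3)                         # unobserved state variables
    observations = []
    for t in range(100):
        z = f(z) + 0.1 * rng.normal(size=3)        # latent dynamics with process noise
        observations.append(g(z) + 0.1 * rng.normal(size=3))   # what we actually measure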
520    $a To address this problem, I have developed tractable, gradient-based methods for training Dynamic Factor Graphs (DFG) with continuous latent variables. DFGs consist of (potentially highly nonlinear) factors modeling joint probabilities between hidden and observed variables. My hypothesis is that a principled inference of hidden variables is achievable in the energy-based framework, through gradient-based optimization to find the minimum-energy state sequence given observations. This enables higher-order nonlinearities than graphical models. Maximum likelihood learning is done by minimizing the expected energy over training sequences with respect to the factors' parameters. These alternated inference and parameter updates constitute a deterministic EM-like procedure.
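A rough illustration of this alternated, gradient-based procedure is sketched below: hidden states are inferred by minimizing an energy with respect to the state sequence, then the factors' parameters are updated to lower that energy. The quadratic energy terms, the tiny PyTorch factor networks, and all sizes and learning rates are assumptions made for the sketch; they are not the dissertation's actual architecture.

    import torch

    T, obs_dim, lat_dim = 100, 8, 3
    X = torch.randn(T, obs_dim)                              # observed sequence (placeholder data)

    f = torch.nn.Sequential(torch.nn.Linear(lat_dim, 16), torch.nn.Tanh(),
                            torch.nn.Linear(16, lat_dim))    # dynamical factor
    g = torch.nn.Linear(lat_dim, obs_dim)                    # generative factor
    Z = torch.zeros(T, lat_dim, requires_grad=True)          # hidden state sequence to infer

    def energy(Z, X):
        dyn = ((Z[1:] - f(Z[:-1])) ** 2).sum()               # dynamics prediction error
        gen = ((X - g(Z)) ** 2).sum()                        # reconstruction error
        return dyn + gen

    opt_z = torch.optim.SGD([Z], lr=1e-2)                    # inference over hidden states
    opt_w = torch.optim.SGD(list(f.parameters()) + list(g.parameters()), lr=1e-3)

    for epoch in range(200):
        for _ in range(20):                                  # E-like step: minimum-energy states
            opt_z.zero_grad()
            energy(Z, X).backward()
            opt_z.step()
        opt_w.zero_grad()                                    # M-like step: update factor parameters
        energy(Z, X).backward()
        opt_w.step()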
520    $a Using nonlinear factors such as deep, convolutional networks, DFGs were shown to reconstruct chaotic attractors, to outperform a time series prediction benchmark, and to successfully impute motion capture data in the presence of occlusions. In joint work with the NYU Plant Systems Biology Lab, DFGs have subsequently been applied to the discovery of gene regulation networks by learning the dynamics of mRNA expression levels.
520    $a DFGs have also been extended into a deep auto-encoder architecture for time-stamped text documents, with word frequencies as inputs. I focused on collections of documents exhibiting temporal structure. Working as dynamic topic models, DFGs could extract latent trajectories from consecutive political speeches; applied to news articles, they achieved state-of-the-art text categorization and retrieval performance.
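In the spirit of the dynamic topic modelling described above, the sketch below reconstructs word-frequency vectors from low-dimensional latent codes while a smoothness penalty ties consecutive codes into a trajectory. The vocabulary size, number of topics, linear encoder/decoder and penalty weight are assumptions for illustration, not the architecture reported in the thesis.

    import torch

    T, vocab, topics = 50, 1000, 10
    docs = torch.rand(T, vocab)                      # word frequencies, one row per time step

    enc = torch.nn.Linear(vocab, topics)
    dec = torch.nn.Linear(topics, vocab)
    opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)

    for step in range(500):
        z = torch.tanh(enc(docs))                    # latent topic trajectory over time
        recon = ((dec(z) - docs) ** 2).mean()        # auto-encoder reconstruction term
        smooth = ((z[1:] - z[:-1]) ** 2).mean()      # temporal dynamics (smoothness) term
        loss = recon + 0.1 * smooth
        opt.zero_grad()
        loss.backward()
        opt.step()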
520    $a Finally, I used DFGs to evaluate the likelihood of discrete sequences of words in text corpora, relying on dynamics over word embeddings. Collaborating with AT&T Labs Research on a project in speech recognition, we improved on existing continuous statistical language models by enriching them with word features and long-range topic dependencies.
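As a rough picture of such a continuous language model, the sketch below scores the next word from embeddings of the preceding words concatenated with a document-level topic vector standing in for long-range dependencies. The vocabulary size, context length, layer sizes and the way the topic vector is produced are all assumptions; they are not taken from the dissertation or from the AT&T system.

    import torch

    vocab, emb_dim, ctx, topic_dim = 5000, 64, 3, 10
    embed = torch.nn.Embedding(vocab, emb_dim)
    scorer = torch.nn.Sequential(
        torch.nn.Linear(ctx * emb_dim + topic_dim, 128),
        torch.nn.Tanh(),
        torch.nn.Linear(128, vocab))

    def next_word_log_probs(prev_ids, topic_vec):
        # prev_ids: (batch, ctx) indices of the previous words
        # topic_vec: (batch, topic_dim) long-range topic features
        e = embed(prev_ids).flatten(1)               # concatenated context embeddings
        return torch.log_softmax(scorer(torch.cat([e, topic_vec], dim=1)), dim=1)

    lp = next_word_log_probs(torch.randint(0, vocab, (4, ctx)), torch.randn(4, topic_dim))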
590    $a School code: 0146.
650  4 $a Applied Mathematics. $3 1669109
650  4 $a Biology, Bioinformatics. $3 1018415
650  4 $a Artificial Intelligence. $3 769149
650  4 $a Computer Science. $3 626642
690    $a 0364
690    $a 0715
690    $a 0800
690    $a 0984
710 2  $a New York University. $b Computer Science. $3 1065424
773 0  $t Dissertation Abstracts International $g 72-06B.
790 10 $a LeCun, Yann, $e advisor
790 10 $a Shasha, Dennis $e committee member
790 10 $a Bangalore, Srinivas $e committee member
790 10 $a Bregler, Chris $e committee member
790 10 $a Pavlovic, Vladimir $e committee member
790    $a 0146
791    $a Ph.D.
792    $a 2011
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3445313
Items (1 record)
Inventory Number: W9164788
Location Name: 電子資源 (Electronic Resources)
Item Class: 11.線上閱覽_V (online reading)
Material type: 電子書 (e-book)
Call number: EB
Usage Class: 一般使用 (Normal)
Loan Status: On shelf
No. of reservations: 0
Opac note:
Attachments: