Low-rank graphical models and Bayesian inference in the statistical analysis of noisy neural data.
Record type:
Bibliographic - Electronic resource : Monograph/item
Title/Author:
Low-rank graphical models and Bayesian inference in the statistical analysis of noisy neural data.
Author:
Smith, Carl.
Publisher:
Ann Arbor : ProQuest Dissertations & Theses, 2013
Pagination:
136 p.
Notes:
Source: Dissertation Abstracts International, Volume: 75-02(E), Section: B.
Contained By:
Dissertation Abstracts International, 75-02B(E).
Subject:
Statistics.
Electronic resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3598394
ISBN:
9781303469688
MARC record:
LDR    03571nmm a2200301 4500
001    2159201
005    20180622095236.5
008    190424s2013 ||||||||||||||||| ||eng d
020    $a 9781303469688
035    $a (MiAaPQ)AAI3598394
035    $a (MiAaPQ)columbia:11623
035    $a AAI3598394
040    $a MiAaPQ $c MiAaPQ
100 1  $a Smith, Carl. $3 2147537
245 10 $a Low-rank graphical models and Bayesian inference in the statistical analysis of noisy neural data.
260 1  $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2013
300    $a 136 p.
500    $a Source: Dissertation Abstracts International, Volume: 75-02(E), Section: B.
500    $a Adviser: Liam Paninski.
502    $a Thesis (Ph.D.)--Columbia University, 2013.
520    $a We develop new methods of Bayesian inference, largely in the context of analysis of neuroscience data. The work is broken into several parts. In the first part, we introduce a novel class of joint probability distributions in which exact inference is tractable. Previously it has been difficult to find general constructions for models in which efficient exact inference is possible, outside of certain classical cases. We identify a class of such models that are tractable owing to a certain "low-rank" structure in the potentials that couple neighboring variables. In the second part we develop methods to quantify and measure information loss in analysis of neuronal spike train data due to two types of noise, making use of the ideas developed in the first part. Information about neuronal identity or temporal resolution may be lost during spike detection and sorting, or precision of spike times may be corrupted by various effects. We quantify the information lost due to these effects for the relatively simple but sufficiently broad class of Markovian model neurons. We find that decoders that model the probability distribution of spike-neuron assignments significantly outperform decoders that use only the most likely spike assignments. We also apply the ideas of the low-rank models from the first section to defining a class of prior distributions over the space of stimuli (or other covariate) which, by conjugacy, preserve the tractability of inference. In the third part, we treat Bayesian methods for the estimation of sparse signals, with application to the locating of synapses in a dendritic tree. We develop a compartmentalized model of the dendritic tree. Building on previous work that applied and generalized ideas of least angle regression to obtain a fast Bayesian solution to the resulting estimation problem, we describe two other approaches to the same problem, one employing a horseshoe prior and the other using various spike-and-slab priors. In the last part, we revisit the low-rank models of the first section and apply them to the problem of inferring orientation selectivity maps from noisy observations of orientation preference. The relevant low-rank model exploits the self-conjugacy of the von Mises distribution on the circle. Because the orientation map model is loopy, we cannot do exact inference on the low-rank model by the forward backward algorithm, but block-wise Gibbs sampling by the forward backward algorithm speeds mixing. We explore another von Mises coupling potential Gibbs sampler that proves to effectively smooth noisily observed orientation maps.
590    $a School code: 0054.
650  4 $a Statistics. $3 517247
650  4 $a Neurosciences. $3 588700
690    $a 0463
690    $a 0317
710 2  $a Columbia University. $b Chemical Physics. $3 3192599
773 0  $t Dissertation Abstracts International $g 75-02B(E).
790    $a 0054
791    $a Ph.D.
792    $a 2013
793    $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3598394
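The last part of the abstract (MARC field 520 above) turns on the self-conjugacy of the von Mises distribution: a product of von Mises potentials in the same angle is again von Mises, because the complex natural parameters kappa*exp(i*mu) simply add. The sketch below is a minimal illustration of that idea as a site-wise Gibbs sampler that smooths a noisily observed angle map under pairwise von Mises coupling. It is not the dissertation's block-wise forward-backward sampler; the function name gibbs_orientation_map, the four-neighbour grid coupling, and the parameters kappa_obs and kappa_pair are assumptions made only for this example.

import numpy as np

def gibbs_orientation_map(y, kappa_obs=2.0, kappa_pair=4.0, n_iter=200, rng=None):
    """Illustrative site-wise Gibbs sampler for a von Mises Markov random field.

    y          : 2-D array of noisy angle observations, in radians.
    kappa_obs  : concentration of the observation (noise) potential (assumed).
    kappa_pair : concentration of the pairwise smoothing potential (assumed).

    Each full conditional is again von Mises because the complex natural
    parameters kappa * exp(1j * mu) of the coupled potentials add.
    """
    rng = np.random.default_rng() if rng is None else rng
    y = np.asarray(y, dtype=float)
    theta = y.copy()
    H, W = y.shape
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                # Natural parameter contributed by the observation at this site ...
                z = kappa_obs * np.exp(1j * y[i, j])
                # ... plus contributions from the four grid neighbours.
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < H and 0 <= nj < W:
                        z += kappa_pair * np.exp(1j * theta[ni, nj])
                # Full conditional: von Mises(mu = arg z, kappa = |z|).
                theta[i, j] = rng.vonmises(np.angle(z), np.abs(z))
    return theta

Orientation preferences are axial (defined modulo pi), so in practice one would double the angles before sampling and halve the result, a standard device in directional statistics. The block-wise sampler described in the abstract instead uses the forward-backward algorithm to draw whole blocks of sites at once, which speeds mixing.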
Holdings:
Barcode: W9358748
Location: Electronic resources
Circulation category: 11.線上閱覽_V (online viewing)
Material type: E-book
Call number: EB
Use type: Normal (一般使用)
Loan status: On shelf
Holds: 0