Building Theories of Neural Circuits with Machine Learning.
Record type:
Bibliographic - Electronic resource : Monograph/item
Title/Author:
Building Theories of Neural Circuits with Machine Learning./
Author:
Bittner, Sean Robert.
Description:
1 online resource (167 pages)
Notes:
Source: Dissertations Abstracts International, Volume: 83-02, Section: B.
Contained By:
Dissertations Abstracts International, 83-02B.
Subject:
Neurosciences.
Electronic resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28650141 (click for full text, PQDT)
ISBN:
9798535510484
Building Theories of Neural Circuits with Machine Learning.
LDR 03916nmm a2200385K 4500
001 2357222
005 20230622065020.5
006 m o d
007 cr mn ---uuuuu
008 241011s2021 xx obm 000 0 eng d
020 $a 9798535510484
035 $a (MiAaPQ)AAI28650141
035 $a AAI28650141
040 $a MiAaPQ $b eng $c MiAaPQ $d NTU
100 1 $a Bittner, Sean Robert. $3 3697752
245 1 0 $a Building Theories of Neural Circuits with Machine Learning.
264 0 $c 2021
300 $a 1 online resource (167 pages)
336 $a text $b txt $2 rdacontent
337 $a computer $b c $2 rdamedia
338 $a online resource $b cr $2 rdacarrier
500 $a Source: Dissertations Abstracts International, Volume: 83-02, Section: B.
500 $a Advisor: Cunningham, John.
502 $a Thesis (Ph.D.)--Columbia University, 2021.
504 $a Includes bibliographical references
520 $a As theoretical neuroscience has grown as a field, machine learning techniques have played an increasingly important role in the development and evaluation of theories of neural computation. Today, machine learning is used in a variety of neuroscientific contexts, from statistical inference to neural network training to normative modeling. This dissertation introduces machine learning techniques for use across the various domains of theoretical neuroscience, and the application of these techniques to build theories of neural circuits. First, we introduce a variety of optimization techniques for normative modeling of neural activity, which were used to evaluate theories of primary motor cortex (M1) and supplementary motor area (SMA). Specifically, neural responses during a cycling task performed by monkeys displayed distinctive dynamical geometries, which motivated hypotheses of how these geometries conferred computational properties necessary for the robust production of cyclic movements. By using normative optimization techniques to predict neural responses encoding muscle activity while ascribing to an "untangled" geometry, we found that minimal tangling was an accurate model of M1. Analyses with trajectory-constrained RNNs showed that such an organization of M1 neural activity confers noise robustness, and that minimally "divergent" trajectories in SMA enable the tracking of contextual factors. In the remainder of the dissertation, we focus on the introduction and application of deep generative modeling techniques for theoretical neuroscience. Specifically, both techniques employ recent advancements in approaches to deep generative modeling -- normalizing flows -- to capture complex parametric structure in neural models. The first technique, which is designed for statistical generative models, enables look-up inference in intractable exponential family models. The efficiency of this technique is demonstrated by inferring neural firing rates in a log-Gaussian Poisson model of spiking responses to drift gratings in primary visual cortex. The second technique is designed for statistical inference in mechanistic models, where the inferred parameter distribution is constrained to produce emergent properties of computation. Once fit, the deep generative model confers analytic tools for quantifying the parametric structure giving rise to emergent properties. This technique was used for novel scientific insight into the nature of neuron-type variability in primary visual cortex and of distinct connectivity regimes of rapid task switching in superior colliculus.
533 $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2023
538 $a Mode of access: World Wide Web
650 4 $a Neurosciences. $3 588700
650 4 $a Computer science. $3 523869
650 4 $a Artificial intelligence. $3 516317
650 4 $a Physiology. $3 518431
650 4 $a Population. $3 518693
650 4 $a Neurons. $3 588699
650 4 $a Brain research. $3 3561789
650 4 $a Optimization. $3 891104
650 4 $a Neural networks. $3 677449
650 4 $a Insects. $3 516323
650 4 $a Crustaceans. $3 3548330
650 4 $a Realism. $3 528996
650 4 $a Motion detectors. $3 2111511
653 $a Machine learning
653 $a Neural circuits
653 $a Theoretical neuroscience
655 7 $a Electronic books. $2 lcsh $3 542853
690 $a 0317
690 $a 0984
690 $a 0800
690 $a 0719
710 2 $a ProQuest Information and Learning Co. $3 783688
710 2 $a Columbia University. $b Neurobiology and Behavior. $3 3173198
773 0 $t Dissertations Abstracts International $g 83-02B.
856 4 0 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28650141 $z click for full text (PQDT)
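The 520 abstract above mentions inferring neural firing rates in a log-Gaussian Poisson model of spiking responses. As a point of reference only, the sketch below shows what such a generative model looks like in generic Python/NumPy; it is not the dissertation's implementation, and the sizes, rate scale, and variable names (n_neurons, log_rate, spikes) are illustrative assumptions.

# Minimal sketch of a log-Gaussian Poisson spiking model (illustrative only;
# not taken from the dissertation). Latent log firing rates are drawn from a
# Gaussian prior, and spike counts are Poisson given those rates.
import numpy as np

rng = np.random.default_rng(0)

n_neurons, n_trials = 50, 100          # assumed sizes, for illustration
mu, sigma = np.log(5.0), 0.5           # assumed mean/spread of log rates

# Latent log rates, one per neuron (log-Gaussian prior on the rate).
log_rate = rng.normal(mu, sigma, size=n_neurons)
rate = np.exp(log_rate)                # expected spikes per trial window

# Observed spike counts: independent Poisson draws per neuron and trial.
spikes = rng.poisson(rate[:, None], size=(n_neurons, n_trials))

# Simple moment-based check: empirical mean counts should track the latent
# rates that generated them.
print(np.corrcoef(rate, spikes.mean(axis=1))[0, 1])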
Holdings:
Barcode: W9479578
Location: Electronic resources
Circulation category: 11.線上閱覽_V (online reading)
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Holds: 0