Learning and coding in biological neural networks.

Record type: Bibliographic record - electronic resource : Monograph/item
Title/Author: Learning and coding in biological neural networks.
Author: Fiete, Ila Rani.
Description: 133 p.
Notes: Source: Dissertation Abstracts International, Volume: 65-05, Section: B, page: 2271.
Contained by: Dissertation Abstracts International, 65-05B.
Subject: Biology, Neuroscience.
Electronic resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3131836
ISBN: 0496790765

LDR  03326nmm 2200325 4500
001  1844467
005  20051017073512.5
008  130614s2004 eng d
020    $a 0496790765
035    $a (UnM)AAI3131836
035    $a AAI3131836
040    $a UnM $c UnM
100 1  $a Fiete, Ila Rani. $3 1932660
245 10 $a Learning and coding in biological neural networks.
300    $a 133 p.
500    $a Source: Dissertation Abstracts International, Volume: 65-05, Section: B, page: 2271.
500    $a Chair: Daniel S. Fisher.
502    $a Thesis (Ph.D.)--Harvard University, 2004.
520    $a How can large groups of neurons that locally modify their activities learn to collectively perform a desired task? Do studies of learning in small networks tell us anything about learning in the fantastically large collection of neurons that make up a vertebrate brain? What factors do neurons optimize by encoding sensory inputs or motor commands in the way they do? In this thesis I present a collection of four theoretical works: each of the projects was motivated by specific constraints and complexities of biological neural networks, as revealed by experimental studies; together, they aim to partially address some of the central questions of neuroscience posed above.
520    $a We first study the role of sparse neural activity, as seen in the coding of sequential commands in a premotor area responsible for birdsong. We show that the sparse coding of temporal sequences in the songbird brain can, in a network where the feedforward plastic weights must translate the sparse sequential code into a time-varying muscle code, facilitate learning by minimizing synaptic interference.
520    $a Next, we propose a biologically plausible synaptic plasticity rule that can perform goal-directed learning in recurrent networks of voltage-based spiking neurons that interact through conductances. Learning is based on the correlation of noisy local activity with a global reward signal; we prove that this rule performs stochastic gradient ascent on the reward. Thus, if the reward signal quantifies network performance on some desired task, the plasticity rule provably drives goal-directed learning in the network.
520    $a To assess the convergence properties of the learning rule, we compare it with a known example of learning in the brain. Song-learning in finches is a clear example of a learned behavior, with detailed available neurophysiological data. With our learning rule, we train an anatomically accurate model birdsong network that drives a sound source to mimic an actual zebra finch song. Simulation and theoretical results on the scalability of this rule show that learning with stochastic gradient ascent may be adequately fast to explain learning in the bird.
520    $a Finally, we address the more general issue of the scalability of stochastic gradient learning on quadratic cost surfaces in linear systems, as a function of system size and task characteristics, by deriving analytical expressions for the learning curves.
590    $a School code: 0084.
650  4 $a Biology, Neuroscience. $3 1017680
650  4 $a Physics, General. $3 1018488
690    $a 0317
690    $a 0605
710 20 $a Harvard University. $3 528741
773 0  $t Dissertation Abstracts International $g 65-05B.
790 10 $a Fisher, Daniel S., $e advisor
790    $a 0084
791    $a Ph.D.
792    $a 2004
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3131836
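
The second abstract paragraph (520) above argues that a sparse sequential premotor code eases learning because the weight updates for different moments of the sequence do not interfere with one another. A minimal sketch of that idea, assuming a toy linear readout trained with an online delta rule; the dimensions, learning rate, and random targets below are illustrative choices, not the songbird model studied in the thesis:

```python
# Toy delta-rule comparison: a premotor layer drives M output units through
# plastic feedforward weights W.  With a sparse (one-hot) code, one premotor
# unit is active per time step, so each update touches a disjoint set of
# synapses; a dense code mixes all synapses into every update.
import numpy as np

rng = np.random.default_rng(0)
T, M = 50, 10                            # time steps per "song", output units
target = rng.standard_normal((T, M))     # desired output at each time step

codes = {
    "sparse (one-hot)": np.eye(T),                               # one unit per step
    "dense (random)":   rng.standard_normal((T, T)) / np.sqrt(T),
}

eta = 0.5                                # learning rate (illustrative)
for name, X in codes.items():            # X[t] = premotor activity at step t
    W = np.zeros((T, M))                 # plastic feedforward weights
    for epoch in range(20):
        for t in range(T):               # online delta-rule updates
            err = target[t] - X[t] @ W
            W += eta * np.outer(X[t], err)
    mse = np.mean((target - X @ W) ** 2)
    print(f"{name:17s}: output MSE after 20 passes = {mse:.2e}")
```

In this toy setting the one-hot code drives the error to essentially zero within a few passes, because each update touches a disjoint set of synapses, while the dense code is still far from the target after the same number of updates.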
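
The third abstract paragraph (520) describes a plasticity rule in which each synapse correlates noisy local activity with a single global reward signal, and which performs stochastic gradient ascent on the reward. The sketch below illustrates that general node-perturbation principle on a linear rate model with a made-up regression task; the network, reward function, baseline choice, and constants are assumptions for the illustration, not the conductance-based spiking rule analyzed in the thesis:

```python
# Toy node-perturbation sketch: every weight changes in proportion to the
# correlation between a global scalar reward and the noise injected into its
# postsynaptic unit, times the presynaptic activity.  Linear rate units and a
# made-up regression task stand in for the spiking network; eta, sigma, and
# the baseline choice are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out, B = 10, 3, 20
X = rng.standard_normal((n_in, B))              # fixed batch of inputs
W_target = rng.standard_normal((n_out, n_in))   # defines the desired task
U = W_target @ X                                # desired outputs

W = np.zeros((n_out, n_in))
eta, sigma = 1e-3, 0.2

def reward(Y):
    """Global scalar performance signal: negative mean squared output error."""
    return -np.mean(np.sum((U - Y) ** 2, axis=0))

for step in range(60001):
    Xi = sigma * rng.standard_normal((n_out, B))   # exploration noise in the activities
    R = reward(W @ X + Xi)                         # reward of the perturbed trial
    R0 = reward(W @ X)                             # baseline: unperturbed reward
    # Correlate (reward - baseline) with the local noise and presynaptic activity;
    # in expectation this update is gradient ascent on the reward.
    W += eta * (R - R0) * (Xi @ X.T) / sigma**2
    if step % 10000 == 0:
        print(f"step {step:6d}: reward = {R0:8.3f}, "
              f"weight MSE = {np.mean((W - W_target) ** 2):.4f}")
```

Only a scalar reward is broadcast to all synapses; everything else in the update is local (the presynaptic activity and the noise injected into the postsynaptic unit). In expectation the update points along the reward gradient, so the printed reward climbs toward its noise-limited maximum.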
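
The final abstract paragraph (520) concerns how stochastic gradient learning scales on quadratic cost surfaces in linear systems as the system grows. The toy experiment below probes that scaling empirically for perturbation-based gradient estimates on the simplest quadratic cost (identity Hessian); the step-size rule, perturbation amplitude, and stopping criterion are assumptions for the sketch, not the analytical learning curves derived in the thesis:

```python
# Toy scaling experiment: perturbation-based stochastic gradient descent on the
# quadratic cost E(w) = 0.5 * ||w - w_opt||^2 for several system sizes N.  Each
# trial probes the cost with one random perturbation, so the gradient estimate
# carries information about only one direction at a time.
import numpy as np

rng = np.random.default_rng(2)
sigma = 1e-3            # amplitude of the probing perturbation
target_error = 1e-2     # stop once the cost falls below this value

def cost(w, w_opt):
    """Simplest quadratic cost surface (identity Hessian)."""
    return 0.5 * np.sum((w - w_opt) ** 2)

for N in (10, 50, 200):
    w_opt = rng.standard_normal(N)        # optimum of the cost surface
    w = np.zeros(N)
    eta = 1.0 / (N + 2)                   # near-optimal step size for this cost
    steps = 0
    while cost(w, w_opt) > target_error and steps < 500_000:
        xi = rng.standard_normal(N)                          # random probe direction
        dE = cost(w + sigma * xi, w_opt) - cost(w, w_opt)    # cost change it causes
        w -= eta * (dE / sigma) * xi                         # stochastic gradient step
        steps += 1
    print(f"N = {N:4d}: {steps:7d} perturbation trials to reach cost < {target_error}")
```

The printed trial counts grow roughly in proportion to the number of parameters N (times a slowly growing logarithmic factor), reflecting that each trial yields only a single scalar measurement of the cost.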

Holdings (1 item):
Barcode: W9193981
Location: Electronic resources
Circulation category: 11. Online reading_V
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Holds: 0