Learning and coding in biological neural networks.
Record Type: Electronic resources : Monograph/item
Title/Author: Learning and coding in biological neural networks.
Author: Fiete, Ila Rani.
Description: 133 p.
Notes: Source: Dissertation Abstracts International, Volume: 65-05, Section: B, page: 2271.
Contained By: Dissertation Abstracts International, 65-05B.
Subject: Biology, Neuroscience.
Online resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3131836
ISBN: 0496790765
MARC record:
LDR  03326nmm 2200325 4500
001  1844467
005  20051017073512.5
008  130614s2004 eng d
020  $a 0496790765
035  $a (UnM)AAI3131836
035  $a AAI3131836
040  $a UnM $c UnM
100 1 $a Fiete, Ila Rani. $3 1932660
245 1 0 $a Learning and coding in biological neural networks.
300  $a 133 p.
500  $a Source: Dissertation Abstracts International, Volume: 65-05, Section: B, page: 2271.
500  $a Chair: Daniel S. Fisher.
502  $a Thesis (Ph.D.)--Harvard University, 2004.
520  $a How can large groups of neurons that locally modify their activities learn to collectively perform a desired task? Do studies of learning in small networks tell us anything about learning in the fantastically large collection of neurons that make up a vertebrate brain? What factors do neurons optimize by encoding sensory inputs or motor commands in the way they do? In this thesis I present a collection of four theoretical works: each of the projects was motivated by specific constraints and complexities of biological neural networks, as revealed by experimental studies; together, they aim to partially address some of the central questions of neuroscience posed above.
520  $a We first study the role of sparse neural activity, as seen in the coding of sequential commands in a premotor area responsible for birdsong. We show that the sparse coding of temporal sequences in the songbird brain can, in a network where the feedforward plastic weights must translate the sparse sequential code into a time-varying muscle code, facilitate learning by minimizing synaptic interference.
520  $a Next, we propose a biologically plausible synaptic plasticity rule that can perform goal-directed learning in recurrent networks of voltage-based spiking neurons that interact through conductances. Learning is based on the correlation of noisy local activity with a global reward signal; we prove that this rule performs stochastic gradient ascent on the reward. Thus, if the reward signal quantifies network performance on some desired task, the plasticity rule provably drives goal-directed learning in the network.
520  $a To assess the convergence properties of the learning rule, we compare it with a known example of learning in the brain. Song-learning in finches is a clear example of a learned behavior, with detailed available neurophysiological data. With our learning rule, we train an anatomically accurate model birdsong network that drives a sound source to mimic an actual zebrafinch song. Simulation and theoretical results on the scalability of this rule show that learning with stochastic gradient ascent may be adequately fast to explain learning in the bird.
520  $a Finally, we address the more general issue of the scalability of stochastic gradient learning on quadratic cost surfaces in linear systems, as a function of system size and task characteristics, by deriving analytical expressions for the learning curves.
590  $a School code: 0084.
650 4 $a Biology, Neuroscience. $3 1017680
650 4 $a Physics, General. $3 1018488
690  $a 0317
690  $a 0605
710 2 0 $a Harvard University. $3 528741
773 0 $t Dissertation Abstracts International $g 65-05B.
790 1 0 $a Fisher, Daniel S., $e advisor
790  $a 0084
791  $a Ph.D.
792  $a 2004
856 4 0 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3131836
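The second abstract paragraph above attributes the learning benefit of a sparse sequential premotor code to reduced synaptic interference in the plastic feedforward weights. The sketch below is an editorial illustration of that general idea, not code from the dissertation; the network sizes, the dense comparison code, the learning rate, and the delta-rule update are all illustrative assumptions.

```python
# Minimal sketch (illustration only): how a sparse sequential premotor code
# limits synaptic interference when plastic feedforward weights must map it
# onto a time-varying output. All sizes and parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
T, N_pre, N_out = 40, 40, 8                 # time steps, premotor units, output units
target = rng.standard_normal((T, N_out))    # desired time-varying "muscle" output

codes = {
    "sparse (one unit active per step)": np.eye(T, N_pre),
    "dense (all units active)": rng.uniform(0.5, 1.0, size=(T, N_pre)),
}

for name, X in codes.items():
    W = np.zeros((N_pre, N_out))            # plastic feedforward weights
    before = X @ W                          # outputs at every time step before learning
    err = target[0] - X[0] @ W              # delta-rule update for time step 0 only
    W += 0.02 * np.outer(X[0], err)
    after = X @ W
    # interference: how much the outputs at the *other* time steps moved
    interference = np.mean(np.abs(after[1:] - before[1:]))
    print(f"{name}: interference at untrained time steps = {interference:.4f}")
```

With the one-hot code, the update for one time step touches only the weights from the single active unit, so the printed interference is exactly zero; the dense code spreads the same update across every other time step.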
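The third abstract paragraph describes a plasticity rule that correlates noisy local activity with a global scalar reward and provably performs stochastic gradient ascent on that reward. The sketch below illustrates the same perturbation-and-reward-correlation principle on a deliberately simplified linear rate model rather than the conductance-based spiking network of the thesis; the task, the paired noiseless baseline, and all parameters are assumptions introduced for the illustration.

```python
# Minimal sketch (illustration only, not the thesis's spiking model):
# correlate injected output noise with a global reward signal; in
# expectation the update follows the gradient of the reward.
import numpy as np

rng = np.random.default_rng(1)
N_in, N_out, trials = 20, 5, 400
W_target = rng.standard_normal((N_in, N_out))   # implicitly defines the desired task
W = np.zeros((N_in, N_out))
lr, noise_sd = 0.01, 0.1

def reward(x, y):
    """Global scalar performance signal: negative squared error on the task."""
    return -np.sum((y - x @ W_target) ** 2)

for k in range(trials):
    x = rng.standard_normal(N_in)
    xi = noise_sd * rng.standard_normal(N_out)   # local exploratory noise
    r_noisy = reward(x, x @ W + xi)              # reward with the noise injected
    r_base = reward(x, x @ W)                    # simplified baseline: a paired noiseless trial
    # correlate the noise with the reward change: stochastic gradient ascent on reward
    W += lr * (r_noisy - r_base) / noise_sd**2 * np.outer(x, xi)
    if k % 100 == 0 or k == trials - 1:
        print(f"trial {k:3d}: baseline reward = {r_base:9.3f}")
```

Averaged over the noise, the update is proportional to the gradient of the reward with respect to the weights, which is why the baseline reward climbs toward zero over the trials.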
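The final abstract paragraph concerns learning curves for stochastic gradient learning on quadratic cost surfaces in linear systems as a function of system size. The sketch below does not reproduce the thesis's analytical expressions; it only measures empirical learning curves for a perturbation-based stochastic gradient rule on an isotropic quadratic cost, with the step-size scaling and the 10% criterion chosen purely for illustration.

```python
# Minimal sketch (illustration only): empirical learning curves for
# weight-perturbation stochastic gradient descent on a quadratic cost,
# for several system sizes N. All parameters are assumptions.
import numpy as np

rng = np.random.default_rng(2)

def learning_curve(N, steps=5000, sigma=0.01):
    """Minimize E(w) = 0.5 * ||w - w_star||^2 by weight perturbation."""
    lr = 0.5 / N                               # step size scaled with N to keep updates stable
    w_star = rng.standard_normal(N)
    w = np.zeros(N)

    def cost(v):
        return 0.5 * np.sum((v - w_star) ** 2)

    c0 = cost(w)
    curve = []
    for _ in range(steps):
        xi = sigma * rng.standard_normal(N)    # random perturbation of the weights
        delta_E = cost(w + xi) - cost(w)       # cost change caused by the perturbation
        w -= lr * (delta_E / sigma**2) * xi    # correlate perturbation with cost change
        curve.append(cost(w) / c0)             # normalized learning curve
    return np.array(curve)

for N in (10, 100, 1000):
    curve = learning_curve(N)
    # first step at which the normalized cost drops below 10%
    hit = int(np.argmax(curve < 0.1)) if np.any(curve < 0.1) else None
    print(f"N = {N:4d}: steps to reach 10% of the initial cost = {hit}")
```

With the step size scaled as 0.5/N for stability, the number of steps to reach the criterion grows roughly in proportion to N in this toy setting.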
Items (1 record):
Inventory Number: W9193981
Location Name: Electronic Resources (電子資源)
Item Class: 11.線上閱覽_V (Online Reading)
Material Type: E-book (電子書)
Call Number: EB
Usage Class: Normal (一般使用)
Loan Status: On shelf
No. of Reservations: 0