Approximate Bayesian Methods for Optimal Neural Coding and Decision-Making.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Approximate Bayesian Methods for Optimal Neural Coding and Decision-Making. / Morais, Michael J.
Author:
Morais, Michael J.
Published:
Ann Arbor : ProQuest Dissertations & Theses, 2021.
Description:
139 p.
Notes:
Source: Dissertations Abstracts International, Volume: 82-12, Section: B.
Contained By:
Dissertations Abstracts International, 82-12B.
Subject:
Neurosciences.
Online resource:
https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28496964
ISBN:
9798515256630
Morais, Michael J.
Approximate Bayesian Methods for Optimal Neural Coding and Decision-Making. - Ann Arbor : ProQuest Dissertations & Theses, 2021. - 139 p.
Source: Dissertations Abstracts International, Volume: 82-12, Section: B.
Thesis (Ph.D.)--Princeton University, 2021.
This item must not be sold to any third party vendors.
One fundamental goal of theoretical neuroscience is to understand the normative principles governing the functional organization of neural circuits and, in turn, to what extent they can be considered optimal. Calling neural representations of information in the brain "optimal" implies a multifarious equilibrium that balances robustness against flexibility, completeness against relevance, and so on, but it need only imply a solution to some optimization program. The exact forms of these programs vary with the modeling goals, neural circuits, tasks, or even animals under investigation. With this dissertation, we explore how we can define neural codes as optimal when they generate optimal behavior: an easy principle to state, but a hard one to implement. Such a principle would bridge the gap between two classical hypotheses of optimal neural coding, efficient coding and the Bayesian brain, with a common unified theory.
In the first study, we analyzed neural population activity in V1 while monkeys performed a visual detection task, and found that a majority of the total choice-related variability is already present in V1 population activity. Such a prominent contribution of non-stimulus activity in classically sensory regions cannot be incorporated into existing models of neural coding, and demands models that can jointly optimize coding and decision-making within a single neural population.
In the second study, we derived power-law efficient codes, a natural generalization of classical efficient codes, and showed they are sufficient to replicate and explain a diverse set of psychophysical results. This broader family can maximize mutual information or minimize the error of perceptual decisions, suggesting that psychophysical phenomena used to validate normative models could be more general features of perceptual systems than previously appreciated.
In the third study, we translated the problem of joint model learning and decision-making into Bayesian machine learning, and extended a family of methods for decision-aware approximate inference to include a novel algorithm that we called loss-calibrated expectation propagation. How this problem can be solved by a non-biophysical system could be a constructive reference point for future studies of joint coding and decision-making, and of the normative principles that drive decision-related variability in optimal sensory neural codes.
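The second study summarized above has a compact quantitative core: a power-law efficient code allocates coding precision (Fisher information) as a power of the stimulus prior, which in turn predicts how discrimination thresholds vary across stimuli. The sketch below is illustrative only and is not code from the dissertation; the exponent convention (Fisher information proportional to prior**(2*alpha), thresholds proportional to prior**(-alpha)) and the function names are assumptions made for this example and may differ from the dissertation's exact parameterization.

# Illustrative sketch (not from the dissertation): a power-law efficient code
# allocates Fisher information J(x) as a power of the stimulus prior p(x), and
# the predicted discrimination threshold scales as 1/sqrt(J(x)).
# The convention J ∝ p**(2*alpha) is an assumption for this example.
import numpy as np

def power_law_code(prior, dx, alpha, total_fisher=1e4):
    """Allocate J(x) ∝ prior(x)**(2*alpha) under a fixed total information budget."""
    j_unnorm = prior ** (2.0 * alpha)
    return total_fisher * j_unnorm / np.sum(j_unnorm * dx)

def discrimination_threshold(fisher):
    """Cramer-Rao-style prediction: smallest discriminable change ~ 1/sqrt(J)."""
    return 1.0 / np.sqrt(fisher)

# Toy one-dimensional stimulus with a standard normal prior.
x = np.linspace(-5.0, 5.0, 1001)
dx = x[1] - x[0]
prior = np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

for alpha in (0.5, 1.0, 2.0):  # alpha = 1.0 plays the role of the classical infomax-style code here
    thresholds = discrimination_threshold(power_law_code(prior, dx, alpha))
    ratio = thresholds[np.argmin(np.abs(x - 2.0))] / thresholds[np.argmin(np.abs(x))]
    print(f"alpha={alpha:3.1f}: threshold at x=2 is {ratio:6.1f}x larger than at x=0")

In this toy setting the thresholds grow where the prior is small, that is, stimuli treated as rare are coded coarsely, and the exponent alpha controls how steep that trade-off is; varying alpha sweeps out the broader family of codes the abstract refers to.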
ISBN: 9798515256630
Subjects--Topical Terms: Neurosciences.
Subjects--Index Terms: Approximate inference
LDR    03604nmm a2200385 4500
001    2281660
005    20210920103611.5
008    220723s2021 ||||||||||||||||| ||eng d
020    $a 9798515256630
035    $a (MiAaPQ)AAI28496964
035    $a AAI28496964
040    $a MiAaPQ $c MiAaPQ
100 1  $a Morais, Michael J. $3 3560341
245 10 $a Approximate Bayesian Methods for Optimal Neural Coding and Decision-Making.
260 1  $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2021
300    $a 139 p.
500    $a Source: Dissertations Abstracts International, Volume: 82-12, Section: B.
500    $a Advisor: Pillow, Jonathan W.
502    $a Thesis (Ph.D.)--Princeton University, 2021.
506    $a This item must not be sold to any third party vendors.
520    $a One fundamental goal of theoretical neuroscience is to understand the normative principles governing the functional organization of neural circuits and, in turn, to what extent they can be considered optimal. Calling neural representations of information in the brain "optimal" implies a multifarious equilibrium that balances robustness against flexibility, completeness against relevance, and so on, but it need only imply a solution to some optimization program. The exact forms of these programs vary with the modeling goals, neural circuits, tasks, or even animals under investigation. With this dissertation, we explore how we can define neural codes as optimal when they generate optimal behavior: an easy principle to state, but a hard one to implement. Such a principle would bridge the gap between two classical hypotheses of optimal neural coding, efficient coding and the Bayesian brain, with a common unified theory. In the first study, we analyzed neural population activity in V1 while monkeys performed a visual detection task, and found that a majority of the total choice-related variability is already present in V1 population activity. Such a prominent contribution of non-stimulus activity in classically sensory regions cannot be incorporated into existing models of neural coding, and demands models that can jointly optimize coding and decision-making within a single neural population. In the second study, we derived power-law efficient codes, a natural generalization of classical efficient codes, and showed they are sufficient to replicate and explain a diverse set of psychophysical results. This broader family can maximize mutual information or minimize the error of perceptual decisions, suggesting that psychophysical phenomena used to validate normative models could be more general features of perceptual systems than previously appreciated. In the third study, we translated the problem of joint model learning and decision-making into Bayesian machine learning, and extended a family of methods for decision-aware approximate inference to include a novel algorithm that we called loss-calibrated expectation propagation. How this problem can be solved by a non-biophysical system could be a constructive reference point for future studies of joint coding and decision-making, and of the normative principles that drive decision-related variability in optimal sensory neural codes.
590    $a School code: 0181.
650  4 $a Neurosciences. $3 588700
650  4 $a Statistics. $3 517247
650  4 $a Logic. $3 529544
653    $a Approximate inference
653    $a Bayesian statistics
653    $a Decision-making
653    $a Efficient coding
653    $a Neural coding
653    $a Perception
690    $a 0317
690    $a 0463
690    $a 0395
710 2  $a Princeton University. $b Neuroscience. $3 2099004
773 0  $t Dissertations Abstracts International $g 82-12B.
790    $a 0181
791    $a Ph.D.
792    $a 2021
793    $a English
856 40 $u https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28496964
Items (1 record):
Inventory Number: W9433393
Location Name: Electronic resources (電子資源)
Item Class: 11. Online Reading_V (11.線上閱覽_V)
Material type: E-book (電子書)
Call number: EB
Usage Class: Normal (一般使用)
Loan Status: On shelf
No. of reservations: 0