Boltzmann Energetics and Temporal Dynamics of Learning Neuromorphic Systems.
Record type: Bibliographic - electronic resource : Monograph/item
Title: Boltzmann Energetics and Temporal Dynamics of Learning Neuromorphic Systems.
Author: Umbria Pedroni, Bruno.
Publisher: Ann Arbor : ProQuest Dissertations & Theses, 2019
Extent: 222 p.
Note: Source: Dissertations Abstracts International, Volume: 80-10, Section: B.
Contained by: Dissertations Abstracts International, 80-10B.
Subject: Neurosciences.
Electronic resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=13806980
ISBN: 9781392020791
Thesis (Ph.D.)--University of California, San Diego, 2019.
The brain's cognitive power does not arise from exacting digital precision in high-performance computing, but emerges from an extremely efficient and resilient collective form of computation extending over very large ensembles of sluggish, imprecise, and unreliable analog components. In contrast to the reliable spike generation mechanism of cortical neurons, synapses are regarded as the primary source of this probabilistic behavior, owing to release failures and quantal fluctuations. It has been speculated that the overall power efficiency and noise tolerance of the brain result from this unreliability in communication between neurons. Inspired by the stochastic nature of brain dynamics, we present methods of exploiting these concepts to produce more efficient algorithms and systems in the realm of neuromorphic computing, offering links between two traditionally disjoint scientific disciplines: computational neuroscience, concerned with constructing models of brain function, and machine learning, concerned with realizing adaptive computational intelligence. The first part of the dissertation investigates extensions of the Boltzmann machine, a stochastic recurrent artificial neural network capable of learning probability distributions over its inputs. Boltzmann machines are interesting from a neuromorphic perspective owing to the local and Hebbian nature of their learning rule, along with the parallel processing between network layers with which biological neural networks also operate. Additionally, the neurons in these networks have probabilistic activation functions and communicate with binary events, similar to what has been observed in experimental recordings of neural data. In search of more biological plausibility in inference and learning, we present conditions for a significant equivalence between Boltzmann machines with contrastive divergence machine learning and integrate-and-fire neuronal networks with spike-timing-dependent plasticity (STDP).
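The Boltzmann machine training the abstract refers to can be sketched minimally as follows. This is an illustrative example, not code from the dissertation: the network sizes, learning rate, and the use of one-step contrastive divergence (CD-1) on a restricted (bipartite) Boltzmann machine are all hypothetical choices for demonstration. Note how the weight update is local and Hebbian — each weight change depends only on the co-activation of its own pre- and post-synaptic units in the "data" and "reconstruction" phases — and how units have probabilistic (sigmoid) activations and communicate with binary events.

```python
# Minimal restricted Boltzmann machine with CD-1 (illustrative sketch only;
# biases are omitted for brevity, and all sizes/parameters are hypothetical).
import math
import random

random.seed(0)

N_VISIBLE, N_HIDDEN = 6, 3
LEARNING_RATE = 0.1

# Weight matrix W[i][j] between visible unit i and hidden unit j.
W = [[random.gauss(0.0, 0.1) for _ in range(N_HIDDEN)] for _ in range(N_VISIBLE)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sample_hidden(v):
    """Probabilistic binary activation of each hidden unit given visible vector v."""
    probs = [sigmoid(sum(v[i] * W[i][j] for i in range(N_VISIBLE)))
             for j in range(N_HIDDEN)]
    return probs, [1 if random.random() < p else 0 for p in probs]

def sample_visible(h):
    """Probabilistic binary activation of each visible unit given hidden vector h."""
    probs = [sigmoid(sum(h[j] * W[i][j] for j in range(N_HIDDEN)))
             for i in range(N_VISIBLE)]
    return probs, [1 if random.random() < p else 0 for p in probs]

def cd1_update(v0):
    """One CD-1 step: compare pre/post co-activation in the data phase (v0, h0)
    against the one-step reconstruction phase (v1, h1) -- a local, Hebbian rule."""
    h0_probs, h0 = sample_hidden(v0)
    _, v1 = sample_visible(h0)
    h1_probs, _ = sample_hidden(v1)
    for i in range(N_VISIBLE):
        for j in range(N_HIDDEN):
            W[i][j] += LEARNING_RATE * (v0[i] * h0_probs[j] - v1[i] * h1_probs[j])

# Train on a single repeated binary pattern.
pattern = [1, 0, 1, 1, 0, 0]
for _ in range(100):
    cd1_update(pattern)
```

The bipartite connectivity is what allows the hidden (and visible) units to be sampled in parallel within a layer — the layer-parallel processing the abstract highlights.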
Next, we extend our methods to networks whose sole source of stochasticity is the synapse, showing that synaptic noise can provide an efficient means of sampling. We then investigate how STDP, the hallmark learning rule in spiking neural networks, can be performed using only forward connectivity access, and compare different data structures for organizing synaptic weights for memory efficiency. In the second part of the dissertation, we focus on learning neuromorphic systems and applications, including a methodology and an automation tool for implementing generative models of Boltzmann machines with digital spiking neurons. Next, we demonstrate how sparsely active neurons can produce efficient results in a small-footprint keyword spotting application. Lastly, we present our ongoing work in designing a very large-scale reconfigurable digital neuromorphic system, tailored for both the machine learning and computational neuroscience communities, which exploits the stochastic and temporal coding strategies developed in the first part of the dissertation and serves as an openly shared platform for further community-driven research in low-precision computation and event-driven processing.
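The STDP rule mentioned above is, in its textbook pairwise form, a function of the relative timing of a pre- and a post-synaptic spike. The sketch below is not the dissertation's implementation — the amplitudes and time constants are hypothetical values — but it illustrates the standard exponential timing windows: a pre-before-post pair potentiates the synapse, a post-before-pre pair depresses it.

```python
# Pairwise STDP with exponential timing windows (illustrative sketch only;
# amplitudes and time constants are hypothetical).
import math

A_PLUS, A_MINUS = 0.01, 0.012     # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # window time constants (ms)

def stdp_dw(t_pre, t_post):
    """Weight change for a single pre/post spike pair.
    dt > 0 (pre before post) potentiates; dt < 0 depresses."""
    dt = t_post - t_pre
    if dt > 0:
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    elif dt < 0:
        return -A_MINUS * math.exp(dt / TAU_MINUS)
    return 0.0

# Causal pair (pre at 10 ms, post at 15 ms) -> positive update (potentiation).
# Anti-causal pair (pre at 15 ms, post at 10 ms) -> negative update (depression).
```

Because the update for each synapse depends only on spike times available at that synapse, the rule is local — which is what makes implementations that traverse only forward (pre-to-post) connectivity, as studied in the dissertation, attractive for memory-efficient hardware.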
MARC record:
LDR    04424nmm a2200349 4500
001    2207093
005    20190913102448.5
008    201008s2019 ||||||||||||||||| ||eng d
020    $a 9781392020791
035    $a (MiAaPQ)AAI13806980
035    $a (MiAaPQ)ucsd:18154
035    $a AAI13806980
040    $a MiAaPQ $c MiAaPQ
100 1  $a Umbria Pedroni, Bruno. $3 3434033
245 10 $a Boltzmann Energetics and Temporal Dynamics of Learning Neuromorphic Systems.
260 1  $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2019
300    $a 222 p.
500    $a Source: Dissertations Abstracts International, Volume: 80-10, Section: B.
500    $a Publisher info.: Dissertation/Thesis.
500    $a Advisor: Cauwenberghs, Gert.
502    $a Thesis (Ph.D.)--University of California, San Diego, 2019.
506    $a This item must not be added to any third party search indexes.
506    $a This item must not be sold to any third party vendors.
590    $a School code: 0033.
650  4 $a Neurosciences. $3 588700
650  4 $a Computer Engineering. $3 1567821
650  4 $a Artificial intelligence. $3 516317
690    $a 0317
690    $a 0464
690    $a 0800
710 2  $a University of California, San Diego. $b Bioengineering. $3 1017915
773 0  $t Dissertations Abstracts International $g 80-10B.
790    $a 0033
791    $a Ph.D.
792    $a 2019
793    $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=13806980
Holdings (1 item):
Barcode: W9383642
Location: Electronic resources
Circulation category: 11.線上閱覽_V (online reading)
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Holds: 0