Multi-Scale Spatiotemporal Neural Computation: On the Relationship Between Dynamical Attractors Spiking Neural Networks and Convolutional Neural Circuits.
Record Type: Electronic resources : Monograph/item
Title/Author: Multi-Scale Spatiotemporal Neural Computation: On the Relationship Between Dynamical Attractors Spiking Neural Networks and Convolutional Neural Circuits.
Author: Diniz, Eduardo.
Published: Ann Arbor : ProQuest Dissertations & Theses, 2023.
Description: 181 p.
Notes: Source: Dissertations Abstracts International, Volume: 85-12, Section: B.
Contained By: Dissertations Abstracts International, 85-12B.
Subject: Behavior.
Online resource: https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=31095246
ISBN: 9798383009550
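The MARC record below carries the same bibliographic data under numeric field tags. As a rough illustration only (plain Python, no MARC library or this catalogue's software assumed; the tag-to-label correspondence follows common cataloguing practice rather than anything stated in this record), the sketch shows how a few of those tagged fields map onto the labels used above.

# Hypothetical mapping from MARC tags (as in the record below) to the labels
# used in this catalogue's labeled display. Field values are copied from the
# record itself; only the mapping is illustrative.
marc_fields = {
    "100": "Diniz, Eduardo.",
    "245": ("Multi-Scale Spatiotemporal Neural Computation: On the Relationship "
            "Between Dynamical Attractors Spiking Neural Networks and "
            "Convolutional Neural Circuits."),
    "260": "Ann Arbor : ProQuest Dissertations & Theses, 2023",
    "300": "181 p.",
    "020": "9798383009550",
    "856": "https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=31095246",
}
labels = {
    "100": "Author",
    "245": "Title/Author",
    "260": "Published",
    "300": "Description",
    "020": "ISBN",
    "856": "Online resource",
}
for tag, value in marc_fields.items():
    print(f"{labels[tag]}: {value}")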
MARC record:
LDR      03347nmm a2200349 4500
001      2398572
005      20240812064710.5
006      m o d
007      cr#unu||||||||
008      251215s2023 ||||||||||||||||| ||eng d
020      $a 9798383009550
035      $a (MiAaPQ)AAI31095246
035      $a (MiAaPQ)Pittsburgh45468
035      $a AAI31095246
040      $a MiAaPQ $c MiAaPQ
100 1    $a Diniz, Eduardo. $3 3768488
245 10   $a Multi-Scale Spatiotemporal Neural Computation: On the Relationship Between Dynamical Attractors Spiking Neural Networks and Convolutional Neural Circuits.
260 1    $a Ann Arbor : $b ProQuest Dissertations & Theses, $c 2023
300      $a 181 p.
500      $a Source: Dissertations Abstracts International, Volume: 85-12, Section: B.
500      $a Advisor: El-Jaroudi, Amro; Dallal, Ahmed; Sun, Mingui; Ibrahim, Tamer S.; Aizenstein, Howard J.; Mao, Zhi-Hong.
502      $a Thesis (Ph.D.)--University of Pittsburgh, 2023.
520      $a Understanding spatiotemporal neural dynamics and developing biologically inspired artificial neural networks remain open challenges in computational neuroscience. Critical gaps persist in elucidating cortical rhythms, memory consolidation, and biological networks' remarkable spatiotemporal processing capabilities. This dissertation hypothesizes that asymmetric connectivity and dedicated fast-slow processing pathways in neural systems enhance depth, robustness, and versatility in handling complex spatiotemporal patterns. Our first contribution is to elucidate how neurons communicate and synchronize activity via temporally precise spikes by examining the dynamics of spike-coding networks. Developing models of cortical neural oscillators reveals the origins of spontaneous transitions between active and silent states underlying slow-wave sleep rhythms, demonstrating how the intricate balance of excitation and inhibition orchestrates these oscillations. Our second contribution is to establish a mathematical equivalence between Hopfield networks' associative memory models and spike-coding networks by showing that fast and slow asymmetric connectivity weights induce equivalent cyclic attractor dynamics in both systems. Introducing asymmetric weights in slow connections enables both models to learn and generate complex temporal firing sequences, transitioning between quasi-attractor states representing stored memories. Simulations demonstrate the efficacy of spike-coding networks for encoding and retrieving temporal sequences while performing the n-back working memory task. Our third contribution is to harness the potential of generative adversarial networks for unpaired cross-modality translation from 3 Tesla to 7 Tesla magnetic resonance imaging. We propose a fast-slow convolutional network architecture to enhance translation performance by balancing local and global information processing. This dissertation makes significant contributions by elucidating brain mechanisms underlying rhythms and memory, and unifying foundational computational frameworks while extracting principles to improve artificial neural network design.
590      $a School code: 0178.
650  4   $a Behavior. $3 532476
650  4   $a Neurons. $3 588699
650  4   $a Memory. $3 522110
650  4   $a Brain research. $3 3561789
650  4   $a Neural networks. $3 677449
650  4   $a Neurosciences. $3 588700
650  4   $a Biology. $3 522710
650  4   $a Natural language processing. $3 1073412
650  4   $a Rhythm. $3 586705
650  4   $a System theory. $3 525574
650  4   $a Systems science. $3 3168411
690      $a 0800
690      $a 0317
690      $a 0306
690      $a 0790
710 2    $a University of Pittsburgh. $3 958527
773 0    $t Dissertations Abstracts International $g 85-12B.
790      $a 0178
791      $a Ph.D.
792      $a 2023
793      $a English
856 40   $u https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=31095246
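The abstract in field 520 above states that adding asymmetric weights to a slow pathway lets a Hopfield-style associative memory step through its stored patterns as a cyclic sequence of quasi-attractors. The sketch below is not taken from the dissertation; it is a minimal NumPy illustration of that general idea, in the spirit of classic sequence-generating Hopfield models, with arbitrary choices of pattern count, slow-path gain, and delay.

# Minimal sketch (not the dissertation's model): symmetric "fast" weights
# stabilize each stored pattern, while asymmetric weights acting through a
# slow, delayed pathway push the state from one pattern to the next,
# producing a cyclic sequence of quasi-attractors.
import numpy as np

rng = np.random.default_rng(0)
N, P = 256, 4                                  # neurons, stored patterns
xi = rng.choice([-1, 1], size=(P, N))          # random binary patterns

W_fast = xi.T @ xi / N                         # symmetric weights: pattern retrieval
W_slow = np.roll(xi, -1, axis=0).T @ xi / N    # asymmetric weights: pattern mu -> mu+1

gain, delay, T = 2.0, 6, 60                    # slow-path gain, delay in steps, run length
s = xi[0].copy()                               # start the network in the first pattern
buffer = [s.copy()] * delay                    # delayed copies feed the slow pathway

for t in range(T):
    field = W_fast @ s + gain * (W_slow @ buffer[0])
    s = np.where(field >= 0, 1, -1)            # synchronous sign update
    buffer = buffer[1:] + [s.copy()]
    overlaps = xi @ s / N                      # similarity to each stored pattern
    print(t, int(np.argmax(overlaps)), np.round(overlaps, 2))

With these settings the printed overlaps stay near 1 for one pattern at a time and hop to the next pattern roughly every "delay" updates; setting gain to 0 removes the asymmetric drive and leaves the network parked in the first pattern, the usual fixed-point behaviour of a symmetric Hopfield network.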
Items:
Inventory Number: W9506892
Location Name: 電子資源 (Electronic resources)
Item Class: 11.線上閱覽_V (online reading)
Material Type: 電子書 (e-book)
Call Number: EB
Usage Class: 一般使用 (Normal)
Loan Status: On shelf
No. of reservations: 0