Space-time computing with temporal neural networks
Record type: Bibliographic, electronic resource : Monograph/item
Title/Author: Space-time computing with temporal neural networks / James E. Smith.
Author: Smith, James E.
Other author: Martonosi, Margaret, editor.
Publisher: San Rafael, California : Morgan & Claypool Publishers, 2017.
Physical description: 1 online resource (243 p.)
Contents: Space-time computing with temporal neural networks -- Abstract, Keywords -- Contents -- Figure Credits -- Preface -- Acknowledgments -- Part I. Introduction to Space-Time Computing and Temporal Neural Networks -- Chapter 1. Introduction -- Chapter 2. Space-Time Computing -- Chapter 3. Biological Overview -- Part II. Modeling Temporal Neural Networks -- Chapter 4. Connecting TNNs with Biology -- Chapter 5. Neuron Modeling -- Chapter 6. Computing with Excitatory Neurons -- Chapter 7. System Architecture -- Part III. Extended Design Study: Clustering the MNIST Dataset -- Chapter 8. Simulator Implementation -- Chapter 9. Clustering the MNIST Dataset -- Chapter 10. Summary and Conclusions -- References -- Author Biography.
Subjects: Neural networks (Computer science); Computational neuroscience; Temporal databases; Space and time
Electronic resource: http://portal.igpublish.com/iglibrary/search/MCPB0006321.html (click for full text)
ISBN: 1627058907
Space-time computing with temporal neural networks [electronic resource] / James E. Smith. - 1st ed. - San Rafael, California : Morgan & Claypool Publishers, 2017. - 1 online resource (243 p.). - (Synthesis Lectures on Computer Architecture ; 39).
Includes bibliographical references and index.
Understanding and implementing the brain's computational paradigm is the one true grand challenge facing computer researchers. Not only are the brain's computational capabilities far beyond those of conventional computers, its energy efficiency is truly remarkable. This book, written from the perspective of a computer designer and targeted at computer researchers, is intended to give both background and lay out a course of action for studying the brain's computational paradigm. It contains a mix of concepts and ideas drawn from computational neuroscience, combined with those of the author. As background, relevant biological features are described in terms of their computational and communication properties. The brain's neocortex is constructed of massively interconnected neurons that compute and communicate via voltage spikes, and a strong argument can be made that precise spike timing is an essential element of the paradigm. Drawing from the biological features, a mathematics-based computational paradigm is constructed. The key feature is spiking neurons that perform communication and processing in space-time, with emphasis on time. In these paradigms, time is used as a freely available resource for both communication and computation. Neuron models are first discussed in general, and one is chosen for detailed development. Using the model, single-neuron computation is first explored. Neuron inputs are encoded as spike patterns, and the neuron is trained to identify input pattern similarities. Individual neurons are building blocks for constructing larger ensembles, referred to as "columns". These columns are trained in an unsupervised manner and operate collectively to perform the basic cognitive function of pattern clustering. Similar input patterns are mapped to a much smaller set of similar output patterns, thereby dividing the input patterns into identifiable clusters. Larger cognitive systems are formed by combining columns into a hierarchical architecture. These higher level architectures are the subject of ongoing study, and progress to date is described in detail in later chapters. Simulation plays a major role in model development, and the simulation infrastructure developed by the author is described.
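The abstract's central idea, that inputs are encoded as precisely timed spikes and a neuron responds earlier the more closely the incoming spike pattern matches what it has learned, can be illustrated with a very small sketch. The Python fragment below is illustrative only: the encoding scheme, the step-no-leak-style threshold neuron, and all weights and numbers are assumptions made here for clarity, not the specific neuron model or training method developed in the book.

import numpy as np

def encode_as_spike_times(intensities, t_max=8.0):
    # Temporal encoding: stronger inputs (closer to 1.0) spike earlier.
    intensities = np.clip(np.asarray(intensities, dtype=float), 0.0, 1.0)
    return t_max * (1.0 - intensities)

def first_output_spike(spike_times, weights, threshold, t_max=8.0, dt=0.1):
    # Each input adds its weight to the membrane potential once its spike
    # has arrived; the neuron fires at the first time the potential crosses
    # the threshold, so an earlier output spike indicates a stronger match.
    for t in np.arange(0.0, t_max + dt, dt):
        if weights[spike_times <= t].sum() >= threshold:
            return t
    return None  # no output spike within the time window

weights = np.array([1.0, 1.0, 0.2, 0.2])                 # hypothetical learned pattern
similar = encode_as_spike_times([0.9, 0.8, 0.1, 0.0])    # resembles the learned pattern
different = encode_as_spike_times([0.0, 0.1, 0.9, 0.8])  # does not resemble it
print(first_output_spike(similar, weights, threshold=1.5))    # fires early (~1.6)
print(first_output_spike(different, weights, threshold=1.5))  # fires late (~8.0)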
ISBN: 1627058907
Subjects--Topical Terms: Neural networks (Computer science)
LC Class. No.: QA76.87
Dewey Class. No.: 006.32
LDR    03870nmm a2200277 i 4500
001    2185922
006    m o d
007    cr cn|||||||||
008    200117s2017 cau ob 000 0 eng d
020    $a 1627058907
020    $a 1627059482
020    $a 9781627058902
020    $a 9781627059480
035    $a MCPB0006321
040    $a iG Publishing $b eng $e aacr2 $c iG Publishing
041 0  $a eng
050 00 $a QA76.87
082 04 $a 006.32
100 1  $a Smith, James E. $3 3399558
245 10 $a Space-time computing with temporal neural networks $h [electronic resource] / $c James E. Smith.
250    $a 1st ed.
260    $a San Rafael, California : $b Morgan & Claypool Publishers, $c 2017.
300    $a 1 online resource (243 p.)
490 1  $a Synthesis Lectures on Computer Architecture ; $v 39.
504    $a Includes bibliographical references and index.
505 0  $a Space-time computing with temporal neural networks -- Abstract, Keywords -- Contents -- Figure Credits -- Preface -- Acknowledgments -- Part I. Introduction to Space-Time Computing and Temporal Neural Networks -- Chapter 1. Introduction -- Chapter 2. Space-Time Computing -- Chapter 3. Biological Overview -- Part II. Modeling Temporal Neural Networks -- Chapter 4. Connecting TNNs with Biology -- Chapter 5. Neuron Modeling -- Chapter 6. Computing with Excitatory Neurons -- Chapter 7. System Architecture -- Part III. Extended Design Study: Clustering the MNIST Dataset -- Chapter 8. Simulator Implementation -- Chapter 9. Clustering the MNIST Dataset -- Chapter 10. Summary and Conclusions -- References -- Author Biography.
520 3  $a Understanding and implementing the brain's computational paradigm is the one true grand challenge facing computer researchers. Not only are the brain's computational capabilities far beyond those of conventional computers, its energy efficiency is truly remarkable. This book, written from the perspective of a computer designer and targeted at computer researchers, is intended to give both background and lay out a course of action for studying the brain's computational paradigm. It contains a mix of concepts and ideas drawn from computational neuroscience, combined with those of the author. As background, relevant biological features are described in terms of their computational and communication properties. The brain's neocortex is constructed of massively interconnected neurons that compute and communicate via voltage spikes, and a strong argument can be made that precise spike timing is an essential element of the paradigm. Drawing from the biological features, a mathematics-based computational paradigm is constructed. The key feature is spiking neurons that perform communication and processing in space-time, with emphasis on time. In these paradigms, time is used as a freely available resource for both communication and computation. Neuron models are first discussed in general, and one is chosen for detailed development. Using the model, single-neuron computation is first explored. Neuron inputs are encoded as spike patterns, and the neuron is trained to identify input pattern similarities. Individual neurons are building blocks for constructing larger ensembles, referred to as "columns". These columns are trained in an unsupervised manner and operate collectively to perform the basic cognitive function of pattern clustering. Similar input patterns are mapped to a much smaller set of similar output patterns, thereby dividing the input patterns into identifiable clusters. Larger cognitive systems are formed by combining columns into a hierarchical architecture. These higher level architectures are the subject of ongoing study, and progress to date is described in detail in later chapters. Simulation plays a major role in model development, and the simulation infrastructure developed by the author is described.
650  0 $a Neural networks (Computer science) $3 532070
650  0 $a Computational neuroscience. $3 610819
650  0 $a Temporal databases. $3 630602
650  0 $a Space and time. $3 526021
700 1  $a Martonosi, Margaret, $e editor. $3 3399559
830  0 $a Synthesis Lectures on Computer Architecture ; $v 39. $3 3399560
856 40 $u http://portal.igpublish.com/iglibrary/search/MCPB0006321.html $z click for full text
Holdings (1 item):
Barcode: W9372542
Location: Electronic Resources (電子資源)
Circulation category: 11.線上閱覽_V (Online Reading)
Material type: E-book
Call number: EB QA76.87
Use type: Normal
Loan status: On shelf
Attachments: 0