Information and entropy in neural networks and interacting systems.
Record Type: Language materials, printed : Monograph/item
Title/Author: Information and entropy in neural networks and interacting systems.
Author: Shafee, Fariel.
Description: 238 p.
Notes: Source: Dissertation Abstracts International, Volume: 69-12, Section: B, page: 7556.
Contained By: Dissertation Abstracts International, 69-12B.
Subject: Artificial Intelligence.
Online resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3338696
ISBN: 9780549933564
MARC:
LDR 02145nam 2200253 a 45
001 852556
005 20100630
008 100630s2009 ||||||||||||||||| ||eng d
020 $a 9780549933564
035 $a (UMI)AAI3338696
035 $a AAI3338696
040 $a UMI $c UMI
100 1 $a Shafee, Fariel. $3 1018487
245 10 $a Information and entropy in neural networks and interacting systems.
300 $a 238 p.
500 $a Source: Dissertation Abstracts International, Volume: 69-12, Section: B, page: 7556.
502 $a Thesis (Ph.D.)--Princeton University, 2009.
520 $a In this dissertation we present a study of certain characteristics of interacting systems that are related to information. The first is periodicity, correlation and other information-related properties of neural networks of integrate-and-fire type. We also form quasiclassical and quantum generalizations of such networks and identify the similarities and differences with the classical prototype. We indicate why entropy may be an important concept for a neural network and why a generalization of the definition of entropy may be required. Like neural networks, large ensembles of similar units that interact also need a generalization of classical information-theoretic concepts. We extend the concept of Shannon entropy in a novel way, which may be relevant when we have such interacting systems, and show how it differs from Shannon entropy and other generalizations, such as Tsallis entropy. We indicate how classical stochasticity may arise in interactions with an entangled environment in a quantum system in terms of Shannon's and generalized entropies and identify the differences. Such differences are also indicated in the use of certain prior probability distributions to fit data as per Bayesian rules. We also suggest possible quantum versions of pattern recognition, which is the principal goal of information processing in most neural networks.
590 $a School code: 0181.
650 4 $a Artificial Intelligence. $3 769149
650 4 $a Physics, General. $3 1018488
690 $a 0605
690 $a 0800
710 2 $a Princeton University. $3 645579
773 0 $t Dissertation Abstracts International $g 69-12B.
790 $a 0181
791 $a Ph.D.
792 $a 2009
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3338696
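The 520 abstract contrasts Shannon entropy with generalizations such as Tsallis entropy. As a minimal sketch of the standard textbook definitions only (not the dissertation's own extended entropy), assuming a discrete probability distribution, the two agree in the limit q → 1:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * ln(p_i), skipping zero-probability outcomes."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i**q) / (q - 1), defined for q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.25, 0.125, 0.125]
h = shannon_entropy(p)
# As q approaches 1, the Tsallis values converge to the Shannon value h.
for q in (2.0, 1.5, 1.1, 1.01, 1.001):
    print(f"q={q}: S_q={tsallis_entropy(p, q):.4f}  (Shannon H={h:.4f})")
```

The distribution `p` here is an arbitrary example, not data from the dissertation; the convergence it prints is the standard q → 1 limit relating the two definitions.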
Items
Inventory Number: W9069180
Location Name: 電子資源 (Electronic Resources)
Item Class: 11.線上閱覽_V (Online Reading)
Material type: 電子書 (E-book)
Call number: EB W9069180
Usage Class: 一般使用 (Normal)
Loan Status: On shelf
No. of reservations: 0