New foundations for information theory = logical entropy and Shannon entropy / by David Ellerman
Record Type: Bibliographic - electronic resource : Monograph/item
Title/Author: New foundations for information theory / by David Ellerman.
Other Title: Logical entropy and Shannon entropy
Author: Ellerman, David.
Publisher: Cham : Springer International Publishing, 2021.
Description: xiii, 113 p. : ill., digital ; 24 cm.
Series: SpringerBriefs in philosophy, ISSN 2211-4556
Contents: Logical entropy -- The relationship between logical entropy and Shannon entropy -- The compound notions for logical and Shannon entropies -- Further developments of logical entropy -- Logical Quantum Information Theory -- Conclusion -- Appendix: Introduction to the logic of partitions.
Contained By: Springer Nature eBook
Subject: Entropy (Information theory)
Electronic Resource: https://doi.org/10.1007/978-3-030-86552-8
ISBN: 9783030865528
Summary: This monograph offers a new foundation for information theory that is based on the notion of information-as-distinctions, being directly measured by logical entropy, and on the re-quantification as Shannon entropy, which is the fundamental concept for the theory of coding and communications. Information is based on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image of a random variable, so they represent the pre-probability notion of information. Then logical entropy is a probability measure on the information sets, the probability that on two independent trials, a distinction or "dit" of the partition will be obtained. The formula for logical entropy is a new derivation of an old formula that goes back to the early twentieth century and has been re-derived many times in different contexts. As a probability measure, all the compound notions of joint, conditional, and mutual logical entropy are immediate. The Shannon entropy (which is not defined as a measure in the sense of measure theory) and its compound notions are then derived from a non-linear dit-to-bit transform that re-quantifies the distinctions of a random variable in terms of bits--so the Shannon entropy is the average number of binary distinctions or bits necessary to make all the distinctions of the random variable. And, using a linearization method, all the set concepts in this logical information theory naturally extend to vector spaces in general--and to Hilbert spaces in particular--for quantum logical information theory, which provides the natural measure of the distinctions made in quantum measurement. Relatively short but dense in content, this work can be a reference for researchers and graduate students doing investigations in information theory, maximum entropy methods in physics, engineering, and statistics, and for all those with a special interest in a new approach to quantum information theory.
Standard No.: 10.1007/978-3-030-86552-8 (doi)
LC Class. No.: Q370 .E45 2021
Dewey Class. No.: 003.54
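The summary above characterizes logical entropy as the probability that two independent draws from a distribution p yield a distinction ("dit") of the partition, and Shannon entropy as its non-linear dit-to-bit re-quantification. A minimal Python sketch of the two measures, assuming the standard formulas h(p) = 1 - Σ pᵢ² and H(p) = Σ pᵢ log₂(1/pᵢ) (the book's own notation may differ, and the example distribution is hypothetical):

    import math

    def logical_entropy(p):
        # h(p) = 1 - sum(p_i^2): the probability that two independent
        # draws from p fall in different blocks, i.e. yield a "dit".
        return 1.0 - sum(pi * pi for pi in p)

    def shannon_entropy(p):
        # H(p) = sum(p_i * log2(1/p_i)): the average number of binary
        # distinctions (bits) per outcome, obtained from the non-linear
        # dit-to-bit transform 1 - p_i -> log2(1/p_i).
        return sum(pi * math.log2(1.0 / pi) for pi in p if pi > 0)

    p = [0.5, 0.25, 0.25]         # hypothetical example distribution
    print(logical_entropy(p))     # 0.625 = 1 - (0.25 + 0.0625 + 0.0625)
    print(shannon_entropy(p))     # 1.5 bits

As the summary notes, h(p) is a probability measure on the information sets in the measure-theoretic sense, while H(p) is not, so the compound notions (joint, conditional, mutual) are immediate for logical entropy but must be derived separately for Shannon entropy.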
MARC record:
LDR 03357nmm a2200337 a 4500
001 2253791
003 DE-He213
005 20211030150051.0
006 m d
007 cr nn 008maaau
008 220327s2021 sz s 0 eng d
020 $a 9783030865528 $q (electronic bk.)
020 $a 9783030865511 $q (paper)
024 7 $a 10.1007/978-3-030-86552-8 $2 doi
035 $a 978-3-030-86552-8
040 $a GP $c GP
041 0 $a eng
050 4 $a Q370 $b .E45 2021
072 7 $a HP $2 bicssc
072 7 $a PHI000000 $2 bisacsh
072 7 $a QD $2 thema
082 0 4 $a 003.54 $2 23
090 $a Q370 $b .E45 2021
100 1 $a Ellerman, David. $3 3490996
245 1 0 $a New foundations for information theory $h [electronic resource] : $b logical entropy and Shannon entropy / $c by David Ellerman.
260 $a Cham : $b Springer International Publishing : $b Imprint: Springer, $c 2021.
300 $a xiii, 113 p. : $b ill., digital ; $c 24 cm.
490 1 $a SpringerBriefs in philosophy, $x 2211-4556
505 0 $a Logical entropy -- The relationship between logical entropy and Shannon entropy -- The compound notions for logical and Shannon entropies -- Further developments of logical entropy -- Logical Quantum Information Theory -- Conclusion -- Appendix: Introduction to the logic of partitions.
520 $a This monograph offers a new foundation for information theory that is based on the notion of information-as-distinctions, being directly measured by logical entropy, and on the re-quantification as Shannon entropy, which is the fundamental concept for the theory of coding and communications. Information is based on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image of a random variable so they represent the pre-probability notion of information. Then logical entropy is a probability measure on the information sets, the probability that on two independent trials, a distinction or "dit" of the partition will be obtained. The formula for logical entropy is a new derivation of an old formula that goes back to the early twentieth century and has been re-derived many times in different contexts. As a probability measure, all the compound notions of joint, conditional, and mutual logical entropy are immediate. The Shannon entropy (which is not defined as a measure in the sense of measure theory) and its compound notions are then derived from a non-linear dit-to-bit transform that re-quantifies the distinctions of a random variable in terms of bits--so the Shannon entropy is the average number of binary distinctions or bits necessary to make all the distinctions of the random variable. And, using a linearization method, all the set concepts in this logical information theory naturally extend to vector spaces in general--and to Hilbert spaces in particular--for quantum logical information theory which provides the natural measure of the distinctions made in quantum measurement. Relatively short but dense in content, this work can be a reference to researchers and graduate students doing investigations in information theory, maximum entropy methods in physics, engineering, and statistics, and to all those with a special interest in a new approach to quantum information theory.
650 0 $a Entropy (Information theory) $3 664113
650 1 4 $a Philosophy, general. $3 2162493
650 2 4 $a Coding and Information Theory. $3 891252
710 2 $a SpringerLink (Online service) $3 836513
773 0 $t Springer Nature eBook
830 0 $a SpringerBriefs in philosophy. $3 1568619
856 4 0 $u https://doi.org/10.1007/978-3-030-86552-8
950 $a Mathematics and Statistics (SpringerNature-11649)
Holdings (1 item):
Barcode: W9410313
Location: Electronic Resources
Circulation Category: 11. Online Reading_V
Material Type: E-book
Call Number: EB Q370 .E45 2021
Use Type: Normal
Loan Status: On shelf
Hold Status: 0