Dynamic network representation based on latent factorization of tensors
Record type:
Bibliographic, electronic resource : Monograph/item
Title/Author:
Dynamic network representation based on latent factorization of tensors / by Hao Wu, Xuke Wu, Xin Luo.
Author:
Wu, Hao.
Other authors:
Wu, Xuke.
Luo, Xin.
Publisher:
Singapore : Springer Nature Singapore : Imprint: Springer, 2023.
Description:
viii, 80 p. : ill. (chiefly col.), digital ; 24 cm.
Series:
SpringerBriefs in computer science, ISSN 2191-5776
Contents:
Chapter 1 Introduction -- Chapter 2 Multiple Biases-Incorporated Latent Factorization of Tensors -- Chapter 3 PID-Incorporated Latent Factorization of Tensors -- Chapter 4 Diverse Biases Nonnegative Latent Factorization of Tensors -- Chapter 5 ADMM-Based Nonnegative Latent Factorization of Tensors -- Chapter 6 Perspectives and Conclusion.
Contained by:
Springer Nature eBook
Subject:
Computer networks -- Mathematical models.
Electronic resource:
https://doi.org/10.1007/978-981-19-8934-6
ISBN:
9789811989346
Abstract:
A dynamic network is frequently encountered in various real industrial applications, such as the Internet of Things. It is composed of numerous nodes and large-scale dynamic real-time interactions among them, where each node indicates a specified entity, each directed link indicates a real-time interaction, and the strength of an interaction can be quantified as the weight of a link. As the involved nodes increase drastically, it becomes impossible to observe their full interactions at each time slot, making the resultant dynamic network High-Dimensional and Incomplete (HDI). Despite its HDI nature, an HDI dynamic network with directed and weighted links contains rich knowledge regarding the involved nodes' various behavior patterns. Therefore, it is essential to study how to build efficient and effective representation learning models for acquiring useful knowledge. In this book, we first model a dynamic network as an HDI tensor and present the basic latent factorization of tensors (LFT) model. Then, we propose four representative LFT-based network representation methods. The first method integrates the short-time bias, long-time bias, and preprocessing bias to precisely represent the volatility of network data. The second method utilizes a proportional-integral-derivative (PID) controller to construct an adjusted instance error and achieve a higher convergence rate. The third method considers the non-negativity of fluctuating network data by constraining the latent features to be non-negative and incorporating the extended linear bias. The fourth method adopts an alternating direction method of multipliers (ADMM) framework to build a learning model for representing dynamic networks with high precision and efficiency.
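The core idea described above can be illustrated with a minimal sketch: represent an incomplete node × node × time tensor by per-node and per-slot latent factor vectors, and train on the observed entries only. This is an illustrative CP-style factorization with stochastic gradient descent on synthetic data; the dimensions, rank, learning rate, and regularization weight are assumed values, and the book's actual models further add bias terms, PID-adjusted errors, nonnegativity constraints, or ADMM updates.

```python
import numpy as np

# Illustrative latent factorization of an incomplete third-order tensor
# (node x node x time). Synthetic data; all hyperparameters are assumptions.
rng = np.random.default_rng(0)
N, T, R = 20, 8, 4                      # nodes, time slots, latent rank

# Ground-truth low-rank tensor; keep only ~10% of entries "observed" (HDI).
U_t, V_t, W_t = (rng.normal(size=(d, R)) for d in (N, N, T))
Y = np.einsum('ir,jr,kr->ijk', U_t, V_t, W_t)
mask = rng.random(Y.shape) < 0.1
obs = np.argwhere(mask)

# Latent factor matrices to learn, from small random initialization.
U, V, W = (0.1 * rng.normal(size=(d, R)) for d in (N, N, T))

lr, lam = 0.01, 0.02                    # learning rate, L2 regularization
for epoch in range(100):
    for i, j, k in obs:
        pred = U[i] @ (V[j] * W[k])     # CP-style reconstruction of one entry
        e = Y[i, j, k] - pred           # instance error on that entry
        U[i] += lr * (e * V[j] * W[k] - lam * U[i])
        V[j] += lr * (e * U[i] * W[k] - lam * V[j])
        W[k] += lr * (e * U[i] * V[j] - lam * W[k])

pred_all = np.einsum('ir,jr,kr->ijk', U, V, W)
rmse = np.sqrt(np.mean((Y[mask] - pred_all[mask]) ** 2))
baseline = np.sqrt(np.mean(Y[mask] ** 2))   # error of predicting all zeros
print(f"RMSE {rmse:.4f} vs zero-prediction baseline {baseline:.4f}")
```

Because updates touch only observed entries, the cost per epoch scales with the number of observations rather than the full tensor size, which is what makes this approach practical for HDI data.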
ISBN: 9789811989346
Standard No.: 10.1007/978-981-19-8934-6 (doi)
Subjects--Topical Terms: Computer networks -- Mathematical models.
LC Class. No.: TK5105.5
Dewey Class. No.: 004.6
MARC
LDR 03156nmm a2200337 a 4500
001 2316928
003 DE-He213
005 20230307064524.0
006 m     d
007 cr nn 008maaau
008 230902s2023    si      s          0 eng d
020    $a 9789811989346 $q (electronic bk.)
020    $a 9789811989339 $q (paper)
024 7  $a 10.1007/978-981-19-8934-6 $2 doi
035    $a 978-981-19-8934-6
040    $a GP $c GP
041 0  $a eng
050 4  $a TK5105.5
072 7  $a UN $2 bicssc
072 7  $a COM031000 $2 bisacsh
072 7  $a UN $2 thema
082 04 $a 004.6 $2 23
090    $a TK5105.5 $b .W959 2023
100 1  $a Wu, Hao. $3 1678939
245 10 $a Dynamic network representation based on latent factorization of tensors $h [electronic resource] / $c by Hao Wu, Xuke Wu, Xin Luo.
260    $a Singapore : $b Springer Nature Singapore : $b Imprint: Springer, $c 2023.
300    $a viii, 80 p. : $b ill. (chiefly col.), digital ; $c 24 cm.
490 1  $a SpringerBriefs in computer science, $x 2191-5776
505 0  $a Chapter 1 Introduction -- Chapter 2 Multiple Biases-Incorporated Latent Factorization of Tensors -- Chapter 3 PID-Incorporated Latent Factorization of Tensors -- Chapter 4 Diverse Biases Nonnegative Latent Factorization of Tensors -- Chapter 5 ADMM-Based Nonnegative Latent Factorization of Tensors -- Chapter 6 Perspectives and Conclusion.
520    $a A dynamic network is frequently encountered in various real industrial applications, such as the Internet of Things. It is composed of numerous nodes and large-scale dynamic real-time interactions among them, where each node indicates a specified entity, each directed link indicates a real-time interaction, and the strength of an interaction can be quantified as the weight of a link. As the involved nodes increase drastically, it becomes impossible to observe their full interactions at each time slot, making the resultant dynamic network High-Dimensional and Incomplete (HDI). Despite its HDI nature, an HDI dynamic network with directed and weighted links contains rich knowledge regarding the involved nodes' various behavior patterns. Therefore, it is essential to study how to build efficient and effective representation learning models for acquiring useful knowledge. In this book, we first model a dynamic network as an HDI tensor and present the basic latent factorization of tensors (LFT) model. Then, we propose four representative LFT-based network representation methods. The first method integrates the short-time bias, long-time bias, and preprocessing bias to precisely represent the volatility of network data. The second method utilizes a proportional-integral-derivative (PID) controller to construct an adjusted instance error and achieve a higher convergence rate. The third method considers the non-negativity of fluctuating network data by constraining the latent features to be non-negative and incorporating the extended linear bias. The fourth method adopts an alternating direction method of multipliers (ADMM) framework to build a learning model for representing dynamic networks with high precision and efficiency.
650  0 $a Computer networks $x Mathematical models. $3 664546
650  0 $a Calculus of tensors. $3 533863
650 14 $a Data Science. $3 3538937
650 24 $a Data Analysis and Big Data. $3 3538537
700 1  $a Wu, Xuke. $3 3630541
700 1  $a Luo, Xin. $3 3426712
710 2  $a SpringerLink (Online service) $3 836513
773 0  $t Springer Nature eBook
830  0 $a SpringerBriefs in computer science. $3 1567571
856 40 $u https://doi.org/10.1007/978-981-19-8934-6
950    $a Computer Science (SpringerNature-11645)
Holdings (1 item):
Barcode: W9453178
Location: Electronic resources
Circulation category: 11.線上閱覽_V (online reading)
Material type: E-book
Call number: EB TK5105.5
Use type: Normal
Loan status: On shelf
Reservations: 0