Attention-Based Graph Representation Learning.
Record Type:
Bibliographic - Electronic resource : Monograph/item
Title/Author:
Attention-Based Graph Representation Learning. / Pang, Yan.
Author:
Pang, Yan.
Physical Description:
1 online resource (117 pages)
Notes:
Source: Dissertations Abstracts International, Volume: 83-07, Section: B.
Contained By:
Dissertations Abstracts International, 83-07B.
Subject:
Electrical engineering.
Electronic Resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28866942
click for full text (PQDT)
ISBN:
9798762192880
Attention-Based Graph Representation Learning. / Pang, Yan. - 1 online resource (117 pages)
Source: Dissertations Abstracts International, Volume: 83-07, Section: B.
Thesis (Ph.D.)--University of Colorado at Denver, 2021.
Includes bibliographical references.
Attention-based graph neural networks, both static and dynamic, can effectively solve many real-world problems across diverse fields. Their success is tied to the evolution of message-passing mechanisms in recent years. The current mechanisms in attention-based static graph neural networks aggregate information based on attention coefficients computed from neighbors to the center node. However, all representations within a node are conveyed equally at the macro level (node level), which leaves an opportunity to explore new ways to optimize the aggregation method.

In this dissertation, we first propose a new static attention-based network called Graph Decipher, which makes the message flows of node features from the micro level (feature level) to the global level transparent and boosts performance on node classification tasks. To reduce the computational burden incurred at the micro level, Graph Decipher uses a category-oriented approach that conveys only the relevant node representations. Additionally, Graph Decipher can be applied to imbalanced node classification on multi-class static graph datasets by leveraging its ability to explore representative node attributes by category.

Although static graph neural networks have been widely used for modeling and representation learning on graph-structured data, many real-world problems involve time-varying factors, so graph neural networks must evolve from static to dynamic, where nodes and links may emerge and disappear over time. In recent years, transformer-based dynamic graph neural networks have received increasing attention from researchers. However, due to the fully connected attention of the standard transformer, the computational burden of current transformer-based dynamic networks is substantial. This dissertation proposes a novel dynamic graph neural network, Efficient-Dyn, whose lightweight Sparse Temporal Transformer module computes node representations from structural neighborhoods and temporal dynamics. Because the fully connected attention has been simplified, its computational cost is far lower than that of current state-of-the-art networks. In addition, Efficient-Dyn adaptively encodes temporal information into a sequence of temporal patches, each holding an equal amount of temporal-topological structure, which prevents information loss during data preprocessing while achieving a finer temporal granularity.

In real-world applications, node classification and link prediction are the most prevalent and significant graph tasks. We conducted node classification experiments on seven static graph datasets; they demonstrate that the static Graph Decipher achieves state-of-the-art performance while imposing a substantially lower computational cost. In addition, link prediction experiments measured the performance of the dynamic Efficient-Dyn on both continuous and discrete dynamic graph datasets. Compared to several state-of-the-art graph embedding baselines, Efficient-Dyn performs faster inference while still achieving competitive performance.
Electronic reproduction. Ann Arbor, Mich. : ProQuest, 2023.
Mode of access: World Wide Web.
ISBN: 9798762192880
Subjects--Topical Terms: Electrical engineering.
Subjects--Index Terms: Graph Neural Network
Index Terms--Genre/Form: Electronic books.
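The two mechanisms the abstract describes can be made concrete with short sketches. First, a minimal illustration of attention-based neighbor aggregation in the style the abstract attributes to current static networks: an attention coefficient is scored per (center, neighbor) pair, then neighbor features are combined by a weighted sum into the center node. This is a generic single-head sketch, not Graph Decipher itself; the shapes, the LeakyReLU scoring, and the function names are assumptions made for the example.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_aggregate(h, adj, W, a, slope=0.2):
    """Single-head, GAT-style attention aggregation (illustrative only).

    h   : (N, F)  input node features
    adj : (N, N)  binary adjacency with self-loops
    W   : (F, D)  shared linear projection
    a   : (2*D,)  vector scoring a concatenated (center, neighbor) pair
    """
    z = h @ W                                   # projected features, (N, D)
    out = np.zeros_like(z)
    for i in range(len(z)):                     # each center node i
        nbrs = np.flatnonzero(adj[i])           # neighbor indices of i
        scores = np.array([a @ np.concatenate([z[i], z[j]]) for j in nbrs])
        scores = np.where(scores > 0, scores, slope * scores)  # LeakyReLU
        coef = softmax(scores)                  # attention coefficients
        out[i] = coef @ z[nbrs]                 # weighted neighbor sum
    return out

# Toy usage: 3-node path graph with self-loops.
rng = np.random.default_rng(0)
adj = np.array([[1, 1, 0],
                [1, 1, 1],
                [0, 1, 1]])
h = rng.normal(size=(3, 4))
W = rng.normal(size=(4, 8))
a = rng.normal(size=(16,))
print(attention_aggregate(h, adj, W, a).shape)   # -> (3, 8)
```

Second, a sketch of the patching idea behind Efficient-Dyn's preprocessing as the abstract describes it: instead of cutting a dynamic graph into equal-length time windows, the interaction stream is cut into patches that each hold the same amount of activity. Interpreting "equal amount of temporal-topological structure" as an equal event count is an assumption here, and `equal_event_patches` is an invented name, not the dissertation's code.

```python
def equal_event_patches(events, num_patches):
    """Cut a timestamped interaction stream into patches that each hold
    roughly the same number of events, rather than equal-length time
    windows; bursty periods yield more, finer patches instead of one
    overloaded snapshot. Events are (src, dst, timestamp) triples."""
    events = sorted(events, key=lambda e: e[2])          # order by time
    n = len(events)
    bounds = [round(i * n / num_patches) for i in range(num_patches + 1)]
    return [events[bounds[i]:bounds[i + 1]] for i in range(num_patches)]

# A bursty toy stream: equal time windows would cram the first four
# events into one snapshot; equal-event patches keep granularity even.
stream = [(0, 1, 0.10), (1, 2, 0.15), (0, 2, 0.20), (2, 3, 0.90),
          (1, 3, 5.00), (3, 4, 5.10), (0, 4, 5.20), (2, 4, 9.80)]
for i, patch in enumerate(equal_event_patches(stream, 4)):
    print(f"patch {i}: {patch}")
```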
LDR    04397nmm a2200349K 4500
001    2359933
005    20230925131714.5
006    m o d
007    cr mn ---uuuuu
008    241011s2021 xx obm 000 0 eng d
020    $a 9798762192880
035    $a (MiAaPQ)AAI28866942
035    $a AAI28866942
040    $a MiAaPQ $b eng $c MiAaPQ $d NTU
100 1  $a Pang, Yan. $3 3166404
245 10 $a Attention-Based Graph Representation Learning.
264  0 $c 2021
300    $a 1 online resource (117 pages)
336    $a text $b txt $2 rdacontent
337    $a computer $b c $2 rdamedia
338    $a online resource $b cr $2 rdacarrier
500    $a Source: Dissertations Abstracts International, Volume: 83-07, Section: B.
500    $a Advisor: Liu, Chao.
502    $a Thesis (Ph.D.)--University of Colorado at Denver, 2021.
504    $a Includes bibliographical references
520    $a Attention-based graph neural networks, both static and dynamic, can effectively solve many real-world problems across diverse fields. Their success is tied to the evolution of message-passing mechanisms in recent years. The current mechanisms in attention-based static graph neural networks aggregate information based on attention coefficients computed from neighbors to the center node. However, all representations within a node are conveyed equally at the macro level (node level), which leaves an opportunity to explore new ways to optimize the aggregation method. In this dissertation, we first propose a new static attention-based network called Graph Decipher, which makes the message flows of node features from the micro level (feature level) to the global level transparent and boosts performance on node classification tasks. To reduce the computational burden incurred at the micro level, Graph Decipher uses a category-oriented approach that conveys only the relevant node representations. Additionally, Graph Decipher can be applied to imbalanced node classification on multi-class static graph datasets by leveraging its ability to explore representative node attributes by category. Although static graph neural networks have been widely used for modeling and representation learning on graph-structured data, many real-world problems involve time-varying factors, so graph neural networks must evolve from static to dynamic, where nodes and links may emerge and disappear over time. In recent years, transformer-based dynamic graph neural networks have received increasing attention from researchers. However, due to the fully connected attention of the standard transformer, the computational burden of current transformer-based dynamic networks is substantial. This dissertation proposes a novel dynamic graph neural network, Efficient-Dyn, whose lightweight Sparse Temporal Transformer module computes node representations from structural neighborhoods and temporal dynamics. Because the fully connected attention has been simplified, its computational cost is far lower than that of current state-of-the-art networks. In addition, Efficient-Dyn adaptively encodes temporal information into a sequence of temporal patches, each holding an equal amount of temporal-topological structure, which prevents information loss during data preprocessing while achieving a finer temporal granularity. In real-world applications, node classification and link prediction are the most prevalent and significant graph tasks. We conducted node classification experiments on seven static graph datasets; they demonstrate that the static Graph Decipher achieves state-of-the-art performance while imposing a substantially lower computational cost. In addition, link prediction experiments measured the performance of the dynamic Efficient-Dyn on both continuous and discrete dynamic graph datasets. Compared to several state-of-the-art graph embedding baselines, Efficient-Dyn performs faster inference while still achieving competitive performance.
533    $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2023
538    $a Mode of access: World Wide Web
650  4 $a Electrical engineering. $3 649834
650  4 $a Engineering. $3 586835
653    $a Graph Neural Network
653    $a Graph Decipher
655  7 $a Electronic books. $2 lcsh $3 542853
690    $a 0544
690    $a 0537
710 2  $a ProQuest Information and Learning Co. $3 783688
710 2  $a University of Colorado at Denver. $b Electrical Engineering. $3 2049763
773 0  $t Dissertations Abstracts International $g 83-07B.
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28866942 $z click for full text (PQDT)
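For anyone who wants to consume the record above programmatically, here is a rough sketch of a parser for this simplified one-field-per-line MARC display. It is an illustration under stated assumptions, not a real MARC library: it handles only this textual layout (tag, optional indicators, $-prefixed subfields), not binary ISO 2709 or MARCXML, and `parse_marc_display` is an invented name for the example.

```python
def parse_marc_display(text):
    """Parse the simplified one-field-per-line MARC display used above.

    Returns a list of (tag, indicators, [(subfield_code, value), ...]).
    LDR and control fields (tags below 010) carry their whole value as
    a single '_' pseudo-subfield.
    """
    fields = []
    for line in text.splitlines():
        if not line.strip():
            continue
        tag, _, rest = line.partition(" ")
        rest = rest.strip()
        if tag == "LDR" or (tag.isdigit() and int(tag) < 10):
            fields.append((tag, "", [("_", rest)]))      # control field
            continue
        ind, _, body = rest.partition("$")               # indicators, if any
        subfields = [(chunk[0], chunk[1:].strip())
                     for chunk in ("$" + body).split("$") if chunk]
        fields.append((tag, ind.strip(), subfields))
    return fields

sample = "856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28866942 $z click for full text (PQDT)"
print(parse_marc_display(sample))
# -> [('856', '40', [('u', 'http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=28866942'),
#                    ('z', 'click for full text (PQDT)')])]
```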
Holdings (1 item):
Barcode: W9482289
Location: Electronic Resources
Circulation Category: 11.線上閱覽_V (online reading)
Material Type: E-book
Call Number: EB
Use Type: Normal
Loan Status: On shelf
Holds: 0