Efficient Methods in Deep Learning Lifecycle: Representation, Prediction and Model Compression.
Record type:
Bibliographic - electronic resource : Monograph/item
Title / Author:
Efficient Methods in Deep Learning Lifecycle: Representation, Prediction and Model Compression.
Author:
Sha, Long.
Publisher:
Ann Arbor : ProQuest Dissertations & Theses, 2021.
Pagination:
134 p.
Notes:
Source: Dissertations Abstracts International, Volume: 82-12, Section: B.
Contained By:
Dissertations Abstracts International, 82-12B.
Subject:
Computer science.
Electronic resource:
https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=27667652
ISBN:
9798738640520
Thesis (Ph.D.)--Brandeis University, 2021.
This item must not be sold to any third party vendors.
The proliferation of digital technologies has led to an explosion in the number of large datasets available in the last few years, placing traditional machine learning approaches to data processing and modeling at a competitive disadvantage. Nevertheless, analyzing complex, high-dimensional, and noisy datasets can be a tremendous challenge. Deep learning, as part of a broader family of machine learning methods, has shown superior performance in dealing with such challenges in the past decade.

However, several challenges in the deep learning lifecycle hinder the performance and democratization of deep learning methods. This dissertation spotlights a key challenge: efficiency. Specifically, we focus on three topics: efficient representation learning, efficient temporal model learning, and efficient model compression. The three topics correspond to the sequential stages of data representation, modeling, and deployment in the deep learning lifecycle.

The first topic is efficient representation learning. Our research focuses on the field of knowledge graph representation learning. Though researchers have investigated several knowledge graph embedding methods, efficiently comparing them with existing solutions and exploring new ones remains challenging. We have therefore proposed a unified group-theoretic framework for general knowledge graph embedding problems and explored two novel efficient embedding methods that, though compact in size, demonstrate impressive results on benchmark datasets.

The second topic is efficient temporal model learning. As a significant part of artificial intelligence, temporal learning utilizes temporal data to predict future events or infer latent traits. We found that most deep learning methods focus on computer vision and natural language processing, even though efficient prediction models for temporal learning are in demand. This thesis proposes three efficient prediction models in temporal learning that deliver superior performance while providing interpretable insights into the model and the task. The first model pertains to efficient knowledge tracing, which analyzes students' learning activities to quantify how well they have mastered the knowledge components. The second model studies the epidemic data of the novel coronavirus SARS-CoV-2 to predict trends and examine the impact of environmental factors. The third model utilizes longitudinal electronic medical records to predict patient mortality risk, which can help identify high-risk patients.

The third topic is efficient model compression. We found that most state-of-the-art deep learning methods typically require substantial memory and storage overhead, which hinders their use on edge computing devices. Deep learning models with a large number of parameters and great computational complexity also consume significant energy, making their deployment on battery-limited devices difficult. To tackle this challenge, we have proposed a probabilistic inference method for pruning deep neural networks, which efficiently compresses the model and ensures minimal performance loss.

In this dissertation, we have discussed efficient methods in the deep learning lifecycle, especially in representation learning, prediction, and model compression. We hope that our contributions serve as a catalyst for deep learning democratization and inspire further exploration of the subject.
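The first topic alludes to a group-theoretic framework for knowledge graph embedding. As a minimal, generic illustration of that viewpoint (not the dissertation's actual framework or either of its two proposed methods), the sketch below scores a (head, relation, tail) triple in the style of rotation-based embeddings, where each relation acts on complex entity embeddings as an element of the rotation group, so composing relations amounts to adding phases. All names and values here are hypothetical.

```python
# Illustrative only: a rotation-style knowledge graph embedding scorer.
# Each relation is a per-dimension rotation (a group element), so relation
# composition corresponds to phase addition. Not the dissertation's method.
import numpy as np

def rotate_score(head, rel_phase, tail):
    """Score a (head, relation, tail) triple; closer to 0 = more plausible."""
    rotation = np.exp(1j * rel_phase)               # unit-modulus complex numbers
    return -np.linalg.norm(head * rotation - tail)  # distance after rotating head

# Toy usage with random 4-dimensional embeddings.
rng = np.random.default_rng(0)
head = rng.normal(size=4) + 1j * rng.normal(size=4)
phase = rng.uniform(0.0, 2.0 * np.pi, size=4)
tail = head * np.exp(1j * phase)          # a triple the relation explains perfectly
print(rotate_score(head, phase, tail))    # ~0.0, the best possible score
print(rotate_score(head, phase, -tail))   # a mismatched tail scores much lower
```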
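The knowledge-tracing model under the second topic analyzes sequences of student answers to estimate mastery of a skill. As a stand-in illustration of the task only, here is the classic Bayesian Knowledge Tracing update rather than the dissertation's deep model; the parameter values are assumed toy numbers.

```python
# Classic Bayesian Knowledge Tracing (BKT) update: a simple illustration of
# the knowledge-tracing task, not the dissertation's (deep) model.
def bkt_update(p_mastery, correct, p_learn=0.2, p_slip=0.1, p_guess=0.25):
    """Update P(mastery) after observing one answer (toy parameter values)."""
    if correct:   # Bayes rule given a correct answer (allowing lucky guesses)
        num = p_mastery * (1.0 - p_slip)
        den = num + (1.0 - p_mastery) * p_guess
    else:         # Bayes rule given a wrong answer (allowing slips)
        num = p_mastery * p_slip
        den = num + (1.0 - p_mastery) * (1.0 - p_guess)
    posterior = num / den
    return posterior + (1.0 - posterior) * p_learn  # chance of learning afterwards

p = 0.1  # prior probability the student has mastered the skill
for answer in [True, False, True, True]:
    p = bkt_update(p, answer)
    print(f"P(mastery) = {p:.3f}")
```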
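For the third topic, the dissertation proposes a probabilistic inference method for pruning deep neural networks. The sketch below substitutes plain magnitude pruning, purely to make the notion of pruning concrete: the smallest-magnitude weights are zeroed, shrinking the model's effective size. The sparsity level and array shape are arbitrary.

```python
# Magnitude pruning: zero out the smallest-magnitude weights. A simpler
# heuristic than the probabilistic inference method the abstract describes;
# shown only to illustrate what compressing a model by pruning means.
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    """Return a copy of `weights` with the smallest `sparsity` fraction zeroed."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256))            # a stand-in weight matrix
w_pruned = magnitude_prune(w, sparsity=0.9)
print(f"nonzero fraction: {np.count_nonzero(w_pruned) / w.size:.3f}")  # ~0.100
```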
LDR
:04715nmm a2200409 4500
001
2280370
005
20210827095934.5
008
220723s2021 ||||||||||||||||| ||eng d
020
$a
9798738640520
035
$a
(MiAaPQ)AAI27667652
035
$a
AAI27667652
040
$a
MiAaPQ
$c
MiAaPQ
100
1
$a
Sha, Long.
$0
(orcid)0000-0002-5732-9531
$3
3558885
245
1 0
$a
Efficient Methods in Deep Learning Lifecycle: Representation, Prediction and Model Compression.
260
1
$a
Ann Arbor :
$b
ProQuest Dissertations & Theses,
$c
2021
300
$a
134 p.
500
$a
Source: Dissertations Abstracts International, Volume: 82-12, Section: B.
500
$a
Advisor: Hong, Pengyu.
502
$a
Thesis (Ph.D.)--Brandeis University, 2021.
506
$a
This item must not be sold to any third party vendors.
590
$a
School code: 0021.
650
4
$a
Computer science.
$3
523869
650
4
$a
Artificial intelligence.
$3
516317
650
4
$a
COVID-19.
$3
3554449
653
$a
Deep learning
653
$a
Model compression
653
$a
Representation learning
653
$a
Temporal learning
653
$a
Democratization
653
$a
Efficiency
653
$a
Knowledge tracing
653
$a
Prediction models
653
$a
Patient mortality risk
690
$a
0984
690
$a
0800
710
2
$a
Brandeis University.
$b
Computer Science.
$3
1030661
773
0
$t
Dissertations Abstracts International
$g
82-12B.
790
$a
0021
791
$a
Ph.D.
792
$a
2021
793
$a
English
856
4 0
$u
https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=27667652
Holdings:
Barcode: W9432103
Location: Electronic resources
Circulation category: 11. Online reading_V
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Hold status: 0
Remarks: (none)
Attachments: (none)