AI in Education: Effective Machine Learning Methods to Improve Data Scarcity and Knowledge Generalization.
Record Type:
Electronic resources : Monograph/item
Title/Author:
AI in Education :
Remainder of title:
Effective Machine Learning Methods to Improve Data Scarcity and Knowledge Generalization.
Author:
Shen, Jia Tracy.
Description:
1 online resource (122 pages)
Notes:
Source: Dissertations Abstracts International, Volume: 85-05, Section: B.
Contained By:
Dissertations Abstracts International, 85-05B.
Subject:
Deep learning.
Online resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30720585
click for full text (PQDT)
ISBN:
9798380725910
Dissertation Note:
Thesis (Ph.D.)--The Pennsylvania State University, 2023.
Notes:
Includes bibliographical references.
Abstract:
In education, machine learning (ML), and especially deep learning (DL) in recent years, has been used extensively to improve both teaching and learning. Despite the rapid advancement of ML and its application in education, several challenges remain. This thesis focuses on two of them: (i) data scarcity and (ii) knowledge generalization. First, owing to student privacy concerns and differences in student behavior, missing data are common in the education domain, which complicates the application of ML methods. Second, because data distributions vary across education platforms and applications, ML methods often struggle to generalize to unseen data sets. This thesis therefore proposes effective data augmentation methods to address challenge (i) and investigates transfer learning techniques to address challenge (ii). For challenge (i), we provide solutions of increasing complexity that augment data by (a) optimizing statistical time-lag selection to reduce matrix sparsity, improving the original model's performance by 32% on classification tasks and 12% on regression tasks, and (b) developing deep generative models (i.e., LSTM-[L]VAEs) that generate the missing data, improving the original model's performance by 50%. For challenge (ii), we employ transfer learning to improve model generalization and to transfer knowledge from other domains to the education domain through three approaches: (1) a comparison approach, (2) a TAPT (Task-Adapted Pre-training) approach, and (3) a DAPT (Domain-Adapted Pre-training) approach. Approach (1) first demonstrates the effectiveness of transfer learning and then compares transferability across model architectures (LSTM vs. AdaRNN vs. Transformer) and transfer learning methods (feature-based vs. instance-based); it finds that the Transformer is 3-4 times more effective than the other architectures and that the feature-based method is up to 5 times more transferable than its instance-based counterpart. Approach (2) leverages the shared semantic and lexical representations of a pre-trained general language model to form a TAPT BERT model adapted to specific education tasks; it surpasses the original general language model by 2%. Finally, Approach (3) continues training the general language model on a large mathematical corpus to form a DAPT model, which improves on prior state-of-the-art models and on BASE BERT by up to 22% and 8%, respectively.
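The abstract's generative solution to challenge (i) pairs an LSTM with a variational autoencoder. The sketch below is a minimal PyTorch LSTM-VAE for imputing masked entries in a student activity sequence; the class and function names (LSTMVAE, vae_loss), the layer sizes, and the toy data are illustrative assumptions, not the architecture or code from the thesis.

import torch
import torch.nn as nn

class LSTMVAE(nn.Module):
    # LSTM encoder -> Gaussian latent -> LSTM decoder back to feature space.
    def __init__(self, n_features: int, hidden: int = 64, latent: int = 16):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.to_mu = nn.Linear(hidden, latent)
        self.to_logvar = nn.Linear(hidden, latent)
        self.expand = nn.Linear(latent, hidden)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_features)

    def forward(self, x):                      # x: (batch, time, n_features)
        _, (h, _) = self.encoder(x)            # h[-1] summarizes the sequence
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        dec_in = self.expand(z).unsqueeze(1).repeat(1, x.size(1), 1)
        dec, _ = self.decoder(dec_in)
        return self.out(dec), mu, logvar       # reconstruction fills masked steps

def vae_loss(recon, x, observed, mu, logvar):
    # Reconstruction error on observed entries only, plus a KL penalty.
    mse = ((recon - x) ** 2 * observed).sum() / observed.sum()
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return mse + kl

# Toy usage: 8 sequences, 20 steps, 5 features, roughly 30% of entries missing.
x = torch.randn(8, 20, 5)
observed = (torch.rand(8, 20, 5) > 0.3).float()
model = LSTMVAE(n_features=5)
recon, mu, logvar = model(x * observed)        # zero-fill missing entries on input
vae_loss(recon, x, observed, mu, logvar).backward()

For challenge (ii), DAPT-style adaptation amounts to continuing a general language model's masked-LM pretraining on an in-domain corpus before fine-tuning. A minimal sketch using the Hugging Face transformers and datasets libraries follows; "math_corpus.txt", the bert-base-uncased checkpoint, and all hyperparameters are placeholders rather than the thesis's actual corpus or settings.

from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical in-domain corpus: one passage of mathematical text per line.
corpus = load_dataset("text", data_files={"train": "math_corpus.txt"})
tokenized = corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

# Mask 15% of tokens at random; training reconstructs them (masked-LM objective).
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="dapt-bert", num_train_epochs=1,
                           per_device_train_batch_size=16),
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()  # continued pretraining; fine-tune on the education task afterwards

A TAPT variant of the same sketch would differ only in the corpus: unlabeled text drawn from the target task itself rather than from a broad domain corpus.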
MARC Record:
LDR  04010nmm a2200421K 4500
001  2364494
005  20231130105314.5
006  m o d
007  cr mn ---uuuuu
008  241011s2023 xx obm 000 0 eng d
020  $a 9798380725910
035  $a (MiAaPQ)AAI30720585
035  $a (MiAaPQ)PennState_28228jqs5443
035  $a AAI30720585
040  $a MiAaPQ $b eng $c MiAaPQ $d NTU
100 1  $a Shen, Jia Tracy. $3 3705300
245 10 $a AI in Education : $b Effective Machine Learning Methods to Improve Data Scarcity and Knowledge Generalization.
264 0  $c 2023
300  $a 1 online resource (122 pages)
336  $a text $b txt $2 rdacontent
337  $a computer $b c $2 rdamedia
338  $a online resource $b cr $2 rdacarrier
500  $a Source: Dissertations Abstracts International, Volume: 85-05, Section: B.
500  $a Advisor: Lee, Dongwon.
502  $a Thesis (Ph.D.)--The Pennsylvania State University, 2023.
504  $a Includes bibliographical references.
520  $a In education, machine learning (ML), and especially deep learning (DL) in recent years, has been used extensively to improve both teaching and learning. Despite the rapid advancement of ML and its application in education, several challenges remain. This thesis focuses on two of them: (i) data scarcity and (ii) knowledge generalization. First, owing to student privacy concerns and differences in student behavior, missing data are common in the education domain, which complicates the application of ML methods. Second, because data distributions vary across education platforms and applications, ML methods often struggle to generalize to unseen data sets. This thesis therefore proposes effective data augmentation methods to address challenge (i) and investigates transfer learning techniques to address challenge (ii). For challenge (i), we provide solutions of increasing complexity that augment data by (a) optimizing statistical time-lag selection to reduce matrix sparsity, improving the original model's performance by 32% on classification tasks and 12% on regression tasks, and (b) developing deep generative models (i.e., LSTM-[L]VAEs) that generate the missing data, improving the original model's performance by 50%. For challenge (ii), we employ transfer learning to improve model generalization and to transfer knowledge from other domains to the education domain through three approaches: (1) a comparison approach, (2) a TAPT (Task-Adapted Pre-training) approach, and (3) a DAPT (Domain-Adapted Pre-training) approach. Approach (1) first demonstrates the effectiveness of transfer learning and then compares transferability across model architectures (LSTM vs. AdaRNN vs. Transformer) and transfer learning methods (feature-based vs. instance-based); it finds that the Transformer is 3-4 times more effective than the other architectures and that the feature-based method is up to 5 times more transferable than its instance-based counterpart. Approach (2) leverages the shared semantic and lexical representations of a pre-trained general language model to form a TAPT BERT model adapted to specific education tasks; it surpasses the original general language model by 2%. Finally, Approach (3) continues training the general language model on a large mathematical corpus to form a DAPT model, which improves on prior state-of-the-art models and on BASE BERT by up to 22% and 8%, respectively.
533  $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2023
538  $a Mode of access: World Wide Web
650 4  $a Deep learning. $3 3554982
650 4  $a Time series. $3 3561811
650 4  $a Education. $3 516579
650 4  $a Educational technology. $3 517670
650 4  $a Information technology. $3 532993
653  $a Machine learning
653  $a Education domain
653  $a Deep learning
653  $a General language model
653  $a Data sets
655 7  $a Electronic books. $2 lcsh $3 542853
690  $a 0515
690  $a 0489
690  $a 0800
690  $a 0710
710 2  $a ProQuest Information and Learning Co. $3 783688
710 2  $a The Pennsylvania State University. $3 699896
773 0  $t Dissertations Abstracts International $g 85-05B.
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30720585 $z click for full text (PQDT)
Items (1 record):
Inventory Number: W9486850
Location Name: Electronic resources (電子資源)
Item Class: 11. Online Reading (11.線上閱覽_V)
Material Type: E-book (電子書)
Call Number: EB
Usage Class: Normal (一般使用)
Loan Status: On shelf
No. of Reservations: 0