An Exploration of Comparability Issues in Educational Research: Scale Linking, Equating, and Propensity Score Weighting.
Record type:
Bibliographic record - electronic resource : Monograph/item
Title / Author:
An Exploration of Comparability Issues in Educational Research
Other title:
Scale Linking, Equating, and Propensity Score Weighting.
Author:
Wu, Tong.
Physical description:
1 online resource (152 pages)
Notes:
Source: Dissertations Abstracts International, Volume: 84-10, Section: A.
Contained By:
Dissertations Abstracts International, 84-10A.
Subject:
Educational tests & measurements.
Electronic resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30423810 (click for full text, PQDT)
ISBN:
9798379437534
Wu, Tong.
An Exploration of Comparability Issues in Educational Research : Scale Linking, Equating, and Propensity Score Weighting. - 1 online resource (152 pages)
Source: Dissertations Abstracts International, Volume: 84-10, Section: A.
Thesis (Ph.D.)--The University of North Carolina at Charlotte, 2023.
Includes bibliographical references
This three-article dissertation addresses three methodological challenges to ensuring comparability in educational research: scale linking, test equating, and propensity score (PS) weighting. The first study aims to improve test scale comparability by evaluating the effect of six missing data handling approaches, including listwise deletion (LWD), treating missing data as incorrect responses (IN), corrected item mean imputation (CM), imputing with a response function (RF), multiple imputation (MI), and full information maximum likelihood (FIML), on item response theory (IRT) scale linking accuracy when missing data occur within common items. The relative performance of these six missing data treatment methods under two missing data mechanisms is explored with simulated data. Results show that RF, MI, and FIML produce fewer scale linking errors, whereas LWD is associated with the most errors regardless of testing conditions. The second study aims to ensure test score comparability by proposing a new equating method that accounts for rater errors in rater-mediated assessments. Specifically, the performance of an IRT observed-score equating method based on a hierarchical rater model (HRM) is investigated under various conditions. The newly proposed equating method yields bias, SE, and RMSE comparable to those of a traditional IRT observed-score equating method using the generalized partial credit model (GPCM) when normal raters score the new test forms. However, when aberrant raters are involved in the scoring process, the HRM IRT observed-score equating method generally produces more accurate results in terms of bias and RMSE, while yielding SEs comparable to the traditional method. The third study examines the performance of six covariate balance diagnostics when applying PS weighting to multilevel data. Specifically, a set of simulated conditions is used to examine the ability of within-cluster and pooled absolute standardized bias (ASB), variance ratio (VR), and percent bias reduction (PBR) methods to identify a correct PS model. In addition, the association between the balance statistics and the bias in the treatment effect is explored. Within-cluster ASB and PBR are associated with the most accurate choice of PS model compared to the other diagnostics. Pooled ASB has the highest association with treatment effect bias. By advancing the methodology for addressing comparability issues, this dissertation aims to enhance the validity and improve the quality of educational research.
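As context for the balance diagnostics named in the third study, the following minimal sketch (not drawn from the dissertation; the weighting scheme, function names, and toy data are illustrative assumptions) shows how absolute standardized bias (ASB), the variance ratio (VR), and percent bias reduction (PBR) are commonly computed for a single covariate under inverse-probability-of-treatment weighting:

```python
# Illustrative sketch of three covariate balance diagnostics (ASB, VR, PBR)
# for one covariate under inverse-probability-of-treatment (ATE) weights.
import numpy as np

def weighted_mean_var(x, w):
    """Weighted mean and (biased) weighted variance of x."""
    m = np.average(x, weights=w)
    v = np.average((x - m) ** 2, weights=w)
    return m, v

def balance_diagnostics(x, treat, ps):
    """ASB, VR, and PBR for covariate x, 0/1 treatment indicator, and
    estimated propensity scores ps (hypothetical helper, not from the source)."""
    # Inverse-probability-of-treatment weights for the ATE.
    w = np.where(treat == 1, 1.0 / ps, 1.0 / (1.0 - ps))
    t, c = treat == 1, treat == 0

    # Pre-weighting standardized bias, using the pooled unweighted SD.
    m1, v1 = x[t].mean(), x[t].var(ddof=1)
    m0, v0 = x[c].mean(), x[c].var(ddof=1)
    pooled_sd = np.sqrt((v1 + v0) / 2.0)
    sb_before = (m1 - m0) / pooled_sd

    # Post-weighting standardized bias on the same scale.
    wm1, wv1 = weighted_mean_var(x[t], w[t])
    wm0, wv0 = weighted_mean_var(x[c], w[c])
    sb_after = (wm1 - wm0) / pooled_sd

    asb = abs(sb_after)                                   # absolute standardized bias
    vr = wv1 / wv0                                        # variance ratio (treated / control)
    pbr = 100.0 * (abs(sb_before) - abs(sb_after)) / abs(sb_before)  # percent bias reduction
    return asb, vr, pbr

# Toy usage with simulated data: treatment assignment depends on x.
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
true_ps = 1.0 / (1.0 + np.exp(-x))
treat = rng.binomial(1, true_ps)
print(balance_diagnostics(x, treat, true_ps))
```

In practice these diagnostics would be computed for every covariate and, with multilevel data, either within each cluster or pooled across clusters, which is exactly the within-cluster versus pooled distinction the third study examines.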
Electronic reproduction. Ann Arbor, Mich. : ProQuest, 2023
Mode of access: World Wide Web
ISBN: 9798379437534
Subjects--Topical Terms: Educational tests & measurements.
Subjects--Index Terms: Covariate balance diagnostics
Index Terms--Genre/Form: Electronic books.
LDR 04088nmm a2200397K 4500
001 2360882
005 20231015185442.5
006 m o d
007 cr mn ---uuuuu
008 241011s2023 xx obm 000 0 eng d
020 $a 9798379437534
035 $a (MiAaPQ)AAI30423810
035 $a AAI30423810
040 $a MiAaPQ $b eng $c MiAaPQ $d NTU
100 1 $a Wu, Tong. $3 3285343
245 1 3 $a An Exploration of Comparability Issues in Educational Research : $b Scale Linking, Equating, and Propensity Score Weighting.
264 0 $c 2023
300 $a 1 online resource (152 pages)
336 $a text $b txt $2 rdacontent
337 $a computer $b c $2 rdamedia
338 $a online resource $b cr $2 rdacarrier
500 $a Source: Dissertations Abstracts International, Volume: 84-10, Section: A.
500 $a Advisor: Kim, Stella; Westine, Carl.
502 $a Thesis (Ph.D.)--The University of North Carolina at Charlotte, 2023.
504 $a Includes bibliographical references
520 $a This three-article dissertation addresses three methodological challenges to ensuring comparability in educational research: scale linking, test equating, and propensity score (PS) weighting. The first study aims to improve test scale comparability by evaluating the effect of six missing data handling approaches, including listwise deletion (LWD), treating missing data as incorrect responses (IN), corrected item mean imputation (CM), imputing with a response function (RF), multiple imputation (MI), and full information maximum likelihood (FIML), on item response theory (IRT) scale linking accuracy when missing data occur within common items. The relative performance of these six missing data treatment methods under two missing data mechanisms is explored with simulated data. Results show that RF, MI, and FIML produce fewer scale linking errors, whereas LWD is associated with the most errors regardless of testing conditions. The second study aims to ensure test score comparability by proposing a new equating method that accounts for rater errors in rater-mediated assessments. Specifically, the performance of an IRT observed-score equating method based on a hierarchical rater model (HRM) is investigated under various conditions. The newly proposed equating method yields bias, SE, and RMSE comparable to those of a traditional IRT observed-score equating method using the generalized partial credit model (GPCM) when normal raters score the new test forms. However, when aberrant raters are involved in the scoring process, the HRM IRT observed-score equating method generally produces more accurate results in terms of bias and RMSE, while yielding SEs comparable to the traditional method. The third study examines the performance of six covariate balance diagnostics when applying PS weighting to multilevel data. Specifically, a set of simulated conditions is used to examine the ability of within-cluster and pooled absolute standardized bias (ASB), variance ratio (VR), and percent bias reduction (PBR) methods to identify a correct PS model. In addition, the association between the balance statistics and the bias in the treatment effect is explored. Within-cluster ASB and PBR are associated with the most accurate choice of PS model compared to the other diagnostics. Pooled ASB has the highest association with treatment effect bias. By advancing the methodology for addressing comparability issues, this dissertation aims to enhance the validity and improve the quality of educational research.
533 $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2023
538 $a Mode of access: World Wide Web
650 4 $a Educational tests & measurements. $3 3168483
650 4 $a Educational evaluation. $3 526425
653 $a Covariate balance diagnostics
653 $a Equating
653 $a Missing data
653 $a Propensity score weighting
653 $a Rater-mediated assessments
653 $a Scale linking
655 7 $a Electronic books. $2 lcsh $3 542853
690 $a 0288
690 $a 0443
710 2 $a ProQuest Information and Learning Co. $3 783688
710 2 $a The University of North Carolina at Charlotte. $b Educational Leadership. $3 1277940
773 0 $t Dissertations Abstracts International $g 84-10A.
856 4 0 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30423810 $z click for full text (PQDT)
Holdings:
Barcode: W9483238
Location: Electronic resources
Circulation category: 11.線上閱覽_V (online reading)
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Hold status: 0