Using Item Response Models and Analysis to Address Practical Measurement Questions.
Record type:
Bibliographic - Electronic resource : Monograph/item
Title / Author:
Using Item Response Models and Analysis to Address Practical Measurement Questions. / Lyu, Weicong.
Author:
Lyu, Weicong.
Description:
1 online resource (133 pages)
Note:
Source: Dissertations Abstracts International, Volume: 85-02, Section: B.
Contained by:
Dissertations Abstracts International, 85-02B.
Subject:
Educational psychology.
Electronic resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30638139 (click for full text, PQDT)
ISBN:
9798380156547
Thesis (Ph.D.)--The University of Wisconsin - Madison, 2023.
Includes bibliographical references
Item response theory (IRT) is currently the dominant methodological paradigm in educational and psychological measurement. IRT models are based on assumptions about the relationship between latent traits and observed responses, so the accuracy of the methodology depends heavily on the reasonableness of these assumptions. This dissertation consists of three studies, all of which focus on different scenarios where existing IRT models do not agree closely with reality and thus may provide misleading or insufficient characterizations of measurement phenomena.

In the first study, I discuss anchoring, the tendency for respondents to select categories near the rating category used for the immediately preceding item in self-report rating scale assessments. I propose a psychometric model, based on a multidimensional nominal model for response styles, that simultaneously accommodates a respondent-level anchoring tendency. This model is applied to a real dataset measuring extraversion, and empirical results support attending to both anchoring and midpoint response styles as ways of assessing respondent engagement.

In the second study, I examine the simultaneous relevance of content trait level and response styles as predictors of response time on noncognitive assessments, and the potential for omitted-variable bias when either factor is ignored. Using response time data from several noncognitive assessments, I demonstrate how a multilevel model leads to consistent findings that support the simultaneous relevance of both factors. The average effects of response styles consistently emerge as stronger than those of content traits, although they also show greater respondent-level variability.

In the third study, I consider test items whose scores reflect sequential or IRTree modeling outcomes. For such items, I argue that item-specific factors, although not empirically measurable, are often present across stages of the same item. A conceptual model that incorporates such factors is proposed and used to demonstrate how they create ambiguity in the interpretation of item and person parameters beyond the first stage. Various empirical applications show patterns of violations of item parameter invariance across stages that are highly suggestive of item-specific factors.

These studies reflect recent advances in IRT modeling applied to practical issues, which will hopefully benefit both methodologists and practitioners.
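The IRT models the abstract describes all build on an item response function that links a respondent's latent trait level to the probability of an observed response. As a minimal illustrative sketch, here is the generic two-parameter logistic (2PL) model, P(θ) = 1 / (1 + exp(−a(θ − b))); this is a textbook example, not the dissertation's own model specifications:

```python
import math

def irf_2pl(theta, a, b):
    """Two-parameter logistic (2PL) item response function.

    theta: respondent's latent trait level
    a: item discrimination (slope)
    b: item difficulty (location)
    Returns the probability of a keyed (e.g., correct) response.
    """
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A respondent whose trait level equals the item difficulty
# responds in the keyed direction with probability 0.5.
print(irf_2pl(0.0, a=1.5, b=0.0))  # 0.5

# The function is monotonically increasing in theta, so higher
# trait levels yield higher keyed-response probabilities.
print(irf_2pl(2.0, a=1.5, b=0.0) > irf_2pl(0.0, a=1.5, b=0.0))  # True
```

Model violations of the kind studied here (anchoring, response styles, item-specific factors) arise precisely when such a simple trait-to-response link fails to describe the data.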
Electronic reproduction. Ann Arbor, Mich. : ProQuest, 2023
Mode of access: World Wide Web
ISBN: 9798380156547
Subjects--Topical Terms: Educational psychology.
Subjects--Index Terms: Item response theory
Index Terms--Genre/Form: Electronic books.
LDR 03797nmm a2200373K 4500
001 2361427
005 20231019120621.5
006 m o d
007 cr mn ---uuuuu
008 241011s2023 xx obm 000 0 eng d
020    $a 9798380156547
035    $a (MiAaPQ)AAI30638139
035    $a AAI30638139
040    $a MiAaPQ $b eng $c MiAaPQ $d NTU
100 1  $a Lyu, Weicong. $3 3702101
245 10 $a Using Item Response Models and Analysis to Address Practical Measurement Questions.
264  0 $c 2023
300    $a 1 online resource (133 pages)
336    $a text $b txt $2 rdacontent
337    $a computer $b c $2 rdamedia
338    $a online resource $b cr $2 rdacarrier
500    $a Source: Dissertations Abstracts International, Volume: 85-02, Section: B.
500    $a Advisor: Bolt, Daniel M.
502    $a Thesis (Ph.D.)--The University of Wisconsin - Madison, 2023.
504    $a Includes bibliographical references
520    $a Item response theory (IRT) is currently the dominant methodological paradigm in educational and psychological measurement. IRT models are based on assumptions about the relationship between latent traits and observed responses, so the accuracy of the methodology depends heavily on the reasonableness of these assumptions. This dissertation consists of three studies, all of which focus on different scenarios where existing IRT models do not agree closely with reality and thus may provide misleading or insufficient characterizations of measurement phenomena. In the first study, I discuss anchoring, the tendency for respondents to select categories near the rating category used for the immediately preceding item in self-report rating scale assessments. I propose a psychometric model based on a multidimensional nominal model for response styles that simultaneously accommodates a respondent-level anchoring tendency. This model is applied to a real dataset measuring extraversion, and empirical results support attending to both anchoring and midpoint response styles as ways of assessing respondent engagement. In the second study, I examine the simultaneous relevance of content trait level and response styles as predictors of response time on noncognitive assessments, and the potential for omitted-variable bias when either factor is ignored. Using response time data from several noncognitive assessments, I demonstrate how a multilevel model leads to consistent findings that support the simultaneous relevance of both factors. The average effects of response styles consistently emerge as stronger than those of content traits, although they also show greater respondent-level variability. In the third study, test items whose scores reflect sequential or IRTree modeling outcomes are considered. For such items, I argue that item-specific factors, although not empirically measurable, are often present across stages of the same item. A conceptual model that incorporates such factors is proposed and used to demonstrate how they create ambiguity in the interpretation of item and person parameters beyond the first stage. Various empirical applications show patterns of violations of item parameter invariance across stages that are highly suggestive of item-specific factors. These studies reflect recent advances in IRT modeling applied to practical issues, which will hopefully benefit both methodologists and practitioners.
533    $a Electronic reproduction. $b Ann Arbor, Mich. : $c ProQuest, $d 2023
538    $a Mode of access: World Wide Web
650  4 $a Educational psychology. $3 517650
650  4 $a Educational tests & measurements. $3 3168483
650  4 $a Quantitative psychology. $3 2144748
653    $a Item response theory
653    $a Measurement phenomena
653    $a Methodological paradigm
655  7 $a Electronic books. $2 lcsh $3 542853
690    $a 0525
690    $a 0288
690    $a 0632
710 2  $a ProQuest Information and Learning Co. $3 783688
710 2  $a The University of Wisconsin - Madison. $b Educational Psychology. $3 3170079
773 0  $t Dissertations Abstracts International $g 85-02B.
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=30638139 $z click for full text (PQDT)
Holdings: 1 item
Barcode: W9483783
Location: Electronic resources
Circulation category: 11. Online reading_V
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Holds: 0