A comparability study on differences between scores of handwritten and typed responses on a large-scale writing assessment.
Record type: Bibliographic - Electronic resource : Monograph/item
Title/Author: A comparability study on differences between scores of handwritten and typed responses on a large-scale writing assessment.
Author: Rankin, Angelica Desiree.
Pagination: 245 p.
Notes: Source: Dissertation Abstracts International, Volume: 77-02(E), Section: B.
Contained by: Dissertation Abstracts International, 77-02B(E).
Subject: Quantitative psychology.
Electronic resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3726893
ISBN: 9781339117676
Rankin, Angelica Desiree. A comparability study on differences between scores of handwritten and typed responses on a large-scale writing assessment. - 245 p.
Source: Dissertation Abstracts International, Volume: 77-02(E), Section: B.
Thesis (Ph.D.)--The University of Iowa, 2015.
As the use of technology for personal, professional, and learning purposes increases, more and more assessments are transitioning from a traditional paper-based testing format to a computer-based one. During this transition, some assessments are being offered in both paper and computer formats in order to accommodate examinees and testing center capabilities. Scores on the paper-based test are often intended to be directly comparable to the computer-based scores, but such claims of comparability are often unsupported by research specific to that assessment. Not only should the scores be examined for differences, but the thought processes used by raters while scoring those assessments should also be studied to better understand why raters might score response modes differently. Previous comparability literature can be informative, but more contemporary, test-specific research is needed in order to completely support the direct comparability of scores.
ISBN: 9781339117676
Subjects--Topical Terms: Quantitative psychology.
LDR  04187nmm a2200301 4500
001  2074160
005  20160927125402.5
008  170521s2015 ||||||||||||||||| ||eng d
020    $a 9781339117676
035    $a (MiAaPQ)AAI3726893
035    $a AAI3726893
040    $a MiAaPQ $c MiAaPQ
100 1  $a Rankin, Angelica Desiree. $3 3189461
245 12 $a A comparability study on differences between scores of handwritten and typed responses on a large-scale writing assessment.
300    $a 245 p.
500    $a Source: Dissertation Abstracts International, Volume: 77-02(E), Section: B.
500    $a Advisers: Stephen B. Dunbar; Catherine J. Welch.
502    $a Thesis (Ph.D.)--The University of Iowa, 2015.
520    $a As the use of technology for personal, professional, and learning purposes increases, more and more assessments are transitioning from a traditional paper-based testing format to a computer-based one. During this transition, some assessments are being offered in both paper and computer formats in order to accommodate examinees and testing center capabilities. Scores on the paper-based test are often intended to be directly comparable to the computer-based scores, but such claims of comparability are often unsupported by research specific to that assessment. Not only should the scores be examined for differences, but the thought processes used by raters while scoring those assessments should also be studied to better understand why raters might score response modes differently. Previous comparability literature can be informative, but more contemporary, test-specific research is needed in order to completely support the direct comparability of scores.
520    $a The goal of this thesis was to form a more complete understanding of why analytic scores on a writing assessment might differ, if at all, between handwritten and typed responses. A representative sample of responses to the writing composition portion of a large-scale high school equivalency assessment were used. Six trained raters analytically scored approximately six-hundred examinee responses each. Half of those responses were typed, and the other half were the transcribed handwritten duplicates. Multiple methods were used to examine why differences between response modes might exist. A MANOVA framework was applied to examine score differences between response modes, and the systematic analyses of think-alouds and interviews were used to explore differences in rater cognition. The results of these analyses indicated that response mode was of no practical significance, meaning that domain scores were not notably dependent on whether or not a response was presented as typed or handwritten. Raters, on the other hand, had a more substantial effect on scores. Comments from the think-alouds and interviews suggest that, while the scores were not affected by response mode, raters tended to consider certain aspects of typed responses differently than handwritten responses. For example, raters treated typographical errors differently from other conventional errors when scoring typed responses, but not while scoring the handwritten duplicates. Raters also indicated that they preferred scoring typed responses over handwritten ones, but felt they could overcome their personal preferences to score both response modes similarly.
520    $a Empirical investigations on the comparability of scores, combined with the analysis of raters' thought processes, helped to provide a more evidence-based answer to the question of why scores might differ between response modes. Such information could be useful for test developers when making decisions regarding what mode options to offer and how to best train raters to score such assessments. The design of this study itself could be useful for testing organizations and future research endeavors, as it could be used as a guide for exploring score differences and the human-based reasons behind them.
590    $a School code: 0096.
650  4 $a Quantitative psychology. $3 2144748
650  4 $a Educational tests & measurements. $3 3168483
690    $a 0632
690    $a 0288
710 2  $a The University of Iowa. $b Psychological and Quantitative Foundations. $3 2098464
773 0  $t Dissertation Abstracts International $g 77-02B(E).
790    $a 0096
791    $a Ph.D.
792    $a 2015
793    $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3726893
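A record like the one above follows the MARC 21 convention of a three-digit tag, optional indicators, and `$`-prefixed subfields. As a minimal sketch (an illustration only, not this catalog's software; production code would use a real MARC library such as pymarc against the binary or XML record rather than a text display), a few fields can be pulled out of such display lines like this:

```python
import re

# A hypothetical excerpt of display lines in the "TAG IND $a value" shape above.
RECORD = """\
020    $a 9781339117676
100 1  $a Rankin, Angelica Desiree. $3 3189461
245 12 $a A comparability study on differences between scores of handwritten and typed responses on a large-scale writing assessment.
773 0  $t Dissertation Abstracts International $g 77-02B(E).
"""

def parse_display(text):
    """Parse 'TAG IND $a value $b value' display lines into (tag, {code: value}) pairs."""
    fields = []
    for line in text.splitlines():
        # Tag, then optional digit/space indicators, then the subfield portion.
        m = re.match(r"(\d{3})\s+[\d ]*?(\$.*)", line)
        if not m:
            continue
        tag, rest = m.group(1), m.group(2)
        # Each subfield is a one-character code followed by its value.
        subs = {code: value.strip()
                for code, value in re.findall(r"\$(\w)\s*([^$]*)", rest)}
        fields.append((tag, subs))
    return fields

fields = parse_display(RECORD)
isbn = next(s["a"] for t, s in fields if t == "020")    # 020 $a holds the ISBN
author = next(s["a"] for t, s in fields if t == "100")  # 100 $a holds the main entry
print(isbn)    # 9781339117676
print(author)  # Rankin, Angelica Desiree.
```

Control fields (LDR, 001-008) carry no subfields and would need separate positional handling, which is why real tooling is preferable to display scraping.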
Holdings (1 item):
Barcode: W9307028 | Location: Electronic resources | Circulation category: 11. Online reading_V | Material type: E-book | Call number: EB | Use type: Normal | Loan status: On shelf | Holds: 0