Data Sharing in Peer-Assessment Systems for Education.
Record type:
Bibliographic record - Electronic resource : Monograph/item
Title/Author:
Data Sharing in Peer-Assessment Systems for Education.
Author:
Song, Yang.
Publisher:
Ann Arbor : ProQuest Dissertations & Theses, 2017.
Physical description:
101 p.
Notes:
Source: Dissertation Abstracts International, Volume: 79-07(E), Section: B.
Contained By:
Dissertation Abstracts International, 79-07B(E).
Subject:
Computer science.
Electronic resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10758969
ISBN:
9780355635454
Data Sharing in Peer-Assessment Systems for Education.
LDR
:03995nmm a2200337 4500
001
2163698
005
20181022132747.5
008
190424s2017 ||||||||||||||||| ||eng d
020
$a
9780355635454
035
$a
(MiAaPQ)AAI10758969
035
$a
AAI10758969
040
$a
MiAaPQ
$c
MiAaPQ
100
1
$a
Song, Yang.
$3
1265217
245
1 0
$a
Data Sharing in Peer-Assessment Systems for Education.
260
1
$a
Ann Arbor :
$b
ProQuest Dissertations & Theses,
$c
2017
300
$a
101 p.
500
$a
Source: Dissertation Abstracts International, Volume: 79-07(E), Section: B.
500
$a
Adviser: Edward F. Gehringer.
502
$a
Thesis (Ph.D.)--North Carolina State University, 2017.
520
$a
Fifty years of research has found great potential for peer assessment as a pedagogical approach. With peer assessment, not only do students receive more copious assessments; they also learn to become assessors. In recent decades, more educational peer assessments have been facilitated by online systems. These online systems are designed to suit different class settings and student groups, so their designs differ from one another: rating-based or ranking-based, reviews assigned randomly or to fixed groups, anonymous or onymous review, etc. Although each of these systems has a large number of users, there is a dearth of comparisons between the different designs, mainly because the data generated by peer-assessment systems is stored and analyzed separately; there is no standard for data sharing in this research community.
520
$a
In this work, we focus on data sharing between educational peer-assessment systems. We designed a Peer-Review Markup Language (PRML) as a generic data schema for modeling and sharing the data generated by different educational peer-assessment systems. Based on PRML, a common data warehouse can be built; different systems can ETL (Extract, Transform and Load) their data, contribute it to the common data warehouse, and share it with other researchers.
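As a rough illustration of the ETL step described above, the Python sketch below maps one system-specific export row onto a simplified shared schema. The field names (assignment, reviewer, submission, grade, rank) are hypothetical; the actual PRML schema is not reproduced in this record.

```python
from dataclasses import dataclass, asdict
from typing import Optional

# Hypothetical generic peer-assessment record; the real PRML schema is not
# given in this abstract, so these field names are illustrative only.
@dataclass
class PeerAssessment:
    system: str             # originating peer-assessment system
    assignment_id: str
    assessor_id: str        # anonymized reviewer
    artifact_id: str        # reviewed submission
    score: Optional[float]  # rating, if the system is rating-based
    rank: Optional[int]     # rank, if the system is ranking-based

def extract_transform(raw_row: dict, system: str) -> PeerAssessment:
    """Map one system-specific export row onto the shared schema."""
    return PeerAssessment(
        system=system,
        assignment_id=str(raw_row["assignment"]),
        assessor_id=str(raw_row["reviewer"]),
        artifact_id=str(raw_row["submission"]),
        score=raw_row.get("grade"),
        rank=raw_row.get("rank"),
    )

# "Load" step: in practice this record would be written to the shared warehouse.
row = {"assignment": "hw1", "reviewer": "r17", "submission": "s42", "grade": 92}
print(asdict(extract_transform(row, system="SystemA")))
```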
520
$a
Making use of data shared by different peer-assessment systems can help researchers answer more general research questions, e.g., are reviewers more reliable in ranking-based or rating-based peer assessment? To answer this question, we designed algorithms to evaluate assessors' reliability based on their ratings/rankings against the global ranks of the artifacts they have reviewed. These algorithms are suitable for data from both rating-based and ranking-based peer-assessment systems. The experiments were based on more than 15,000 peer assessments from multiple peer-assessment systems. We found that assessors in ranking-based peer assessments are more reliable than assessors in rating-based peer assessments. Further analysis also demonstrated that assessors in ranking-based assessments tend to assess the more differentiable artifacts correctly, but there is no such pattern for rating-based assessors.
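The abstract does not state the reliability formula itself; one common way to operationalize agreement between an assessor's ordering and the global ranking is a Kendall-tau-style pairwise score, sketched below as an illustrative stand-in rather than the dissertation's actual algorithm.

```python
from itertools import combinations

def rank_agreement(assessor_order, global_rank):
    """Fraction of artifact pairs the assessor orders consistently with the
    global ranking (a Kendall-tau-style agreement score in [0, 1]).

    assessor_order: list of artifact ids, best first, as ranked by one assessor.
    global_rank:    dict mapping artifact id -> global rank (1 = best).
    """
    pairs = list(combinations(assessor_order, 2))
    if not pairs:
        return 1.0
    concordant = sum(1 for a, b in pairs if global_rank[a] < global_rank[b])
    return concordant / len(pairs)

# Example: the assessor ranked s2 above s1, but globally s1 is better.
print(rank_agreement(["s2", "s1", "s3"], {"s1": 1, "s2": 2, "s3": 3}))  # ~0.67
```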
520
$a
Another research question that can be answered with this shared data is: how do collusions harm the peer-review process? Ideally, if only a small number of students try to "game" the peer-assessment process, the overall validity will not be affected much. However, one researcher found from his experience that more students became colluders over the course of a semester -- they gave each other high scores or, even worse, gave high scores to every artifact they reviewed. In the worst case, a large number of colluders may make the honest reviewers outliers, which harms the validity of peer assessment. We defined two different patterns of possible collusion and applied graph-mining algorithms to detect the colluders in the data shared with us.
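The two collusion patterns are not spelled out in the abstract; the sketch below detects one plausible pattern, reciprocal high scoring, by treating reviews as directed edges and keeping only mutual high-score edges. The threshold and the review-tuple layout are assumptions for illustration.

```python
def mutual_high_scorers(reviews, threshold=90):
    """Detect one plausible collusion pattern: pairs of students who give
    each other scores at or above `threshold`.

    reviews: iterable of (assessor_id, author_id, score) tuples.
    Returns a set of frozensets, each a suspected colluding pair.
    """
    high_edges = {(a, b) for a, b, s in reviews if s >= threshold and a != b}
    return {frozenset((a, b)) for (a, b) in high_edges if (b, a) in high_edges}

reviews = [
    ("alice", "bob", 98), ("bob", "alice", 97),    # reciprocal high scores
    ("carol", "dave", 95), ("dave", "carol", 60),  # not reciprocal
]
print(mutual_high_scorers(reviews))  # {frozenset({'alice', 'bob'})}
```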
590
$a
School code: 0155.
650
4
$a
Computer science.
$3
523869
650
4
$a
Educational technology.
$3
517670
650
4
$a
Curriculum development.
$3
684418
690
$a
0984
690
$a
0710
690
$a
0727
710
2
$a
North Carolina State University.
$b
Computer Science.
$3
2099755
773
0
$t
Dissertation Abstracts International
$g
79-07B(E).
790
$a
0155
791
$a
Ph.D.
792
$a
2017
793
$a
English
856
4 0
$u
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=10758969
Holdings
Barcode: W9363245
Location: Electronic resources
Circulation category: 11.線上閱覽_V (online viewing)
Material type: E-book
Call number: EB
Use type: Normal
Loan status: On shelf
Hold status:
Notes:
Attachments: 0