Evaluation in the crowd = crowdsourcing and human-centered experiments : Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22 - 27, 2015 : revised contributions /
Record type:
Bibliographic - electronic resource : Monograph/item
Title / Author:
Evaluation in the crowd / edited by Daniel Archambault, Helen Purchase, Tobias Hossfeld.
Other title:
crowdsourcing and human-centered experiments : Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22 - 27, 2015 : revised contributions /
Other title:
Seminar 15481
Added author:
Archambault, Daniel.
Corporate author:
Dagstuhl Seminar on Evaluation in the Crowd: Crowdsourcing and Human-Centered Experiments
Publisher:
Cham : Springer International Publishing, 2017.
Physical description:
vii, 191 p. : ill., digital ; 24 cm.
Contents note:
Crowdsourcing Versus the Laboratory: Towards Human-centered Experiments Using the Crowd -- Understanding The Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing -- Crowdsourcing Technology to Support Academic Research -- Crowdsourcing for Information Visualization: Promises and Pitfalls -- Cognitive Information Theories of Psychology and Applications with Visualization and HCI through Crowdsourcing Platforms -- Crowdsourcing Quality of Experience Experiments.
Contained By:
Springer eBooks
Subject:
Human computation -- Congresses.
Electronic resource:
http://dx.doi.org/10.1007/978-3-319-66435-4
ISBN:
9783319664354
Evaluation in the crowd [electronic resource] : crowdsourcing and human-centered experiments : Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22 - 27, 2015 : revised contributions / edited by Daniel Archambault, Helen Purchase, Tobias Hossfeld. - Cham : Springer International Publishing, 2017. - vii, 191 p. : ill., digital ; 24 cm. - (Lecture notes in computer science, ISSN 0302-9743 ; 10264)
Other title: Seminar 15481.
Crowdsourcing Versus the Laboratory: Towards Human-centered Experiments Using the Crowd -- Understanding The Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing -- Crowdsourcing Technology to Support Academic Research -- Crowdsourcing for Information Visualization: Promises and Pitfalls -- Cognitive Information Theories of Psychology and Applications with Visualization and HCI through Crowdsourcing Platforms -- Crowdsourcing Quality of Experience Experiments.
As the outcome of the Dagstuhl Seminar 15481 on Crowdsourcing and Human-Centered Experiments, this book is a primer for computer science researchers who intend to use crowdsourcing technology for human centered experiments. The focus of this Dagstuhl seminar, held in Dagstuhl Castle in November 2015, was to discuss experiences and methodological considerations when using crowdsourcing platforms to run human-centered experiments to test the effectiveness of visual representations. The inspiring Dagstuhl atmosphere fostered discussions and brought together researchers from different research directions. The papers provide information on crowdsourcing technology and experimental methodologies, comparisons between crowdsourcing and lab experiments, the use of crowdsourcing for visualisation, psychology, QoE and HCI empirical studies, and finally the nature of crowdworkers and their work, their motivation and demographic background, as well as the relationships among people forming the crowdsourcing community.
ISBN: 9783319664354
Standard No.: 10.1007/978-3-319-66435-4 (doi)
Subjects--Topical Terms: Human computation -- Congresses.
LC Class. No.: QA76.9.H84
Dewey Class. No.: 004.36019
LDR 02700nmm a2200337 a 4500
001 2108891
003 DE-He213
005 20170927200451.0
006 m d
007 cr nn 008maaau
008 180519s2017 gw s 0 eng d
020 $a 9783319664354 $q (electronic bk.)
020 $a 9783319664347 $q (paper)
024 7 $a 10.1007/978-3-319-66435-4 $2 doi
035 $a 978-3-319-66435-4
040 $a GP $c GP
041 0 $a eng
050 4 $a QA76.9.H84
072 7 $a UYZG $2 bicssc
072 7 $a COM070000 $2 bisacsh
082 0 4 $a 004.36019 $2 23
090 $a QA76.9.H84 $b D127 2015
111 2 $a Dagstuhl Seminar on Evaluation in the Crowd: Crowdsourcing and Human-Centered Experiments $d (2015 : $c Dagstuhl, Wadern, Germany) $3 3258662
245 1 0 $a Evaluation in the crowd $h [electronic resource] : $b crowdsourcing and human-centered experiments : Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22 - 27, 2015 : revised contributions / $c edited by Daniel Archambault, Helen Purchase, Tobias Hossfeld.
246 3 $a Seminar 15481
260 $a Cham : $b Springer International Publishing : $b Imprint: Springer, $c 2017.
300 $a vii, 191 p. : $b ill., digital ; $c 24 cm.
490 1 $a Lecture notes in computer science, $x 0302-9743 ; $v 10264
505 0 $a Crowdsourcing Versus the Laboratory: Towards Human-centered Experiments Using the Crowd -- Understanding The Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing -- Crowdsourcing Technology to Support Academic Research -- Crowdsourcing for Information Visualization: Promises and Pitfalls -- Cognitive Information Theories of Psychology and Applications with Visualization and HCI through Crowdsourcing Platforms -- Crowdsourcing Quality of Experience Experiments.
520 $a As the outcome of the Dagstuhl Seminar 15481 on Crowdsourcing and Human-Centered Experiments, this book is a primer for computer science researchers who intend to use crowdsourcing technology for human centered experiments. The focus of this Dagstuhl seminar, held in Dagstuhl Castle in November 2015, was to discuss experiences and methodological considerations when using crowdsourcing platforms to run human-centered experiments to test the effectiveness of visual representations. The inspiring Dagstuhl atmosphere fostered discussions and brought together researchers from different research directions. The papers provide information on crowdsourcing technology and experimental methodologies, comparisons between crowdsourcing and lab experiments, the use of crowdsourcing for visualisation, psychology, QoE and HCI empirical studies, and finally the nature of crowdworkers and their work, their motivation and demographic background, as well as the relationships among people forming the crowdsourcing community.
650 0 $a Human computation $v Congresses. $3 3258666
650 1 4 $a Computer Science. $3 626642
650 2 4 $a User Interfaces and Human Computer Interaction. $3 892554
650 2 4 $a Computer Communication Networks. $3 775497
650 2 4 $a Information Systems Applications (incl. Internet) $3 1565452
650 2 4 $a Economic Theory/Quantitative Economics/Mathematical Methods. $3 2162305
700 1 $a Archambault, Daniel. $3 3258663
700 1 $a Purchase, Helen. $3 2072462
700 1 $a Hossfeld, Tobias. $3 3258664
710 2 $a SpringerLink (Online service) $3 836513
773 0 $t Springer eBooks
830 0 $a Lecture notes in computer science ; $v 10264. $3 3258665
856 4 0 $u http://dx.doi.org/10.1007/978-3-319-66435-4
950 $a Computer Science (Springer-11645)
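For readers who want to work with this record programmatically rather than read the MARC view above, the following is a minimal sketch of how the 245 (title), 650 (subject), and 856 (online access) fields might be pulled out with the third-party pymarc library. It assumes the record has been exported as binary MARC to a local file named record.mrc; that filename and the choice of fields are illustrative assumptions, not part of the catalogue record itself.

# Illustrative sketch only: assumes the record above is available as binary
# MARC (ISO 2709) in a hypothetical local file "record.mrc" and that the
# third-party pymarc library is installed.
from pymarc import MARCReader

with open("record.mrc", "rb") as fh:
    for record in MARCReader(fh):
        # 245: title statement ($a title proper, $b remainder of title)
        title_field = record.get_fields("245")[0]
        title = " ".join(title_field.get_subfields("a", "b"))

        # 650: topical subject headings ($a topic, $v form subdivision)
        subjects = [
            " -- ".join(f.get_subfields("a", "v"))
            for f in record.get_fields("650")
        ]

        # 856 $u: online access URL (the DOI link in this record)
        urls = [
            u
            for f in record.get_fields("856")
            for u in f.get_subfields("u")
        ]

        print(title)
        print(subjects)
        print(urls)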
Holdings (1 item)
Barcode: W9323293
Location: Electronic resources
Circulation category: 11.線上閱覽_V (online viewing)
Material type: E-book
Call number: EB QA76.9.H84
Use type: Normal
Loan status: On shelf
Holds: 0