Dagstuhl Seminar on Evaluation in the Crowd: Crowdsourcing and Human-Centered Experiments (2015 : Dagstuhl, Wadern, Germany)
Evaluation in the crowd = crowdsourcing and human-centered experiments : Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22 - 27, 2015 : revised contributions /
Record Type:
Electronic resources : Monograph/item
Title/Author:
Evaluation in the crowd / edited by Daniel Archambault, Helen Purchase, Tobias Hossfeld.
Remainder of title:
crowdsourcing and human-centered experiments : Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22 - 27, 2015 : revised contributions /
Varying form of title:
Seminar 15481
Other author:
Archambault, Daniel.
Corporate name:
Dagstuhl Seminar on Evaluation in the Crowd: Crowdsourcing and Human-Centered Experiments
Published:
Cham : Springer International Publishing : Imprint: Springer, 2017.
Description:
vii, 191 p. : ill., digital ; 24 cm.
Contents:
Crowdsourcing Versus the Laboratory: Towards Human-centered Experiments Using the Crowd -- Understanding The Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing -- Crowdsourcing Technology to Support Academic Research -- Crowdsourcing for Information Visualization: Promises and Pitfalls -- Cognitive Information Theories of Psychology and Applications with Visualization and HCI through Crowdsourcing Platforms -- Crowdsourcing Quality of Experience Experiments.
Contained By:
Springer eBooks
Subject:
Human computation -- Congresses.
Online resource:
http://dx.doi.org/10.1007/978-3-319-66435-4
ISBN:
9783319664354
Evaluation in the crowd [electronic resource] : crowdsourcing and human-centered experiments : Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22 - 27, 2015 : revised contributions / edited by Daniel Archambault, Helen Purchase, Tobias Hossfeld. - Cham : Springer International Publishing, 2017. - vii, 191 p. : ill., digital ; 24 cm. - (Lecture notes in computer science, 0302-9743 ; 10264).
As the outcome of the Dagstuhl Seminar 15481 on Crowdsourcing and Human-Centered Experiments, this book is a primer for computer science researchers who intend to use crowdsourcing technology for human centered experiments. The focus of this Dagstuhl seminar, held in Dagstuhl Castle in November 2015, was to discuss experiences and methodological considerations when using crowdsourcing platforms to run human-centered experiments to test the effectiveness of visual representations. The inspiring Dagstuhl atmosphere fostered discussions and brought together researchers from different research directions. The papers provide information on crowdsourcing technology and experimental methodologies, comparisons between crowdsourcing and lab experiments, the use of crowdsourcing for visualisation, psychology, QoE and HCI empirical studies, and finally the nature of crowdworkers and their work, their motivation and demographic background, as well as the relationships among people forming the crowdsourcing community.
ISBN: 9783319664354
Standard No.: 10.1007/978-3-319-66435-4 (doi)
Subjects--Topical Terms: Human computation -- Congresses.
LC Class. No.: QA76.9.H84
Dewey Class. No.: 004.36019
LDR      02700nmm a2200337 a 4500
001      2108891
003      DE-He213
005      20170927200451.0
006      m d
007      cr nn 008maaau
008      180519s2017 gw s 0 eng d
020      $a 9783319664354 $q (electronic bk.)
020      $a 9783319664347 $q (paper)
024  7   $a 10.1007/978-3-319-66435-4 $2 doi
035      $a 978-3-319-66435-4
040      $a GP $c GP
041  0   $a eng
050  4   $a QA76.9.H84
072  7   $a UYZG $2 bicssc
072  7   $a COM070000 $2 bisacsh
082  0 4 $a 004.36019 $2 23
090      $a QA76.9.H84 $b D127 2015
111  2   $a Dagstuhl Seminar on Evaluation in the Crowd: Crowdsourcing and Human-Centered Experiments $d (2015 : $c Dagstuhl, Wadern, Germany) $3 3258662
245  1 0 $a Evaluation in the crowd $h [electronic resource] : $b crowdsourcing and human-centered experiments : Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22 - 27, 2015 : revised contributions / $c edited by Daniel Archambault, Helen Purchase, Tobias Hossfeld.
246  3   $a Seminar 15481
260      $a Cham : $b Springer International Publishing : $b Imprint: Springer, $c 2017.
300      $a vii, 191 p. : $b ill., digital ; $c 24 cm.
490  1   $a Lecture notes in computer science, $x 0302-9743 ; $v 10264
505  0   $a Crowdsourcing Versus the Laboratory: Towards Human-centered Experiments Using the Crowd -- Understanding The Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing -- Crowdsourcing Technology to Support Academic Research -- Crowdsourcing for Information Visualization: Promises and Pitfalls -- Cognitive Information Theories of Psychology and Applications with Visualization and HCI through Crowdsourcing Platforms -- Crowdsourcing Quality of Experience Experiments.
520      $a As the outcome of the Dagstuhl Seminar 15481 on Crowdsourcing and Human-Centered Experiments, this book is a primer for computer science researchers who intend to use crowdsourcing technology for human centered experiments. The focus of this Dagstuhl seminar, held in Dagstuhl Castle in November 2015, was to discuss experiences and methodological considerations when using crowdsourcing platforms to run human-centered experiments to test the effectiveness of visual representations. The inspiring Dagstuhl atmosphere fostered discussions and brought together researchers from different research directions. The papers provide information on crowdsourcing technology and experimental methodologies, comparisons between crowdsourcing and lab experiments, the use of crowdsourcing for visualisation, psychology, QoE and HCI empirical studies, and finally the nature of crowdworkers and their work, their motivation and demographic background, as well as the relationships among people forming the crowdsourcing community.
650  0   $a Human computation $v Congresses. $3 3258666
650  1 4 $a Computer Science. $3 626642
650  2 4 $a User Interfaces and Human Computer Interaction. $3 892554
650  2 4 $a Computer Communication Networks. $3 775497
650  2 4 $a Information Systems Applications (incl. Internet) $3 1565452
650  2 4 $a Economic Theory/Quantitative Economics/Mathematical Methods. $3 2162305
700  1   $a Archambault, Daniel. $3 3258663
700  1   $a Purchase, Helen. $3 2072462
700  1   $a Hossfeld, Tobias. $3 3258664
710  2   $a SpringerLink (Online service) $3 836513
773  0   $t Springer eBooks
830  0   $a Lecture notes in computer science ; $v 10264. $3 3258665
856  4 0 $u http://dx.doi.org/10.1007/978-3-319-66435-4
950      $a Computer Science (Springer-11645)
Items:
Inventory Number: W9323293
Location Name: Electronic Resources (電子資源)
Item Class: 11. Online Reading (11.線上閱覽_V)
Material type: E-book (電子書)
Call number: EB QA76.9.H84
Usage Class: Normal (一般使用)
Loan Status: On shelf
No. of reservations: 0