An examination of evaluation methods for comparing two information retrieval systems supporting teacher performance.

Record Type: Electronic resources : Monograph/item
Title/Author: An examination of evaluation methods for comparing two information retrieval systems supporting teacher performance. / Schatz, Steven Craig.
Author: Schatz, Steven Craig.
Description: 148 p.
Notes: Source: Dissertation Abstracts International, Volume: 65-05, Section: A, page: 1578.
Contained By: Dissertation Abstracts International, 65-05A.
Subject: Information Science.
Online resource: http://wwwlib.umi.com/dissertations/fullcit/3134042
Online resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3134042
LDR  02847nmm 2200337 4500
001  1866477
005  20050105135826.5
008  130614s2004 eng d
035     $a (UnM)AAI3134042
035     $a AAI3134042
040     $a UnM $c UnM
100  1  $a Schatz, Steven Craig. $3 1953861
245  13 $a An examination of evaluation methods for comparing two information retrieval systems supporting teacher performance.
300     $a 148 p.
500     $a Source: Dissertation Abstracts International, Volume: 65-05, Section: A, page: 1578.
500     $a Chair: Thomas Schwen.
502     $a Thesis (Ph.D.)--Indiana University, 2004.
520     $a This study examines existent and new methods for evaluating the success of information retrieval systems. The theory underlying current methods is not robust enough to handle the current volume of information. Traditional measures rely on judgments of whether a document is relevant to a particular question. A good system returns all the relevant documents and no extraneous documents. There is a rich literature questioning the efficacy of relevance judgments. Such questions as: Relevant to who? When? To what purpose? are not well answered in traditional theory.
520     $a In this study, two new measures (Spink's Information Need and Cooper's Utility) are used in evaluating two systems, comparing these new measures with traditional measures and each other.
520     $a Two very different systems of searching were used to search the same set of 500 documents. One system, a text based system, resembled most common web search engines. The other system used a series of meta data tags for searching.
520     $a Thirty-four educators searched for information using both search engines and evaluated the information retrieved by each. The participants searched a total of four times---twice using each system. Construct measures, derived by multiplying each of the three measures (traditional, information need, and utility) by a rating of satisfaction, were compared using two way analysis of variance.
520     $a Results indicated that there was a significant correlation between the three measures---so the new measures provided an equivalent method of evaluating systems and have some significant advantages---including no need for relevance judgments and easy application in situ. While the main focus of the study was on the methods of evaluation, the evaluation in this case showed that the text system was better than the tag based system.
590     $a School code: 0093.
650   4 $a Information Science. $3 1017528
650   4 $a Education, Technology. $3 1017498
650   4 $a Library Science. $3 881164
690     $a 0723
690     $a 0710
690     $a 0399
710  20 $a Indiana University. $3 960096
773  0  $t Dissertation Abstracts International $g 65-05A.
790  10 $a Schwen, Thomas, $e advisor
790     $a 0093
791     $a Ph.D.
792     $a 2004
856     $u http://wwwlib.umi.com/dissertations/fullcit/3134042
856  40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3134042
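The measures described in the abstract (520 fields) can be sketched in a few lines of Python. This is an illustrative sketch only: the function names and sample data are assumptions, and the dissertation itself defines how Spink's Information Need and Cooper's Utility are operationalized. Shown here are the traditional relevance-based measures (precision and recall) and the construct scoring step the abstract describes, in which each raw measure is multiplied by a satisfaction rating before comparison.

```python
def precision(retrieved, relevant):
    """Traditional measure: fraction of retrieved documents judged relevant."""
    retrieved, relevant = set(retrieved), set(relevant)
    return len(retrieved & relevant) / len(retrieved) if retrieved else 0.0

def recall(retrieved, relevant):
    """Traditional measure: fraction of relevant documents that were retrieved."""
    retrieved, relevant = set(retrieved), set(relevant)
    return len(retrieved & relevant) / len(relevant) if relevant else 0.0

def construct_score(measure_value, satisfaction):
    """Construct measure per the abstract: a raw measure (traditional,
    information need, or utility) multiplied by a satisfaction rating."""
    return measure_value * satisfaction

# Hypothetical result of one participant's search over the document set:
retrieved = ["d1", "d2", "d3", "d4"]   # documents the system returned
relevant = ["d2", "d3", "d7"]          # documents judged relevant to the query

p = precision(retrieved, relevant)     # 2 of 4 retrieved are relevant -> 0.5
r = recall(retrieved, relevant)        # 2 of 3 relevant were retrieved
print(p, r, construct_score(p, satisfaction=4))
```

Note the dependence of both traditional measures on external relevance judgments; the abstract's point is that the information-need and utility measures avoid that requirement, which is what makes them easy to apply in situ.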
Holdings (1 record)
Location: ALL (Electronic Resources / 電子資源)

Inventory Number: W9185353
Location Name: Electronic Resources (電子資源)
Item Class: 11.線上閱覽_V (Online Reading)
Material Type: E-book (電子書)
Call Number: EB
Usage Class: Normal (一般使用)
Loan Status: On shelf
No. of Reservations: 0