An examination of evaluation methods for comparing two information retrieval systems supporting teacher performance.
Record type: Bibliographic - electronic resource : Monograph/item
Title/Author: An examination of evaluation methods for comparing two information retrieval systems supporting teacher performance.
Author: Schatz, Steven Craig.
Extent: 148 p.
Note: Source: Dissertation Abstracts International, Volume: 65-05, Section: A, page: 1578.
Contained by: Dissertation Abstracts International, 65-05A.
Subject: Information Science.
Electronic resource: http://wwwlib.umi.com/dissertations/fullcit/3134042
Electronic resource: http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3134042
LDR    02847nmm 2200337 4500
001    1866477
005    20050105135826.5
008    130614s2004 eng d
035    $a (UnM)AAI3134042
035    $a AAI3134042
040    $a UnM $c UnM
100 1  $a Schatz, Steven Craig. $3 1953861
245 13 $a An examination of evaluation methods for comparing two information retrieval systems supporting teacher performance.
300    $a 148 p.
500    $a Source: Dissertation Abstracts International, Volume: 65-05, Section: A, page: 1578.
500    $a Chair: Thomas Schwen.
502    $a Thesis (Ph.D.)--Indiana University, 2004.
520    $a This study examines existent and new methods for evaluating the success of information retrieval systems. The theory underlying current methods is not robust enough to handle the current volume of information. Traditional measures rely on judgments of whether a document is relevant to a particular question. A good system returns all the relevant documents and no extraneous documents. There is a rich literature questioning the efficacy of relevance judgments. Such questions as: Relevant to who? When? To what purpose? are not well answered in traditional theory.
520    $a In this study, two new measures (Spink's Information Need and Cooper's Utility) are used in evaluating two systems, comparing these new measures with traditional measures and each other.
520    $a Two very different systems of searching were used to search the same set of 500 documents. One system, a text based system, resembled most common web search engines. The other system used a series of meta data tags for searching.
520    $a Thirty-four educators searched for information using both search engines and evaluated the information retrieved by each. The participants searched a total of four times---twice using each system. Construct measures, derived by multiplying each of the three measures (traditional, information need, and utility) by a rating of satisfaction were compared using two way analysis of variance.
520    $a Results indicated that there was a significant correlation between the three measures---so the new measures provided an equivalent method of evaluating systems and have some significant advantages---including no need for relevance judgments and easy application in situ. While the main focus of the study was on the methods of evaluation, the evaluation in this case showed that the text system was better than the tag based system.
590    $a School code: 0093.
650  4 $a Information Science. $3 1017528
650  4 $a Education, Technology. $3 1017498
650  4 $a Library Science. $3 881164
690    $a 0723
690    $a 0710
690    $a 0399
710 20 $a Indiana University. $3 960096
773 0  $t Dissertation Abstracts International $g 65-05A.
790 10 $a Schwen, Thomas, $e advisor
790    $a 0093
791    $a Ph.D.
792    $a 2004
856    $u http://wwwlib.umi.com/dissertations/fullcit/3134042
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3134042
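
The first abstract paragraph (520 field) above describes the traditional, relevance-based measures: a good system returns all the relevant documents and no extraneous ones, which in IR terms corresponds to high recall and high precision. The sketch below shows how those two measures are computed from binary relevance judgments; it is illustrative only, and the document IDs and judgment sets are invented, not taken from the dissertation.

    # Traditional IR measures from binary relevance judgments (illustrative only).
    def precision_recall(retrieved, relevant):
        """Return (precision, recall) for one query, given sets of document IDs."""
        hits = retrieved & relevant                        # relevant documents that were returned
        precision = len(hits) / len(retrieved) if retrieved else 0.0
        recall = len(hits) / len(relevant) if relevant else 0.0
        return precision, recall

    retrieved = {"d1", "d2", "d3", "d4"}                   # hypothetical result set from one system
    relevant = {"d2", "d4", "d7"}                          # hypothetical assessor judgments
    print(precision_recall(retrieved, relevant))           # -> (0.5, 0.666...)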
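The fourth abstract paragraph describes the analysis itself: each of the three measures (traditional, information need, utility) is multiplied by a satisfaction rating to form a construct score, and the scores are compared with a two-way analysis of variance. Below is a minimal sketch of that kind of analysis, assuming a long-format table and using synthetic data; the column names, rating scale, and model formula are assumptions made for illustration, not the study's actual data or code.

    # Two-way ANOVA over satisfaction-weighted construct scores (synthetic data).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    rng = np.random.default_rng(0)
    rows = []
    for participant in range(34):                          # the study used 34 educators
        for system in ("text", "tag"):                     # two search systems
            for measure in ("traditional", "info_need", "utility"):
                raw = rng.uniform(0, 1)                    # synthetic measure score
                satisfaction = rng.integers(1, 6)          # synthetic 1-5 satisfaction rating
                rows.append({"participant": participant, "system": system,
                             "measure": measure, "construct": raw * satisfaction})

    df = pd.DataFrame(rows)
    model = ols("construct ~ C(system) * C(measure)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))                 # two-way ANOVA table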
Holdings
Barcode: W9185353
Location: Electronic resources
Circulation category: 11. Online reading_V
Material type: E-book
Call number: EB
Use type: General use (Normal)
Loan status: On shelf
Reservations: 0