Assessing Grammatical Features across Score Levels in Second Language Writing: A Corpus-Based Analysis.
Record Type:
Electronic resources : Monograph/item
Title/Author:
Assessing Grammatical Features across Score Levels in Second Language Writing: A Corpus-Based Analysis.
Author:
Kim, Susie.
Published:
Ann Arbor : ProQuest Dissertations & Theses, 2019.
Description:
133 p.
Notes:
Source: Dissertations Abstracts International, Volume: 80-12, Section: A.
Contained By:
Dissertations Abstracts International, 80-12A.
Subject:
Linguistics.
Online resource:
https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=13881225
ISBN:
9781392155882
Assessing Grammatical Features across Score Levels in Second Language Writing: A Corpus-Based Analysis.
- Ann Arbor : ProQuest Dissertations & Theses, 2019 - 133 p.
Source: Dissertations Abstracts International, Volume: 80-12, Section: A.
Thesis (Ph.D.)--Michigan State University, 2019.
This item must not be sold to any third party vendors.
Recent research in the areas of second language testing and learner corpus research has provided increased insight into linguistic features of various score levels and into the meaning of a test score (Cushing, 2017; Knoch & Chapelle, 2018). However, language testing researchers have asserted the need to select linguistic features that are relevant to the test construct for test validation purposes (Egbert, 2017; Xi, 2017). In addition, the Common European Framework of Reference (CEFR) has been widely adopted in testing contexts and provides level descriptions for linguistic abilities, but empirical validation of its use in various testing contexts is critical (Wisniewski, 2017, 2018). Addressing these two limitations, I drew upon learner-produced written English from a large-scale English exam, the Certificate of English Language Competency (CEFR B2-level certification). The aim of the study was to (a) investigate specific grammatical features and the overall linguistic accuracy of second language English texts to reveal patterns of language use at different score levels, and (b) examine how well rating rubric descriptors reflect characteristics of examinee texts and differentiate between score levels, to find evidence for test validity. In order to provide concrete, context-relevant grammatical features for investigation, I selected 14 grammatical features from the English Profile studies (English Profile, 2015; Hawkins & Filipovic, 2012), which have also been documented in L2 writing research. Data included 560 texts written on three different topics and ranging across five levels of performance. I extracted occurrences of the 14 grammatical features from the corpus using Natural Language Processing tools and analyzed the occurrences attested in the corpus.
Additionally, a subset of the texts was manually coded for errors to examine the overall accuracy of each text. Consistent with findings in the existing literature, I found significant differences in the frequencies of certain clausal features across lower score levels. Both the frequencies of the 14 grammatical features and the overall number of different types of these features used in each text were moderately useful in predicting the grammar subscore. I identified co-occurring patterns of the target grammatical features by performing a principal components analysis. The results showed that grammar structures of similar types (e.g., finite, non-finite) and functions (e.g., complement, noun modifier) tended to occur together and exhibited (cross-sectional) developmental patterns. For the subset of data coded for errors, the error-free clause ratio was calculated, which significantly distinguished between each pair of adjacent levels. This study's findings highlight the need for empirical investigation of how learner language has been described by experts in proficiency descriptors (e.g., Council of Europe, 2001, 2018) and how reliably the constructs in rubric descriptors are attested in test performance data. I suggest that writing assessment materials can benefit from reference to the tangible characteristics of L2 development found in writing development research (e.g., phrasal complexity, morphological accuracy, and association strength between a construction and its lexis).
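The error-free clause ratio mentioned in the abstract is a simple proportion of error-free clauses over all clauses in a text. A minimal sketch follows; the per-clause error counts are a hypothetical data shape for illustration, not the study's actual annotation scheme.

```python
def error_free_clause_ratio(clause_error_counts):
    """Proportion of clauses coded with zero errors in a text.

    clause_error_counts: a list with one manually coded error count
    per clause (hypothetical representation of the error coding).
    """
    if not clause_error_counts:
        return 0.0
    error_free = sum(1 for n in clause_error_counts if n == 0)
    return error_free / len(clause_error_counts)

# Example: a text with 8 clauses, 6 of them error-free.
text_coding = [0, 0, 1, 0, 0, 2, 0, 0]
print(error_free_clause_ratio(text_coding))  # 0.75
```

Higher ratios indicate more accurate texts, which is why the measure can distinguish adjacent score levels.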
ISBN: 9781392155882
Subjects--Topical Terms: Linguistics.
Subjects--Index Terms: CEFR
LDR
:04555nmm a2200373 4500
001
2279490
005
20210823080229.5
008
220723s2019 ||||||||||||||||| ||eng d
020
$a
9781392155882
035
$a
(MiAaPQ)AAI13881225
035
$a
(MiAaPQ)grad.msu:16818
035
$a
AAI13881225
040
$a
MiAaPQ
$c
MiAaPQ
100
1
$a
Kim, Susie.
$3
3557948
245
1 0
$a
Assessing Grammatical Features across Score Levels in Second Language Writing: A Corpus-Based Analysis.
260
1
$a
Ann Arbor :
$b
ProQuest Dissertations & Theses,
$c
2019
300
$a
133 p.
500
$a
Source: Dissertations Abstracts International, Volume: 80-12, Section: A.
500
$a
Publisher info.: Dissertation/Thesis.
500
$a
Advisor: Polio, Charlene; Reed, Daniel J.
502
$a
Thesis (Ph.D.)--Michigan State University, 2019.
506
$a
This item must not be sold to any third party vendors.
590
$a
School code: 0128.
650
4
$a
Linguistics.
$3
524476
650
4
$a
English as a Second Language.
$3
3423938
653
$a
CEFR
653
$a
Grammatical features
653
$a
Second language assessment
653
$a
Second language writing
690
$a
0290
690
$a
0441
710
2
$a
Michigan State University.
$b
Second Language Studies - Doctor of Philosophy.
$3
3171490
773
0
$t
Dissertations Abstracts International
$g
80-12A.
790
$a
0128
791
$a
Ph.D.
792
$a
2019
793
$a
English
856
4 0
$u
https://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=13881225
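The MARC display above lists each field as a numeric tag followed by `$`-prefixed subfield codes (`$a`, `$b`, `$c`, ...). As a minimal sketch of that structure, a few fields from this record can be represented as a tag-keyed mapping; the dict layout is only an illustration, not a standard MARC parser.

```python
# A few fields from the MARC record above, keyed by tag.
# Repeatable fields (e.g., 653 index terms) are kept as lists.
record = {
    "020": [{"a": "9781392155882"}],                       # ISBN
    "100": [{"a": "Kim, Susie."}],                         # main author
    "245": [{"a": "Assessing Grammatical Features across Score Levels "
                  "in Second Language Writing: A Corpus-Based Analysis."}],
    "260": [{"a": "Ann Arbor :",
             "b": "ProQuest Dissertations & Theses,",
             "c": "2019"}],                                # publication
    "653": [{"a": "CEFR"}, {"a": "Grammatical features"},
            {"a": "Second language assessment"},
            {"a": "Second language writing"}],             # index terms
}

def subfield(rec, tag, code):
    """Return every value of a subfield code under a field tag."""
    return [f[code] for f in rec.get(tag, []) if code in f]

print(subfield(record, "020", "a"))     # ['9781392155882']
print(subfield(record, "653", "a")[0])  # CEFR
```

A real application would use a MARC library rather than hand-built dicts, but the tag/subfield shape is the same.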
Holdings:
Inventory Number: W9431223
Location Name: Electronic Resources
Item Class: 11. Online Reading_V
Material type: e-Book
Call number: EB
Usage Class: Normal
Loan Status: On shelf
No. of reservations: 0