Impact of AWE Rubrics and Automated Assessment on EFL Writing Instruction

Jinlan‎ Tang, Yi'an Wu
DOI: 10.4018/IJCALLT.2017040104

Abstract

This paper addresses a gap in the research literature on rubrics by investigating how the formative use of the rubrics of an automated writing evaluation (AWE) tool, the Writing Roadmap, along with a novel type of meta-cognitive activity, i.e., automated assessment, assisted EFL writing instruction. A one-year teaching experiment incorporating the use of the tool was undertaken at the tertiary level. A mixed-methods research approach, drawing on questionnaires, interviews, and the participants' interim project reports, was employed to evaluate the efficacy of the teaching experiment. The research demonstrated that the formative use of AWE rubrics along with automated assessment mediated writing instruction by offering timely and objective assessment, aligning teaching and assessment goals, aiding the feedback process, increasing student-teacher interaction, and promoting learner autonomy.

1. Introduction

With the rapid development of information and communications technology and increasing interaction across borders, English has become a major language of international communication, and English language education has been attached ever-growing importance. Among the four key English skills, writing poses particular challenges for learners, especially in the EFL context. In China, for example, writing is the skill learners most need to improve in the College English Test (Jin, 2010), and the same is true of their performance in the writing section of IELTS (IELTS, 2015). It has been noted that writing skill development involves a continuous cycle of drafting, assessment, feedback, and revision, of which assessment and feedback are essential. But the task is challenging, especially in China, where qualified English teachers are in great demand.

Research shows that an important principle of effective writing assessment is to have comprehensible and explicit assessment rubrics and to communicate them to the students (e.g., Brown, Race, & Smith, 1996). Attempts have been made to use rubrics to inform students of the assessment criteria, guide their writing, and assist with their revision. Such assessment transparency and encouragement of self-regulation have been found to produce positive results (e.g., Brookhart, 2014; Laurian & Fitzgerald, 2013; Panadero et al., 2012; Wang, 2014).

However, there is limited evidence that using rubrics in itself leads to improved performance (Panadero & Jonsson, 2013). In fact, rubrics are seen as most effective when used along with meta-cognitive activities such as peer and self-assessment (Andrade, Wang, Du, & Akawi, 2009; Panadero, Alonso-Tapia, & Huertas, 2012), self-assessment and exemplars (Jonsson, 2010), and explicit instruction and modeling, training in monitoring, and scaffolding of writing (Brown, Glasswell, & Harland, 2004). Most such meta-cognitive activities involve teacher intervention and students' use of self-regulatory skills, which can be difficult to sustain given the conflict between the demand for feedback on multiple drafts and the limited number of teachers.

With the increasing use of computers in assessment and teaching, and the acknowledged benefits of technology-enhanced assessment, such as offering immediate feedback and motivating students (e.g., Tang, Rich, & Wang, 2012), this study addresses a gap in the research literature on rubrics by exploring how the integral use of the Writing Roadmap (WRM) rubrics, along with its automated assessment, assists writing instruction in the EFL classroom.
