Automated Writing Evaluation (AWE) System
An Automated Writing Evaluation (AWE) System is an Automated System that uses natural language processing and machine learning to assess written text, provide feedback, and support writing skill development.
- AKA: Automated Essay Scoring System, AI Writing Assistant, Writing Feedback Tool, Automated Text Evaluation System.
- Context:
- It can analyze grammar, syntax, coherence, and style in student essays, professional documents, or creative writing.
- It can integrate with learning management systems (LMS) to streamline grading workflows for educators.
- It can employ approaches such as neural networks or rule-based systems to detect plagiarism or logical inconsistencies.
- It can adapt to domain-specific writing standards (e.g., academic writing, technical reports, business proposals).
- It can improve writing proficiency by offering personalized feedback on weaknesses (e.g., vocabulary diversity, argument structure); a minimal illustrative sketch of such feedback appears just before the References section.
- ...
- Examples:
- Educational AWE Systems, such as:
- Grammarly, focusing on grammar checking and tone adjustment.
- Turnitin Feedback Studio, detecting plagiarism and providing rubric-based scoring.
- ETS e-rater, used in standardized tests like TOEFL for automated essay scoring.
- Research-Backed Systems, such as:
- WriteLab, leveraging NLP for argumentation analysis.
- Scribbr, designed for non-native speakers to improve academic writing.
- Counter-Examples:
- Basic Spell Checkers, which lack contextual feedback (e.g., Microsoft Word Spell Check).
- Manual Grading, where human teachers assess writing without automation.
- Generic Text Editors (e.g., Notepad), which do not evaluate writing quality.
- See: Natural Language Processing, Educational Technology, AI in Education, Formative Feedback, Plagiarism Detection, Learning Analytics.
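To make the capability bullets above concrete, the following minimal Python sketch shows how a simple rule-based AWE component might turn surface features such as vocabulary diversity and sentence length into formative feedback. It is illustrative only; the function and field names are hypothetical and it does not reproduce the architecture of any system named in the Examples.

```python
# Minimal sketch (not any specific product): rule-of-thumb AWE feedback
# based on surface features such as vocabulary diversity and sentence length.
import re
from dataclasses import dataclass, field

@dataclass
class Feedback:
    scores: dict = field(default_factory=dict)
    comments: list = field(default_factory=list)

def evaluate_essay(text: str) -> Feedback:
    """Compute simple surface features and turn them into formative comments."""
    sentences = [s for s in re.split(r"[.!?]+\s*", text.strip()) if s]
    tokens = re.findall(r"[A-Za-z']+", text.lower())
    fb = Feedback()

    # Vocabulary diversity: type-token ratio (unique words / total words).
    ttr = len(set(tokens)) / max(len(tokens), 1)
    fb.scores["vocabulary_diversity"] = round(ttr, 2)
    if ttr < 0.4:
        fb.comments.append("Try varying your word choice; many words are repeated.")

    # Average sentence length as a rough readability signal.
    avg_len = len(tokens) / max(len(sentences), 1)
    fb.scores["avg_sentence_length"] = round(avg_len, 1)
    if avg_len > 30:
        fb.comments.append("Several sentences are very long; consider splitting them.")

    # Crude rule-based check for an explicit argument marker.
    if not any(m in text.lower() for m in ("because", "therefore", "for example")):
        fb.comments.append("Support your claims with reasons or examples.")

    return fb

if __name__ == "__main__":
    essay = "Dogs are good. Dogs are good pets. People like dogs."
    result = evaluate_essay(essay)
    print(result.scores)
    print(result.comments)
```

Deployed systems generally replace such hand-written heuristics with trained scoring models, but the overall pipeline shape, extracting features from the text and then mapping them to scores and comments, is the same.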
References
2025
- (Liu et al., 2025) ⇒ Zhexiong Liu, Diane Litman, Elaine Wang, Tianwen Li, & Mason Gobat. (2025). "eRevise+RF: A Writing Evaluation System for Assessing Student Essay Revisions and Providing Formative Feedback". In: arXiv.
- QUOTE: The ability to revise essays in response to feedback is important for students' writing success. An automated writing evaluation (AWE) system that supports students in revising their essays is thus essential. We present eRevise+RF, an enhanced AWE system for assessing student essay revisions (e.g., changes made to an essay to improve its quality in response to essay feedback) and providing revision feedback.
2024a
- (Fan & Ma, 2024) ⇒ Ning Fan & Yingying Ma. (2024). "The Effects of Automated Writing Evaluation (AWE) Feedback on Students' English Writing Quality: A Systematic Literature Review". In: Language Teaching Research Quarterly.
- QUOTE: The purpose of this review is to examine the effects of automated writing evaluation (AWE) feedback on students' English writing performance. We systematically reviewed studies that have empirically focused on this purpose.
This review uses several combinations of key words to search in the databases of JSTOR, SSCI, and ERIC for peer-reviewed articles published from 2005 to April 2020. The systematic review produced 22 eligible studies categorized as within-group and between group studies based on Stevenson and Phakiti's (2014) categorization.
2024b
- (Wang et al., 2024) ⇒ Izia Xiaoxiao Wang, Xihan Wu, Edith Coates, Min Zeng, Jiexin Kuang, Siliang Liu, Mengyang Qiu, & Jungyeul Park. (2024). "Neural Automated Writing Evaluation with Corrective Feedback". In: arXiv.
- QUOTE: The utilization of technology in second language learning and teaching has become ubiquitous. For the assessment of writing specifically, automated writing evaluation (AWE) and grammatical error correction (GEC) have become immensely popular and effective methods for enhancing writing proficiency and delivering instant feedback. By leveraging the power of natural language processing (NLP) and machine learning algorithms, these systems provide more accurate and unbiased scoring.
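The pairing of AWE with grammatical error correction (GEC) that the authors describe can be approximated with off-the-shelf tooling. The sketch below assumes the language_tool_python wrapper around the open-source LanguageTool checker; the package, its rule coverage, and the attribute names are assumptions about that library, not part of the cited system.

```python
# Hedged sketch: surface-level grammatical error detection for instant feedback,
# assuming the language_tool_python package (pip install language_tool_python),
# which wraps the open-source LanguageTool checker.
import language_tool_python

def grammar_feedback(text: str) -> list[str]:
    """Return a human-readable message for each detected grammar/spelling issue."""
    tool = language_tool_python.LanguageTool("en-US")
    try:
        matches = tool.check(text)
        return [
            f"{m.message} (suggestions: {', '.join(m.replacements[:3]) or 'none'})"
            for m in matches
        ]
    finally:
        tool.close()  # shut down the background LanguageTool server

if __name__ == "__main__":
    for comment in grammar_feedback("She go to school yesterday and buyed a book."):
        print("-", comment)
```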
2023a
- (Escalante et al., 2023) ⇒ Juan Escalante, Austin Pack, & Alex Barrett. (2023). "AI-generated feedback on writing: insights into efficacy and ENL student preference". In: International Journal of Educational Technology in Higher Education.
- QUOTE: Automated writing evaluation (AWE) systems such as Grammarly and Pigai assist learners and educators in the writing process by providing corrective feedback on learner writing. These systems, and older tools such as spelling and grammar checkers, rely on natural language processing to identify errors and infelicities in writing and suggest improvements. However, with the recent unleashing of highly sophisticated generative pretrained transformer (GPT) large language models (LLMs), such as GPT-4 by OpenAI and PaLM 2 by Google, AWE may be entering a new era.
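The shift the authors describe, from rule-based checkers toward GPT-style large language models, can be sketched as follows. The example assumes the OpenAI Python SDK (v1.x) with an API key in the environment; the prompt wording, function name, and model choice are illustrative assumptions, not the method of Grammarly, Pigai, or the study cited here.

```python
# Hedged sketch of LLM-generated writing feedback, assuming the OpenAI
# Python SDK (v1.x) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

FEEDBACK_PROMPT = (
    "You are a writing tutor. For the student essay below, list up to five "
    "specific, actionable suggestions covering grammar, organization, and "
    "argument quality. Do not rewrite the essay for the student."
)

def llm_feedback(essay: str, model: str = "gpt-4") -> str:
    """Request formative feedback on an essay from a chat-completion model."""
    response = client.chat.completions.create(
        model=model,
        temperature=0.2,  # keep feedback relatively consistent across runs
        messages=[
            {"role": "system", "content": FEEDBACK_PROMPT},
            {"role": "user", "content": essay},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(llm_feedback("My opinion is school should start later because ..."))
```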
2023b
- (Fakher Ajabshir & Ebadi, 2023) ⇒ Zahra Fakher Ajabshir & Saman Ebadi. (2023). "The effects of automatic writing evaluation and teacher-focused feedback on CALF measures and overall quality of L2 writing across different genres". In: Asian-Pacific Journal of Second and Foreign Language Education.
- QUOTE: While traditionally providing feedback to written texts was done by teachers or peers, with increasing technological advancements and the devising of automated writing evaluation (AWE) tools, this responsibility has been delegated to online editing and proofreading platforms. These platforms serve as learning affordances that scaffold teachers by providing immediate feedback on micro-level writing features like grammar and spelling. Thus, teachers and students can allocate more time and attentional resources to macro-level writing skills such as organization and content (...)
2023c
- (Fleckenstein et al., 2023) ⇒ Johanna Fleckenstein, Lucas W. Liebenow, & Jennifer Meyer. (2023). "Automated feedback and writing: a multi-level meta-analysis of effects on students' performance". In: Frontiers in Artificial Intelligence.
- QUOTE: Adaptive learning opportunities and individualized, timely feedback are considered to be effective support measures for students' writing in educational contexts. However, the extensive time and expertise required to analyze numerous drafts of student writing pose a barrier to teaching. Automated writing evaluation (AWE) tools can be used for individual feedback based on advances in Artificial Intelligence (AI) technology. A number of primary (quasi-)experimental studies have investigated the effect of AWE feedback on students' writing performance.