While half the education world is currently puzzling over the opportunities and threats of ChatGPT, the advance of AI in education has of course been under way for much longer. Teachers and students increasingly use writing assistants such as Grammarly and Pigai. These assistants use, among other things, natural language processing to identify errors in texts and to generate feedback and tips on your writing. A new meta-analysis examined the effectiveness of writing assistants. Its results show that using such writing assistants indeed leads students to write better texts.
The abstract
Automated writing evaluation (AWE) has been frequently used to provide feedback on student writing. Many empirical studies have examined the effectiveness of AWE on writing quality, but the results were inconclusive. Thus, the magnitude of AWE’s overall effect and factors influencing its effectiveness across studies remained unclear. This study re-examined the issue by meta-analyzing the results of 26 primary studies with a total of 2468 participants from 2010 to 2022. The results revealed that AWE had a large positive overall effect on writing quality (g = 0.861, p < 0.001). Further moderator analyses indicated that AWE was more effective for post-secondary students than for secondary students and had more benefits for English as a Foreign Language (EFL) and English as a Second Language (ESL) learners than for Native English Speaker (NES) learners. When the genre of writing was considered, AWE showed a more significant impact on argumentative writing than on academic and mixed writing genres. However, intervention duration, feedback combination, and AWE platform did not moderate the effect of AWE on writing quality. The implications and recommendations for both research and practice are discussed in depth.
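The abstract reports the overall effect as Hedges' g, a standardized mean difference with a small-sample correction. As a rough illustration of how such an effect size is computed for a single study, here is a minimal sketch; the group means, standard deviations, and sample sizes below are hypothetical and not taken from the meta-analysis.

```python
import math

def hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Hedges' g: standardized mean difference with small-sample correction."""
    # Pooled standard deviation across treatment and control groups
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp           # Cohen's d
    j = 1 - 3 / (4 * (n_t + n_c) - 9)    # small-sample correction factor J
    return d * j

# Hypothetical writing scores: AWE group vs. control group
print(round(hedges_g(78.0, 10.0, 50, 70.0, 10.0, 50), 3))  # → 0.794
```

A meta-analysis then pools such per-study effect sizes (weighted by their precision) into one overall estimate, which is where the reported g = 0.861 comes from.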