Examination of an Automated Procedure for Calculating Morphological Complexity

Bibliographic details
Published in: American Journal of Speech-Language Pathology 2023-09, Vol. 32 (5), p. 1-2330
Authors: Wood, Carla; Garcia-Salas, Miguel; Schatschneider, Christopher
Format: Article
Language: English
Online access: Full text
Description
Abstract: The aim of this study was to advance the analysis of written language transcripts by validating an automated scoring procedure that uses an open-access tool for calculating morphological complexity (MC) from written transcripts. The MC of words in 146 written responses of fifth-grade students was assessed using two procedures: (a) hand-coding of words containing derivational morphemes by trained scorers and (b) an automated analysis of MC using Morpholex, a newly developed web-based tool. Correlational analysis between the different MC calculations was examined to consider the relation between hand-coded derivational morpheme counts and the automated measures. Additionally, all MC measures were compared to a previously gathered rating of writing quality to consider predictive validity between the automated Morpholex score and teachers' ratings of writing quality. Automated measures of MC had a strong relation (.63) with hand-coded counts of words with derivational morphemes. Additionally, the number of derivational and inflectional morphemes accounted for a significant amount of the variation in teachers' overall ratings of writing quality. Automated scoring of MC has potential utility as a valid alternative to hand-coding language samples, which may be valuable for progress monitoring of growth in complexity across repeated samples and for measuring components that influence perceived quality of academic writing.
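The validation described above rests on a Pearson correlation between hand-coded morpheme counts and automated MC scores. A minimal sketch of that computation follows; the sample scores and the function name `pearson_r` are hypothetical illustrations, not the study's data or the Morpholex tool's output.

```python
from statistics import mean
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical scores for five writing samples (not the study's data):
hand_coded = [4, 7, 2, 9, 5]   # words with derivational morphemes, hand-coded
automated = [5, 8, 2, 10, 7]   # automated MC score from a Morpholex-style tool
print(round(pearson_r(hand_coded, automated), 2))
```

A coefficient near the study's reported .63 or higher would, as in the article, support using the automated score as a stand-in for hand-coding.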
ISSN: 1058-0360; 1558-9110
DOI: 10.1044/2023_AJSLP-23-00044