Learning from Animation: Smooth Pursuits of Synaptic Transmission of an Impulse with Contextual Cues
Full Description

Saved in:
Bibliographic Details
Published in: World journal on educational technology 2013, Vol. 5(2), p. 238-247
Main Authors: Köseoğlu, Pınar, Mazman, Sacide Güzin, Altun, Arif, Efendioğlu, Akın
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Summary: Multimedia animations can provide better learning experiences by carrying microscopic science concepts to a macroscopic level. The purpose of this study is to explore, through eye-tracking methodology, how learners with and without prior experience study a multimedia learning material presented in a graphics-only animated design versus a verbal contextual cue design. A total of 39 undergraduates from the Biology Education Department participated in the study. A three-minute animation describing how an impulse is transferred between neurons through synapses, presented in two different cue-based designs (verbal contextual or graphical animated), was used to collect data. The animation was set up as an experiment on an eye tracker to record eye movements. A repeated-measures ANOVA was run for data analysis. Results showed a significant within-subjects treatment effect for design type (verbalized cue vs. graphical animation) on eye movements, while the between-subjects effect comparing prior-experience groups was not significant. Based on these findings, it is suggested that further studies could be designed with expert groups as well as with expanded content.
ISSN:1309-0348