Challenging ChatGPT with Different Types of Physics Education Questions

Full Description

Saved in:
Bibliographic Details
Main Authors: López Simó, Víctor, Rezende Junior, Mikael Frank
Format: Article
Language: English
Subjects:
Online Access: Order full text
Description
Summary: Other support: transformative agreements of the UAB. Will my job be replaced by ChatGPT? Can artificial intelligence (AI) engines do homework for students? How do I know whether a submitted assignment was produced by a robot? These and many other questions have occupied the minds of professionals from different areas, including professors and researchers, especially since ChatGPT was launched in November 2022. This Generative Pretrained Transformer works through a chat interface that allows users to hold conversations based on the targeted processing of a large volume of data, but the inner functioning of ChatGPT still acts as a "black box" for most of us. As physics educators, we are particularly interested in understanding the kind of information it can provide to students, how reliable that information can be, and where it may still fall short. This helps us better understand how we can use it. In recent months, different investigations have indicated the need for detailed studies to better understand both the potential and the limitations of AI in physics teaching and learning scenarios. On the one hand, some studies have presented strategies for using ChatGPT in the physics classroom, with easy-to-implement examples of how it can be used to foster critical thinking skills at the secondary school level [1] and to generate flawed examples for students to critique and fix [2]. On the other hand, some investigations have analyzed how well this AI tool performs at solving physics problems. According to [3], ChatGPT would narrowly pass a calculus-based physics course while exhibiting many of the preconceptions and errors of a beginning learner.
In parallel, other work found that ChatGPT-3.5 can match or exceed the median performance of a university student who has completed one semester of college physics, and reported very impressive basic problem-solving capabilities of ChatGPT in interpreting simple physics problems, assuming relevant parameters, and writing correct code. While those previous contributions focus either on identifying good practices for ChatGPT-based physics education or on testing ChatGPT's physics performance against that of real students, our particular interest lies in understanding how different typologies of physics education problems may influence both the correctness and the variability of the answers the tool provides. It is well known in Physics Education Research that the typology of ph