Open Access
I know what you coded last summer
Author(s) -
Lucas Mendonça de Souza,
Igor Moreira Felix,
Bernardo Martins Ferreira,
Anarosa Alves Franco Brandão,
Leônidas de Oliveira Brandão
Publication year - 2021
Publication title -
Anais do XXXII Simpósio Brasileiro de Informática na Educação (SBIE 2021)
Language(s) - English
Resource type - Conference proceedings
DOI - 10.5753/sbie.2021.218673
Subject(s) - computer science , learning analytics , programming language , syntax , code , recursion (computer science) , workload , analytics , data science , artificial intelligence
The outbreak of the COVID-19 pandemic caused a surge in enrollments in online courses. Consequently, this increase in student numbers affected teachers' ability to evaluate exercises and resolve doubts. In this context, tools designed to evaluate code solutions and provide feedback can be used in programming courses to reduce teachers' workload. Nonetheless, even with such tools, the literature shows that learning how to program is a challenging task. Programming is complex, and the programming language employed can also affect students' outcomes. Thus, well-designed exercises can reduce students' difficulties in identifying the problem and help mitigate syntax challenges. This research applies learning analytics processes to interaction logs and code solutions from automatic evaluation tools in order to find metrics capable of identifying problematic exercises and their difficulty. In this context, an exercise is considered problematic if students have trouble interpreting its description or if its solution requires complex programming structures such as loops, conditionals, and recursion. The data come from online introductory programming courses. Results show that the computed metrics can identify problematic exercises, as well as those that students find challenging.
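The abstract does not specify which metrics are computed from the interaction logs. As an illustration only, the sketch below shows how per-exercise indicators such as attempts per student and success rate could be derived from a hypothetical submission log; the column names, thresholds, and metrics here are assumptions, not the ones defined in the paper.

```python
import pandas as pd

# Hypothetical interaction log from an automatic code-evaluation tool.
# Columns: student_id, exercise_id, passed (whether the submission passed).
# These fields and the metrics below are illustrative assumptions.
log = pd.DataFrame([
    {"student_id": 1, "exercise_id": "ex01", "passed": False},
    {"student_id": 1, "exercise_id": "ex01", "passed": False},
    {"student_id": 1, "exercise_id": "ex01", "passed": True},
    {"student_id": 2, "exercise_id": "ex01", "passed": False},
    {"student_id": 2, "exercise_id": "ex02", "passed": True},
    {"student_id": 1, "exercise_id": "ex02", "passed": True},
])

# Per-student, per-exercise summary: number of attempts and whether
# the student eventually solved the exercise.
per_student = log.groupby(["exercise_id", "student_id"]).agg(
    attempts=("passed", "size"),
    solved=("passed", "any"),
)

# Per-exercise indicators:
#   attempts_per_student - mean submissions before the student stops
#   success_rate         - fraction of students who eventually pass
metrics = per_student.groupby("exercise_id").agg(
    attempts_per_student=("attempts", "mean"),
    success_rate=("solved", "mean"),
)

# Exercises with many attempts and a low success rate are flagged as
# potentially problematic, for closer inspection of their descriptions
# or required programming structures. Thresholds are arbitrary here.
flagged = metrics[(metrics["attempts_per_student"] > 1.5) &
                  (metrics["success_rate"] < 0.8)]
print(metrics)
print(flagged)
```

In this toy data, "ex01" would be flagged (two attempts per student on average, only half the students solving it), while "ex02" would not; the actual study derives its indicators from real course logs and code solutions.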