Open Access
Analysis of Students' Peer Reviews to Crowdsourced Programming Assignments
Author(s) -
Nea Pirttinen,
Vilma Kangas,
Henrik Nygren,
Juho Leinonen,
Arto Hellas
Publication year - 2018
Publication title -
Helda (University of Helsinki)
Language(s) - English
Resource type - Conference proceedings
DOI - 10.1145/3279720.3279741
Subject(s) - computer science , set (abstract data type) , code (set theory) , quality (philosophy) , mathematics education , multimedia , programming language , psychology , philosophy , epistemology
We have used a tool called CrowdSorcerer that allows students to create programming assignments. The students are given a topic by a teacher, after which they design a programming assignment: an assignment description, a code template, a model solution, and a set of input-output tests. The created assignments are then peer reviewed by other students on the course. We study students' peer reviews of these student-generated assignments, focusing on the differences between novice and experienced programmers. We analyze whether the exercises created by experienced programmers are rated higher in quality than those created by novices. Additionally, we investigate the differences between novices and experienced programmers as peer reviewers: can novices review assignments as well as experienced programmers can?
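The abstract does not include a concrete assignment, so the following is purely an illustrative sketch of what a student-generated assignment of the kind described (description, code template, model solution, and input-output tests) might look like, together with a minimal grading check; the `sum_list` task and the `grade` helper are hypothetical, not taken from CrowdSorcerer.

```python
# Hypothetical sketch of a CrowdSorcerer-style assignment (not from the paper):
# the student authors a description, a code template for peers to fill in,
# a model solution, and a set of input-output tests.

assignment = {
    "description": "Write a function sum_list that returns the sum of a list of integers.",
    "template": "def sum_list(numbers):\n    # your code here\n    pass",
    "model_solution": "def sum_list(numbers):\n    return sum(numbers)",
    "io_tests": [([1, 2, 3], 6), ([], 0), ([-5, 5], 0)],
}

def grade(solution_src, io_tests):
    """Execute a submitted solution and check it against each input-output test."""
    namespace = {}
    exec(solution_src, namespace)  # define the submitted function
    func = namespace["sum_list"]
    return all(func(inp) == expected for inp, expected in io_tests)

# The model solution should pass all tests; the empty template should not.
print(grade(assignment["model_solution"], assignment["io_tests"]))  # True
print(grade(assignment["template"], assignment["io_tests"]))        # False
```

Input-output tests of this form are what peers would review for coverage and correctness alongside the assignment's description and model solution.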
