Open Access
Best Practices for Using Algorithmic Calculated Questions via a Course Learning Management System
Author(s) -
Gillian Nicholls,
William Schell,
Neal Lewis
Publication year - 2016
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/p.26377
Subject(s) - computer science , learning management , software engineering , engineering , multimedia
Textbook publishers have created online algorithmic problem banks for textbooks used in high-enrollment courses. These banks allow an instructor to assign problems from the book with students submitting their answers via an online interface. The algorithmic problems ensure that each student gets a different combination of problem parameter values. Problems are graded automatically with partial credit and immediate feedback available. Instructors benefit by not needing to grade the problems. Students benefit by potentially having multiple attempts to solve each problem with feedback in between attempts. However, these online resources are only available for large-enrollment courses where it is financially feasible for publishers to create them, and there is normally an extra cost to the students for access. For instructors teaching courses without such publisher resources, or for those wanting additional assignments outside the publisher systems, many commonly used Learning Management Systems (LMS) have similar functionality. A number of the LMS packages in current use, such as Blackboard™, Moodle™, Brightspace™, and Canvas™, have the capability for creating calculated quiz questions with algorithmic features. This allows an instructor to design an online question with a range of values for one or more parameters in a problem such that each attempt will have a different correct answer. This paper presents best practices for designing and using algorithmic calculated questions for quizzes and/or homework. The paper discusses ways to build the questions in advance, test possible answer combinations, and design likely wrong answers for partial credit and feedback. The pros and cons of using these calculated questions are reviewed. Examples and actual experiences from using the questions demonstrate that this is a beneficial way for instructors to enrich the learning experience while streamlining grading.
This is an efficient way for a new engineering educator to gradually build a set of automated problems that can be modified to create new problems with minimal additional effort.

Background

Online homework problems have been offered by large-adoption textbook publishers for some time. Typically, these systems allow instructors to assign problems from a textbook or an online bank of problems, and students pay an extra fee for access. They log in to the textbook publisher’s website system and submit their homework answers by selecting from multiple choices for each question or by entering the final numerical calculation result. The problems are often algorithmically randomized so that each attempt by a student has slightly different parameter values. Grading is automatic, partial credit may be available, and students get immediate feedback. These systems discourage students from thoughtlessly copying answers from other students. The automated nature of the systems relieves instructors from the tedious labor involved in receiving paper homework, grading, logging scores, and sharing feedback with students. Prior research has had mixed results, but generally shows online homework supports the educational process. While Taraban, Anderson, Hayes, & Sharma [1] found little association between online homework and exam scores, research by Lass, Morzuch, & Rogers [2] concluded that online homework was associated with improved exam performance. Capaldi & Berg [3] developed and studied use of an online learning system for students including online homework problems. The analysis showed that students using the online system achieved significantly greater learning as demonstrated on exams. Knight, Nicholls, & Componation [4] discussed the efficiency of utilizing online homework, observing that assessments created in one class section could be readily imported for use in other sections.
The automated grading and score recording greatly reduced the time demands on instructors and supported increased class sizes. They concluded exam performance could be predicted using the results of online homework. Davis & McDonald [5] reported that students performed significantly better when a combination of online and handwritten homework was used compared to just handwritten homework. However, they observed that some students became frustrated when they could not identify minor inaccuracies in their work within the online system. The availability of online homework systems varies. Stowell [6] observed that commercial providers of online homework are generally limited in upper-level engineering coursework and typically are only available for the large enrollment classes of statics and dynamics. An experiment in creating online homework for a chemical engineering class found it was well received by both students and faculty. However, the profits from its adoption were modest, and the author concluded it was not presently financially feasible to provide commercial online homework systems for smaller classes. Pandian et al. [7] developed a web-based authoring tool called “CAPE” to assist instructors in creating online homework with diagnostics and corresponding feedback for students. They reported that instructors found some difficulties in using the tool, but that it was quite powerful in supporting intelligent tutoring. Carter & Yuhnke [8] utilized online homework constructed within the Blackboard [9] learning management system. They constructed homework assessments in two parts. The first part used multiple choice and matching questions, and students were given two attempts at these questions. The second part used algorithmic computational problems where parameter values changed with each attempt. Students were allowed unlimited attempts at questions from the second part.
Student feedback indicated the immediate grading was positive, but the limitations on partial credit were a disadvantage. The main issue seemed to be problems the instructors had in selecting an appropriate tolerance to allow for rounding errors. Overall, online homework offers a means of engaging students in the material to foster greater learning; provides immediate feedback to students; and greatly reduces the time required for an instructor to administer the assignments. While online homework provided by a textbook publisher is simpler for instructors to use, upper-level engineering courses and courses with smaller enrollments are unlikely to have online textbook publisher homework available. The LMS algorithmic calculated questions present a means for instructors to construct their own online homework assignments without needing to do computer programming or resort to an assisted authoring tool. Commercially available learning management systems such as Blackboard, Moodle [10], Brightspace [11], and Canvas [12] provide the ability for an instructor to construct algorithmic calculated questions and assemble them into online homework assignments or quizzes. This paper shows how algorithmic calculated questions can be designed and utilized in order to aid the educational learning process. A key motivator is to share best practices so that educators, particularly those new to academia, can more easily adopt this tool. Time is a precious resource in academia for both educators and students. Online algorithmic calculated questions can be a positive course element if implemented properly.

Calculated Questions with Algorithmic Parameters

Quantitative analysis questions can be created with randomized algorithmic parameter values such that a different combination of data is seen each time the question is attempted. Depending upon the LMS, the maximum number of random combinations of the parameters may be capped.
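To make the randomization concrete, the sketch below shows one plausible way a pool of parameter combinations could be generated and capped. This is an illustrative model, not the implementation of any particular LMS; the function name, the `(low, high, decimals)` range format, and the default cap of 100 variants are all assumptions for the example.

```python
import random

def parameter_pool(ranges, cap=100, seed=0):
    """Draw a capped pool of random parameter combinations for a
    calculated question.

    ranges: {name: (low, high, decimals)} giving each parameter's
            allowed interval and rounding precision.
    cap:    maximum number of stored variants, mimicking the limit
            some LMS packages place on random combinations.
    seed:   fixed seed so the same pool is reproduced on re-import.
    """
    rng = random.Random(seed)
    pool = []
    for _ in range(cap):
        variant = {name: round(rng.uniform(low, high), decimals)
                   for name, (low, high, decimals) in ranges.items()}
        pool.append(variant)
    return pool

# Example: an engineering-economics question with a randomized
# principal and interest rate; each student attempt draws one variant.
pool = parameter_pool({"principal": (1000, 5000, 0),
                       "rate": (0.02, 0.08, 3)}, cap=50, seed=1)
```

Fixing the seed means the instructor can regenerate and inspect the exact set of variants students will see, which helps when testing answer tolerances in advance.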
There are at least two types of calculated algorithmic questions available within LMS packages. The most commonly available type requires the student to enter the numerical result of the calculations. These questions require the student to analyze the information, calculate a final answer, and type it into the response box. The LMS compares the final answer to the correct result for that set of data. If the student’s answer is within a tolerance factor set by the instructor (to allow for rounding errors), the question is automatically graded as correct. One of the larger commercial LMS packages, Moodle, also allows the student’s answer to be evaluated against one or more alternate formulas constructed to calculate the answer a student would get if a typical conceptual error was made. This expands the ability to award partial credit beyond just rounding issues. If the answer is within the instructor-defined tolerance factor for another formula programmed to receive partial credit, the question is automatically graded at the pre-set level of partial credit. The second type of calculated question is similar to a multiple choice question. Calculated multiple-choice questions present the student with a random combination of data and a set of multiple choice answers calculated by formulas. If the student selects the answer calculated by the correct formula and/or procedure, the question is automatically graded as receiving full credit. If the student selects an answer calculated with a formula/procedure reflecting a typical conceptual error, the question is automatically graded at the instructor’s choice of partial or no credit. At present, it appears this type of algorithmic, multiple-choice calculated question is available only to users of the Moodle LMS.
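The grading logic described above can be sketched as follows. This is a simplified model of the behavior, assuming a relative tolerance and a list of distractor formulas paired with partial-credit weights; the function name and signature are invented for illustration and do not correspond to any LMS API.

```python
def grade_answer(student_answer, params, correct_formula,
                 tolerance=0.001, distractors=()):
    """Grade a numeric answer against the correct formula, then against
    distractor formulas modeling common conceptual errors.

    tolerance:   relative tolerance (0.1% by default), standing in for
                 the instructor-set rounding allowance.
    distractors: iterable of (formula, partial_credit) pairs.
    """
    def within(expected):
        if expected == 0:
            return abs(student_answer) <= tolerance
        return abs(student_answer - expected) <= tolerance * abs(expected)

    if within(correct_formula(**params)):
        return 1.0                      # full credit
    for formula, credit in distractors:
        if within(formula(**params)):
            return credit               # partial credit for a known error
    return 0.0

# Example: compound interest, with the simple-interest formula as a
# distractor that earns half credit.
correct = lambda p, r, n: p * (1 + r) ** n        # compound interest
simple = lambda p, r, n: p * (1 + r * n)          # common conceptual error
params = {"p": 1000, "r": 0.05, "n": 3}
grade_answer(1157.63, params, correct, distractors=[(simple, 0.5)])  # full credit
grade_answer(1150.00, params, correct, distractors=[(simple, 0.5)])  # half credit
```

Note how the tolerance must be tight enough that the distractor answer (1150.00) does not also fall within the full-credit band around the correct answer (1157.63) — the tolerance-selection problem the instructors above reported.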
One of the most powerful features of these questions is that, in some systems, they can be designed to be synchronized with other questions so a series of separate, related problems can be solved using the same randomized set of parameter values. For example, a student could be given a set of random parameters representing the height, width, and length of a box. One question could ask the student to calculate the area footprint of the box while a second could ask about the volume of the box. This allows the student to demonstrate mastery …
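The box example can be sketched as a shared dataset that drives both questions. This is an illustrative model of the synchronization idea, assuming a fixed seed stands in for the LMS mechanism that ties one random dataset to several questions; the names and ranges are hypothetical.

```python
import random

def box_dataset(seed):
    """One shared random dataset (height, width, length of a box)
    used by every question in the synchronized series."""
    rng = random.Random(seed)
    return {"height": rng.randint(2, 10),
            "width": rng.randint(2, 10),
            "length": rng.randint(2, 10)}

def footprint(d):
    """Question 1: area footprint of the box."""
    return d["width"] * d["length"]

def volume(d):
    """Question 2: volume, reusing the same parameter values."""
    return footprint(d) * d["height"]

# The same seed yields the same dataset for both questions, so a
# student's answers to the two problems stay mutually consistent.
data = box_dataset(seed=42)
```

Because both answers derive from one dataset, a student who computes the footprint correctly can carry that intermediate result forward into the volume question, just as they would on paper.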
