Mixing Exam Formats to Enhance Examination Learning and Test-Taking Skills
Author(s) -
Maher Murad,
Robert Martinazzi
Publication year - 2003
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--12049
Subject(s) - grading (engineering) , mathematics education , multiple choice , entrance exam , pedagogy , curriculum , statistics , civil engineering , engineering
The concept of Mixing Exam Formats (MEF) was developed to extend student learning beyond the exam and to familiarize students with the multiple-choice format of the Fundamentals of Engineering (FE) and the Civil Engineering Professional (PE) exams. Under this concept, each exam receives two scores. The first, called the “Objective Score,” is based only on grading the multiple-choice answers. The second, the “Traditional Score,” is based on traditional grading of the detailed solution. The instructor returns the exams with the “Objective” portion graded. For each incorrect “Objective” answer, students are required to thoroughly analyze their own work to determine where they made specific errors and why the correct answer was not reached. Students document their findings in a report. The instructor grades the reports and assigns a final grade that combines the two scores. The MEF concept helps students understand the material covered in the exam while also improving their test-taking skills, especially choosing the most correct answer, and allows students to identify and eliminate their mistakes. This paper covers the development and implementation of the concept and student responses to using MEF as a method to extend learning beyond the examination and as a tool to train students to be more effective when taking the FE and PE exams.

Introduction

Professors use a variety of examination formats to evaluate student learning. Traditional exams usually require detailed solutions to problems. The Fundamentals of Engineering (FE) exam and, more recently, the Civil Engineering Principles and Practice Exam (PE) use only a multiple-choice format and are considered “Objective Exams.” It is essential for practicing civil engineers to take and pass the PE exam to become professional engineers.
No design can be accepted or implemented without being stamped by a professional engineer. The National Council of Examiners for Engineering and Surveying (NCEES) develops the PE examinations that engineers take for licensure as professional engineers. (1)

In the academic environment, the choice of exam format and the method of grading greatly affect how well an exam evaluates student learning. Also, depending on whether the exam is written as “Objective” or “Traditional,” it can become a valuable tool for extending learning beyond the examination. Exams are usually the basis for evaluating how well students learned course material.

Page 858.1. Proceedings of the 2003 American Society for Engineering Education Annual Conference & Exposition. Copyright 2003, American Society for Engineering Education.

The value of exams as a learning tool has always been questioned. Some courses use papers or projects as the basis for evaluation instead. These methods have the advantage of directing students' attention to their writing, but the disadvantage of giving the instructor no opportunity to evaluate how well the students mastered the basic ideas and skills being taught. (2) In lecture-based courses it is even more difficult to replace exams with other means of evaluation, especially when the course is problem solving in nature. Exams are therefore likely to continue to be used for evaluation, but the challenge remains how to make them more effective as a learning tool. Using the MEF concept, students are given the opportunity to revisit their graded exams and to review, analyze, and learn from their errors. The concept also provides training in reaching the most correct answer through a structured approach in which common mistakes are avoided. Instructors' written comments on exams are powerful communications that affect subsequent motivation and maximize student learning from exams.
(3) Providing feedback to students is useful but usually marks the end of the learning process from exams. Many Engineering Technology (ET) students at the University of Pittsburgh at Johnstown (UPJ) have been introduced to the useful “After Action Report (AAR)” concept, which was developed to make the instructor's general comments on the exam extend the learning process for the students. (4) The AAR concept also gives students an opportunity to provide feedback while analyzing their errors. The MEF concept presented in this paper is considered an extension of the AAR concept, because it not only requires students to submit an after action report but also uses that exercise to train students to reach the most correct answer through mixed exam formats. ET students were also introduced to the concept of “Syntax Error Analysis,” which involves giving students a problem along with an erroneous solution. Students are asked to analyze the problem, determine where the errors occur in the analysis, and make corrections. (5)

Concept Development and Implementation

The MEF concept is introduced to students on the first day of classes and is discussed along with the course syllabus. The intention of using the MEF concept as a learning tool and as a way to improve students' test-taking skills is also discussed. Under the MEF concept, the exam is prepared such that students are instructed to show all work, including the basic formula, a step-by-step solution, sketches, units, and a correct final answer. At the same time, students are instructed to select the most correct answer from the four options given for each problem, even if they have to guess. The exam is graded such that each student gets two scores: an “Objective Score” based only on grading the multiple-choice answers, without consideration of the detailed solution provided by the student.
The second score, the “Traditional Score,” is based on traditional grading of the detailed solution without consideration of the multiple-choice options. The instructor returns the exams with each “Objective” answer marked as either correct or incorrect. For each incorrect “Objective” answer, students are required to thoroughly analyze their own work to determine where they made specific errors and why the correct answer was not reached. Students report their findings in a report submitted to the professor. The instructor grades the reports and assigns a final grade that combines the two scores. A typical grading sheet is shown in Appendix A. The MEF concept helps students understand the material covered in the exam while also improving their test-taking skills, especially choosing the most correct answer, and allows them to identify and eliminate the mistakes that prevented them from getting the correct final answer.

The Mixing Exam Formats (MEF) concept has been introduced in the Highway Surveying and Design class. Highway Design is a junior-level course taken only by Civil Engineering Technology (CET) students and is preceded by two sophomore-level courses in surveying. The Highway course is design and problem solving in nature. It develops students' ability to use mathematical formulas, specifications and guidelines from design agencies, assumptions, and finally common sense to recommend solutions for a given highway problem. The Highway Design class contained twenty-nine (29) students. Traditionally, exams in highway design include problems that require making sound engineering assumptions and may lead to different solutions or alternative designs.
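The dual-score grading described above can be sketched in Python. The all-or-nothing per-problem weighting follows the paper (100/n points per problem), but the paper does not publish an exact formula for the “Final Score” adjustment, so the guess bonus and report credit below are illustrative assumptions, not the authors' actual weights.

```python
def objective_score(chosen, key):
    """All-or-nothing multiple-choice score: each of n problems is
    worth 100/n points (e.g. 12.5 on an 8-problem exam)."""
    per_problem = 100.0 / len(key)
    return sum(per_problem for c, k in zip(chosen, key) if c == k)

def final_score(traditional, chosen, key, solved_correctly, report_credit=0.0):
    """Traditional score adjusted upward for problems where the student
    circled the right option without a correct worked solution (a
    'correct guess'), plus after-action-report credit. The 2-point
    guess bonus is an assumed value for illustration."""
    guess_bonus = sum(
        2.0
        for c, k, solved in zip(chosen, key, solved_correctly)
        if c == k and not solved
    )
    return min(100.0, traditional + guess_bonus + report_credit)

# Example: 8-problem exam, 6 options circled correctly
key = list("ABCDABCD")
chosen = list("ABCDABAA")  # last two answers wrong
print(objective_score(chosen, key))  # 6 * 12.5 = 75.0
```

A student with a Traditional Score of 80 who guessed one answer correctly and earned 5 points of report credit would, under these assumed weights, receive a Final Score of 87.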
Analysis of Results

Two exams with mixed formats, as described earlier, were given to students, and the results were analyzed. Twenty-nine (29) students took both exams. Table 1 summarizes the descriptive statistics of the exam results.

Table 1: Descriptive Statistics for Exams with Mixing Formats

                         Exam 1                          Exam 2
Item          Objective  Traditional  Final   Objective  Traditional  Final
Average          76.2       83.0       86.6      60.7       67.5       79.9
Minimum          37.5       53.0       56.0      40.0       52.0       58.0
Maximum         100.0       96.5       99.0      90.0       83.0       91.0
Variance        330.6       99.0      104.8     235.2       80.8       56.3

The following comments can be made on the results in Table 1:

• The “Objective Score” is based only on grading the multiple choices, without consideration of the detailed solution provided by the student. Exam 1 consisted of eight (8) problems, so the “Objective Score” for each problem is either 12.5 or 0. Exam 2 consisted of ten (10) problems, so the “Objective Score” for each problem is either 10 or 0.

• The “Traditional Score” is based on traditional grading of the detailed solution, without consideration of which multiple-choice option the student circles. Points are assigned to components of a detailed solution such as the basic formula, proper substitution, logical steps, sketches, units, and a correct final answer.

• The “Final Score” is the “Traditional Score” adjusted for guessing the right answer and for the after action report submitted by the student. A student who could not reach the right final answer through the detailed solution but managed to circle the correct answer through guessing or engineering sense gets extra points for that correct guess. Students also get extra credit for revisiting their exams, analyzing their errors, and providing the detailed correct solutions.
The extra credit is intended to encourage students to take the process seriously.

• The average “Objective Score” was lower than the average “Traditional Score” for both exams. This has also been the case for the majority of individual scores, as indicated in Table 2. In general, studen
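The descriptive statistics reported in Table 1 (average, minimum, maximum, variance) can be reproduced for any score list with Python's standard library. The scores below are hypothetical values for illustration only; the paper does not publish the raw class data, and whether Table 1 uses sample or population variance is not stated.

```python
import statistics

# Hypothetical "Objective" scores (multiples of 12.5, as on an
# 8-problem exam); illustration only, not the class data.
scores = [37.5, 62.5, 75.0, 87.5, 100.0]

print("Average :", statistics.mean(scores))      # 72.5
print("Minimum :", min(scores))                  # 37.5
print("Maximum :", max(scores))                  # 100.0
print("Variance:", statistics.variance(scores))  # sample variance, 578.125
```

Note that `statistics.variance` computes the sample variance (divisor n-1); `statistics.pvariance` would give the population variance (divisor n).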