Student Understanding of Program Outcomes through Formative and Summative Course-Level Assessment
Author(s) - Karim Nasr, Raghu Echempati, Arnaldo Mazzei
Publication year - 2004
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--12786
Subject(s) - summative assessment , formative assessment , mathematics education
In this paper, an approach is suggested to begin a process in which each student, while solving a homework problem, a test, or a project, is asked to provide additional information concerning what concept(s) are targeted in each problem and to what extent, if any, the Program Outcomes (POs) were encountered. The courses used here as examples for this approach are Mechanics III (particle and rigid body kinematics and dynamics) and Design of Mechanical Components I. Students seem in tune with the targeted concepts via course experiences, but rather inconsistent in their interpretation of the Program Outcomes. For many students, this is the first time they are asked to examine the outcomes critically, yet they all seem to understand and appreciate the merit of the process, particularly because of the quick feedback they receive on the results. Some students were further challenged to "redesign" some of the homework problems so that the previously "weaker" Program Outcomes would be better addressed in the redesigned problems. The results of the redesign exercise are interesting in that students found it both difficult and challenging to create a new set of homework problems. This points to the need for the instructor to provide effective ways of posing homework problems, which may differ from the conventional exercise problems presented in currently available textbooks. Presented here is a course-level formative and summative assessment of students' understanding of the Program Outcomes, including a comparison with the instructor's target expectation for the achievement of those outcomes. The paper concludes with ways to gather better data illustrating students' interpretation of Program Outcomes and, perhaps, to redesign course content and instructional methods to better meet the desired outcomes.

Introduction

Recently, the accreditation process of engineering programs has taken a new form, becoming an outcome-based process wherein individual courses and experiences must contribute to the big picture of engineering education and students' achievement of specific abilities and skills. This process has caused the majority of engineering programs around the nation to reflect on their educational focus, examine teaching and learning styles, experiment with new and innovative approaches to assess students' learning, and, above all, put in place an improvement process [1]. Kettering University, like all accredited engineering schools, has adapted and responded to ABET EC 2000 [2,3]. A formal curriculum reform process occurred over 1999-2001 and produced a curriculum that embodied EC 2000 criteria. Trial assessment practices began in Fall 2000, for both core courses and capstone design courses, and a formal multi-tier, multi-method assessment process began in July 2001. In relation to ABET EC 2000's Criterion 3, Program Outcomes and Assessment, assessment and demonstration of outcomes achievement are not only part of the improvement process but also expected of any program desiring accreditation. In light of the above, many engineering courses and curricula have been influenced by EC 2000 criteria, and instructors were urged to make a special effort to address such guidelines.
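The course-level comparison at the heart of this paper, students' perceived contribution of a course experience to each Program Outcome set against the instructor's target expectation, can be sketched in a few lines of Python. The snippet below is a minimal, hypothetical illustration: the 0-3 rating scale, the PO labels, the ratings, and the targets are all invented for demonstration and are not taken from the paper's survey instrument.

    # Hypothetical sketch: compare students' mean perceived contribution of a
    # course experience to each Program Outcome (PO) against the instructor's
    # target. All labels, ratings, and targets are invented for illustration.
    from statistics import mean

    # One list of ratings per PO, one entry per responding student (0-3 scale).
    student_ratings = {
        "PO-a": [3, 2, 3, 2, 3],   # apply knowledge of math, science, engineering
        "PO-c": [1, 2, 0, 1, 2],   # design a system, component, or process
        "PO-e": [3, 3, 2, 3, 3],   # identify, formulate, solve engineering problems
        "PO-g": [0, 1, 0, 2, 1],   # communicate effectively
    }

    # Instructor's target expectation for the same experience (same scale).
    instructor_target = {"PO-a": 3, "PO-c": 2, "PO-e": 3, "PO-g": 1}

    for po, ratings in student_ratings.items():
        avg = mean(ratings)
        gap = avg - instructor_target[po]
        status = "below target" if gap < -0.5 else "on/above target"
        print(f"{po}: student mean {avg:.2f}, target {instructor_target[po]}, "
              f"gap {gap:+.2f} ({status})")

In practice the hard-coded lists would be replaced by ratings read off the returned surveys; the point is only that the formative comparison reduces to a per-outcome mean and a gap against the instructor's expectation.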
As a result, EC 2000 has had a profound impact on the structure and content of an engineering course. Instructors, in addition to focusing on a design and an end product, must revisit how the course contributes to students' achievement of EC 2000 outcomes. At Kettering University, a course-level correlation of course learning objectives to EC 2000 outcomes was performed for each course. A basic course in Machine Design, one of the subjects in the context of this paper, tends to be perceived as a first "design" course by many students, although some "design experiences" may be given in courses like Mechanics, Thermodynamics, Fluid Mechanics, and Heat Transfer. However, the open-ended nature of a Machine Design course seems to make it difficult for a typical student to accept and appreciate. One reason for this may be the student's perception that a "unique solution" should exist to an otherwise seemingly well-posed question, as among the standard exercise problems. Therefore, the "success" of faculty teaching design courses perhaps depends on how well this philosophy is communicated to the students. Also, design courses are taught in different ways at different schools. Many schools in the U.S. and in Europe initially teach the design process for conceptually designing a system, rather than teaching the more traditional analysis and design of machine components. More advanced computational techniques are used to parametrically analyze and optimally design a component or a system. At Kettering University, one course in Dynamics of Particles and Rigid Bodies (MECH 310) and one course in Machine Design (MECH 312) are required for all ME majors. A second Machine Design course (MECH 412) and/or another course on Integrated Machine and Mechanism Design (MECH 510) are offered as sequential senior electives for those with Machine Design as their area of focus or concentration.

A number of tools can be used to document students' achievement of Program Outcomes: actual student work, external and internal surveys, exit interviews, pre-test and post-test examinations, and so on. Some surveys attempt to match students' perception of outcomes achievement to the instructor's expectation. It is worthwhile, then, to examine whether students have the same understanding of the Program Outcomes and whether course experiences contribute to outcomes achievement. This paper explores the possibility that such surveys gather questionable data, since the understanding and interpretation of the various attributes within the Program Outcomes vary among students (a concern sketched in code below). Additionally, a somewhat different but more critical issue exists with the way exercise problems at the end of a traditional textbook are posed, or, for that matter, how the problems on a test are designed under the current system, which may not address many of the skills that the Program Outcomes require.

Approach & Motivation

There are a number of references in the literature that focus on assessment methodologies, presenting techniques such as surveys, portfolios, entrance and exit interviews, teaching goals inventories (TGIs), and many others [4-7]. In this paper, an attempt is made to analyze the assessment surveys returned by students for homework problems that they solved in the MECH 310 course (taught in Summer 2003) and the MECH 312 course (taught in Fall 2002 and Summer 2003).
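Before turning to how these surveys were administered term by term, here is a minimal hypothetical sketch of the "questionable data" concern noted above: if students interpret an outcome differently, their ratings for the same experience scatter widely, and a simple spread check can flag such outcomes. The scale, the threshold, and the data below are illustrative assumptions, not values reported in this paper.

    # Hypothetical sketch: flag Program Outcomes whose ratings scatter widely
    # across students, which may signal inconsistent interpretation of the
    # outcome rather than a real difference in course coverage.
    from statistics import mean, stdev

    # Invented ratings (0-3 scale), one entry per responding student.
    survey = {
        "PO-a": [3, 3, 2, 3, 3, 2],
        "PO-c": [0, 3, 1, 2, 0, 3],   # wide spread: students disagree
        "PO-k": [2, 2, 3, 2, 2, 2],
    }

    SPREAD_THRESHOLD = 1.0  # assumed cutoff on the 0-3 scale

    for po, ratings in survey.items():
        spread = stdev(ratings)
        note = "interpretation inconsistent?" if spread > SPREAD_THRESHOLD else "consistent"
        print(f"{po}: mean {mean(ratings):.2f}, stdev {spread:.2f} -> {note}")

Ratings that cluster suggest a shared reading of the outcome; ratings that scatter suggest the survey is measuring interpretation as much as coverage, which is precisely what makes the aggregated data questionable.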
This formative (during-the-term) assessment survey was optional, but extra credit was given to those who participated. Different students participated in the surveys in the different terms. The homework problems were typically assigned from the textbooks. During the Fall 2002 term, nine homework problems were assigned to and assessed by the MECH 312 Machine Design students. Some of these students were challenged to rewrite a few homework problems of their choice so that the otherwise "weaker" outcomes (low contribution, in their view) would become "stronger" (average or higher contribution, in their view). Only five students (12.5%) participated in this rewriting project, since the activity is very time consuming. Three of these five students reported taking 5 to 6 hours to design and solve a single problem. Their solutions included comments on what the original problem lacked in addressing certain outcomes and suggestions on how to modify the problem statement to make those outcomes stronger, in their view. The other two students merely reworded the problems to include phrases such as "this bolt is to be used by Boeing" or "this spring is to be used in a toy"; however, their solutions to such problems did not involve any discussion or the application of an iterative process. This leads to the belief that instructors must prepare problems based on what is perceived to satisfy the course learning objectives to a larger extent. Based on the lessons learned from the Fall 2002 survey, a different batch of MECH 312 students (Summer 2003) was asked to return assessment surveys for each test and for the final project; in this paper, however, only the results of the assessment survey of the project are presented.

There are other instructional methods that may serve outcomes satisfaction better than traditional approaches. For example, Problem-Based Learning [8] is an instructional approach that promotes critical thinking by presenting a real-life problem of relevance that needs to be solved. The motivation for solving the problem becomes an automatic part of the solution: students play the roles of authentic investigators, and instructors act as facilitators. Since solving a practical problem is the objective, uncovering fundamental principles and concepts is a natural consequence of the solution approach. Students are not left wondering whether what they are studying has any use, but rather are challenged by the excitement of solving real-life problems; in engineering, this feeling is a great motivational tool. Beyond motivation alone, a problem-based approach may lead to student independence, along with promoting creativity and critical thinking. Regardless of the instructional approach or th...