Open Access
The Use Of Peer Evaluations In A Nontraditional First-Year System Design Class
Author(s) -
Joseph J. Pow,
María Helguera,
Elizabeth Pieri,
S. Wolters,
Michael Augspurger,
Briaeuberger,
Victoria Scholl,
Elizabeth Bondi
Publication year - 2020
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--23185
Subject(s) - capstone , class (philosophy) , transformational leadership , peer feedback , test (biology) , first class , computer science , mathematics education , peer assessment , peer mentoring , psychology , medical education , pedagogy , medicine , artificial intelligence , social psychology , paleontology , algorithm , data mining , biology
In the fall of 2010 the Chester F. Carlson Center for Imaging Science, an imaging systems engineering department at the Rochester Institute of Technology, completely abandoned its traditional lecture-based pedagogy for incoming freshmen and in its place implemented a radically different project-based class for first-year students. Similar to many existing senior-level capstone experiences, this new approach challenged first-year students to work together as a single integrated multidisciplinary team for a full academic year to design, develop, build, and test a unique, fully functional imaging system from scratch. Now in its fourth year, all indications are that this pedagogy has been transformational, not only for the freshmen who have taken the class but for the department as a whole. It has changed long-held perceptions about the abilities of first-year college students and has led to a new understanding of the role of faculty in technical undergraduate degree programs. One of the central ways in which this pedagogy differs from a traditional approach is in its desired student outcomes. Whereas the outcomes of the old pedagogy were primarily knowledge-oriented, in the new class the outcomes are focused on the degree to which students begin to adopt the behaviors and practices of professional engineers. Consequently, conventional assessment tools that measure only knowledge, such as quizzes, tests, and final exams, are of limited value. Instead, instructors in the new class must rely on other techniques to assess student growth and development. One of these is the use of formal peer evaluations. Although these peer evaluations were treated as mandatory assignments due at the end of each academic term, their scope and format were determined by the students themselves. The evaluations were submitted to the instructor, who sanitized them to preserve the anonymity of the evaluators and then compiled and distributed them to each recipient.
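The sanitize-compile-distribute workflow described above was carried out by the instructor; the paper does not describe any software support for it. Purely as an illustration of the idea, the following minimal sketch groups feedback by recipient while stripping evaluator identities (the record fields `evaluator`, `recipient`, and `comments` are hypothetical, not taken from the paper):

```python
from collections import defaultdict


def compile_evaluations(evaluations):
    """Compile sanitized peer evaluations, one bundle per recipient.

    `evaluations` is a list of dicts with hypothetical keys
    'evaluator', 'recipient', and 'comments'. Only the comment text
    is retained, so the compiled bundles never reveal who wrote what.
    """
    compiled = defaultdict(list)
    for ev in evaluations:
        # Sanitize: keep the feedback text, drop the evaluator's name.
        compiled[ev["recipient"]].append(ev["comments"])
    return dict(compiled)
```

Each recipient would then receive only their own anonymized bundle, mirroring the distribution step the instructor performed by hand.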
In this way every student received feedback three times over the course of the year-long project on how at least a portion of their classmates perceived their performance. In this paper we examine the peer evaluations submitted by the first three cohorts to experience the new pedagogy in an effort to gain some insight into their use in a non-traditional classroom. Our analysis focuses on three primary aspects of the peer evaluation system. First, we look at how the scope and format of the evaluations evolved over time as a way of understanding which characteristics the students felt were most essential in their fellow team members. Next, we examine the nature and quality of the feedback as a way to assess the perceived value of the peer evaluations. Lastly, we draw from the records of the course instructor to see the degree to which the peer evaluations aligned with the perceptions of the instructors. Together these analyses can inform the use of peer evaluations as an assessment tool in engineering classes at other institutions.

Introduction

Freshman Imaging Project Overview

For the past several years, research on STEM education has consistently revealed the benefits of non-traditional project-based experiences for students at all levels. As a leading sponsor of scholarly work in this area, the Association of American Colleges and Universities’ Project Kaleidoscope (PKAL) has been a vocal advocate for widespread STEM education reform. The themes emerging from PKAL research regarding undergraduate STEM education are clear and consistent:

- Learning should be experiential and steeped in investigation from the very first courses.1
- Learning should be personally meaningful for students and faculty; it should make connections to other fields of inquiry; and it should suggest practical applications related to the experience of students.2
- Learning should take place in a community where faculty see students as partners in learning, where students collaborate with one another and gain confidence that they can succeed, and where institutions support such communities of learners.3
- Higher education should produce new frames of understanding by piloting new ideas, tools, and approaches to keep students’ learning on the cutting edge.4

In 2010 the Chester F. Carlson Center for Imaging Science, an imaging systems engineering department at the Rochester Institute of Technology, developed and implemented a new freshman-level course, known as the Freshman Imaging Project, which embodies this pedagogical framework. While the architects of this new pedagogy wanted it to reflect the most recent research on STEM education, it was also built upon other fundamental beliefs: for example, the belief that first-year students are capable of understanding advanced concepts, and that their motivation is enhanced by giving them more independence and more control over their educational experience. The team which developed this experience felt strongly that, if successful, this pedagogy would be transformational and would challenge not only widely held perceptions of students’ abilities but also the role of the faculty in undergraduate STEM education. The new curriculum is a year-long (three academic quarters) sequence of courses in which the students work together as a single integrated multidisciplinary team to design and build a different functional imaging device from scratch each year. The general type of device is specified by the department faculty, but the students are responsible for establishing technical performance parameters by assessing the needs of prospective users of their system. Once those performance parameters are established, the students are responsible for creating their own work breakdown structure, as well as planning and executing the entire design and development effort.
The only major milestones the students are required to meet are two formal design reviews for external evaluators at the end of the fall and winter quarters, and a public demonstration of the finished product at an annual campus-wide innovation festival at the end of the academic year. An instructor of record is assigned responsibility for the course, but there are no required textbooks or formal lectures. The students jointly construct a common understanding of new concepts by researching any topics they need to investigate in the published literature and then sharing their interpretations with their classmates. As necessary, they seek assistance from subject-area experts in the faculty, from upper-class students, and from outside sources. Scheduled class meetings (two per week, two hours each) take place in a dedicated 800-square-foot laboratory configured specifically for this purpose, which is available on a 24-hour basis to freshmen enrolled in the course. No other classes are scheduled in this room. The sequence of courses that make up this year-long experience is required for imaging systems engineering majors, but in an effort to maximize the authenticity of the experience, freshmen from other degree programs are encouraged to enroll in any or all of the three courses. Students continuing in the course from previous quarters are responsible for orienting and integrating any new students into the design team. Although interaction with upper-class students is strongly encouraged, formal enrollment in the course is restricted to first-year students. Since one of the primary outcomes of this pedagogy is to have the students adopt the behaviors of professional scientists and engineers, particular attention is given to providing opportunities for the students to share their experiences with a variety of audiences in both written and oral formats.
For example, while doing their initial research the students compile a collection of written précis which help them construct a common understanding of key technical concepts. Their written products also include requests for purchases of equipment, user’s manuals for their systems, and responses to action items raised at the formal design reviews. Those design reviews are the primary oral presentations each quarter, but presentations are also given to a variety of undergraduate student groups, such as the Society of Imaging Science and Technology and the Society of Mechanical Engineers, and at the university’s College of Science Undergraduate Research Symposium. Because this pedagogy represents such a radical departure from any the department has previously used, it is important to evaluate its effectiveness. To do this, the department initially planned to enlist the aid of external evaluators to conduct a formal assessment. However, anticipated funding to support this effort did not materialize, so a rigorous evaluation has not yet been performed. The ability to draw any clear conclusions regarding its effectiveness is also hampered by the small sample size: to date, only four cohorts – a total of 84 students, including those who are currently enrolled – have taken this class. And since the students from the first cohort have yet to graduate, the full impact of this pedagogy on their academic careers is just now being assessed. More data must be collected and a more comprehensive analysis must be done before any definitive conclusions can be drawn about the effectiveness of this approach. In spite of this, the department is attempting to identify any early indications in student attitudes and behaviors which may be attributable to this pedagogy. Indeed, some appear to be emerging. For example, student feedback on standard course evaluations and other informal surveys has been overwhelmingly positive.
When asked “What is your overall rating of this course?” 72% of the responding students have given it the “best possible rating,” and another 15% rated it “above average.” No students have ever given ratings of “below average” or “worst possible.”

The Use of Peer Evaluations

One of the primary challenges associated with the implementation of a non-traditional
