The Comprehensive Assessment of Team Member Effectiveness: A New Peer Evaluation Instrument
Author(s) - Hal Pomeranz, Harlan W. Feinstein, Matthew Ohland
Publication year - 2006
Publication title - 2006 Annual Conference and Exposition Proceedings
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--751
Subject(s) - teamwork , peer assessment , Likert scale , rating scale , confidentiality , reliability , knowledge management , computer science , psychology , mathematics education
A multi-university research team has designed a peer evaluation instrument that is simple to use. The system gathers data from students through a web interface, preserving the confidentiality of the peer ratings, and analyzes the data to calculate suggested grade adjustments for equitably distributing a team's grade among its members. The system also provides extensive feedback to faculty about team dynamics that can be discerned from the peer evaluation data. Reliability and validity studies are under way. This paper describes the behaviorally anchored rating scales, the electronic interface, and the feedback the system provides.

The design of a new instrument

The instrument was developed from a broad base of teamwork research. The identification of potential items from the teamwork literature, the creation of a Likert-scale instrument, and the use of exploratory and confirmatory factor analysis on a large survey sample to reduce the instrument and identify its factor structure are in press elsewhere.[1] Our earlier published work described the importance of assessing teamwork in the engineering classroom and the challenges it presents, and laid out the ambitious assessment plan for developing an instrument that is easy to use yet meaningful for both faculty and students;[2] described and demonstrated the benefits of a behaviorally anchored rating system;[3] and detailed the process of creating a new behaviorally anchored rating scale to simplify administration, data analysis, and reporting, and to make feedback more understandable.[4] This paper shows how the behaviorally anchored rating scales are incorporated into an electronic interface, how the database is designed, and what feedback the system provides.

Designing a complicated database to keep administration and reporting simple

To keep the peer evaluation simple for students and faculty, the database schema became quite complicated; it is summarized in a 20-page functional requirements document. The system includes views for administrators, faculty, and students, and has modules for password protection, consent, and reporting. The instrument is called the Comprehensive Assessment of Team Member Effectiveness, or CATME for short.

Administrator interface: The administrator grants access to the system after confirming that the request has come from a faculty member. The interface shows active and pending faculty accounts and tracks each faculty member's last login date and time. It also provides access to the raw data for surveys released for research purposes.

Faculty interface: Participating faculty enter information about classes using the system, populate a class with students, form teams of students within a class, and set up surveys for team activities. The faculty interface also controls the instructions given to the students, the factors surveyed, whether consent is required, and what data will be reported. Faculty select from among five primary survey factors as published earlier (Contributing to the Team's Work, Interacting with Teammates, Keeping the Team on Track, Expecting Quality, and Having Task-Related Knowledge/Skills/Abilities), in addition to optional follow-up questions. Anything entered by a faculty member at any level can be selected for future use; for example, if teams remain constant across multiple administrations of the peer evaluation instrument, the teams may be reused within the system.
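As a concrete illustration of this configuration step, the sketch below models a survey setup in Python. Only the five factor names come from the paper; the class name, fields, and reuse example are hypothetical illustrations, not the actual CATME schema.

    from dataclasses import dataclass, field

    # The five primary survey factors named in the paper.
    CATME_FACTORS = [
        "Contributing to the Team's Work",
        "Interacting with Teammates",
        "Keeping the Team on Track",
        "Expecting Quality",
        "Having Task-Related Knowledge/Skills/Abilities",
    ]

    @dataclass
    class SurveyConfig:
        """Settings a faculty member chooses when creating a survey."""
        class_name: str
        teams: dict[str, list[str]]  # hypothetical: team name -> student emails
        factors: list[str] = field(default_factory=lambda: list(CATME_FACTORS))
        consent_required: bool = True    # consent module toggle
        report_adjustments: bool = True  # include grade adjustment factors

    # Teams entered once can be selected again for a later survey, as the
    # paper notes the system allows.
    fall_teams = {"Team A": ["ann@u.edu", "bob@u.edu", "cat@u.edu"]}
    first_survey = SurveyConfig("ENGR 101", fall_teams)
    second_survey = SurveyConfig("ENGR 101", fall_teams)  # teams reused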
The system supports uploading common text file formats to facilitate large-scale data entry, and it automates routine tasks such as notifying students when a survey has opened.

Student interface: Each student receives an email when first added to the system, prompting them to choose a password and secure their account. On logging in, a student sees a list of active surveys awaiting completion and a second list of completed surveys whose results the student can view. If a student does not complete an entire survey in one sitting, the interface saves what has already been entered; when the student returns, the survey resumes at the first set of unanswered questions. Because team assignments are stored in the system, the interface presents each student with the list of teammates to be evaluated.

Demonstrating the interface

This poster presentation will feature screenshots of each interface as well as a live demonstration. A demonstration mode is being designed, particularly to allow faculty evaluating the system to see the student interface. A sample interface screenshot is shown below, illustrating the numerical ratings resulting from the survey administration, recommended grade adjustment factors (if desired by the faculty member), and special notes that alert the faculty member to possible dynamics occurring within a team.
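To make the reported numbers concrete, here is a minimal sketch of how recommended grade adjustment factors might be computed. The paper does not give the formula; this assumes the common scheme in which each member's factor is the average rating they received divided by the team's overall average, with a hypothetical cap on upward adjustments.

    def adjustment_factors(ratings, cap=1.05):
        """ratings[rater][ratee] -> score; returns ratee -> suggested factor.

        Hypothetical scheme: each member's factor is their average rating
        received divided by the team's overall average rating.
        """
        received = {}
        for rater, row in ratings.items():
            for ratee, score in row.items():
                received.setdefault(ratee, []).append(score)
        averages = {s: sum(v) / len(v) for s, v in received.items()}
        team_avg = sum(averages.values()) / len(averages)
        # Cap upward adjustments so high ratings cannot inflate a grade
        # without bound (an illustrative policy, not CATME's rule).
        return {s: min(avg / team_avg, cap) for s, avg in averages.items()}

    factors = adjustment_factors({
        "ann": {"bob": 4.0, "cat": 5.0},
        "bob": {"ann": 3.0, "cat": 5.0},
        "cat": {"ann": 3.0, "bob": 4.0},
    })
    # ann -> 0.75, bob -> 1.0, cat -> 1.05 (capped); multiplying each
    # factor by the team grade redistributes credit among members.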
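The special notes can be sketched in the same spirit as simple rules over the rating data. The conditions and thresholds below are hypothetical examples of dynamics a system could flag, not the checks CATME actually performs.

    def team_notes(ratings, self_ratings):
        """Return alert strings; the thresholds here are illustrative only."""
        notes = []
        for student, own in self_ratings.items():
            peer = [row[student] for row in ratings.values() if student in row]
            peer_avg = sum(peer) / len(peer)
            if own - peer_avg >= 1.0:
                notes.append(f"{student}: self-rating well above peers' ratings")
            if peer_avg <= 2.0:
                notes.append(f"{student}: rated low by teammates")
        return notes

    # Example: a member who rates themselves 5 while peers average 3
    # would be flagged for the faculty member's attention.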