Open Access
Assessment of Student Outcomes Using Industry-Academia Assessment Teams
Author(s) -
Kevin Sutterer,
Michael G. Robinson,
James G. Hanson,
Michael Reeves,
Andrew Twarek
Publication year - 2020
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--20990
Subject(s) - rubric, capstone design, assessment, pedagogy, engineering education
Rose-Hulman Institute of Technology’s (RHIT) Department of Civil Engineering uses assessment teams composed of industry professionals and faculty members working together to assess student outcomes for continuous improvement. Two approaches are used for assessment by industry professionals. In the first approach, teams of four industry experts assess selected student outcomes during the department’s annual Board of Advisors meeting. This assessment is conducted specifically on senior capstone design reports from the prior academic year. Faculty members are available to answer questions about the students’ work and to receive advice, but they do not assess. The department rates multiple student outcomes using this approach. The second approach is applied to all other student work submitted for assessment of department-specific student outcomes. In a single year, this requires rating approximately 30 different sets of student submissions. The assessment is facilitated through RHIT’s online electronic portfolio system, which allows remote access to and rating of student work. Each industry professional is paired with one faculty member to rate student submissions. Each team meets regularly by phone and email during each rating session to discuss the outcome criterion, the student submissions, the rating rubrics, and interrater reliability. Upon completing each rating session, the team provides the department with an overview that includes advice for improving student learning and, where appropriate, for revising the criterion or rubric. Permitting industry professionals to work directly with student submissions has accelerated the department’s continuous improvement process.
External industry professionals tend to hold student work to a higher standard of expectation and to provide insights not readily apparent to faculty members who are immersed daily in facilitating the learning process. This has reduced passing rates for some student work, in turn fostering greater gains in learning for those outcomes. Team review of student work also promotes closer cooperation and more frequent, deliberate communication between faculty members and industry colleagues, ultimately enhancing student learning through the exchange of ideas between the two groups. Findings are reported as: (1) a comparison of passing-rate statistics before and after the inclusion of industry raters, (2) reflections on the process by both industry and faculty raters, and (3) reflections on the process by the administrators of the rating process. We recommend that other institutions consider using industry raters for student outcomes because of the enhanced continuous improvement and increased collaboration between industry and academia. Programs are cautioned that including industry raters adds another dimension to the planning and increases the administrative burden, and that passing percentages for student work will likely decrease when industry raters are included.
