Student Learning Outcomes: Effectively Satisfying Multiple Accreditation Requirements
Author(s) -
Gerard P. Len,
John Ochs,
Derick G. Brown
Publication year - 2020
Publication title -
Papers on Engineering Education Repository (American Society for Engineering Education)
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--23049
Subject(s) - accreditation, computer science, medical education, medicine
In June 2013, Lehigh University’s Periodic Review Report (PRR) was submitted to the regional accreditation body (the Middle States Commission on Higher Education, MSCHE), and reaccreditation self-study reports for nine undergraduate engineering programs were submitted to ABET, Inc. Assessment of Student Learning Outcomes (SLOs) is a primary requirement of both agencies, each with a significantly different scope, focus, reporting system, terminology, and criteria or standards. Effectively satisfying both demands can be challenging for many reasons, including the need for leadership and coordination at many levels, avoidance of redundant effort, faculty buy-in, and availability of resources. Below we discuss how we addressed, and continue to address, this multi-faceted challenge while earning an overall “superlative” review by MSCHE.

In 2010, MSCHE accepted our progress report, which described “progress made toward the assessment of student learning outcomes in the College of Arts & Sciences [using] qualitative & quantitative, direct and indirect means...[and] measuring progress toward those goals.” These new assessment practices in Arts and Sciences complemented strong assessment practices in the undergraduate engineering programs, the programs in the graduate-only College of Education, and the undergraduate and graduate programs in the Business and Economics College, all accredited by other agencies. Except for a few scattered programs, the only area that lacked overall assessment practices was the P.C. Rossin College of Engineering and Applied Science’s graduate programs. This paper reports on how we are implementing graduate engineering program assessment practices to complement three existing and different ones in the other colleges while supporting the new overarching university-wide system.

MSCHE indicated our 2013 report needed “a comprehensive description of the evolution of student learning outcomes assessment practices across the university since the last visit [2008], with special attention to the evolution of such practices within the College of Arts and Sciences.” The university-level challenge was addressed first by creating a process whereby a standing graduate faculty committee and an appointed Enhancing Graduate Education (EGE) committee worked together to create a sustainable process for periodic program review, including a framework for interpreting the five new university-level graduate student learning competencies: Knowledge, Application, Context, Communication, and Leadership. Also required was development of a methodology for assessment and continuous improvement. This approach earned a very positive 2013 MSCHE evaluation: “university assessment practices of graduate Student Learning Outcomes [were] particularly thoughtful ...[including] the plans, examples of implementation [and] the support structure.” The recently developed framework for graduate SLO assessment allows graduate engineering to closely complement and support the new university system. The Technical Entrepreneurship program provides an example of leadership and best-practice sharing to demonstrate useful and sustainable SLO assessment practices. Finally, an Assessment of Student Learning Assessment Processes table is used to track the evolution of college assessment practices.
1. The challenge: satisfying multiple student learning assessment requirements

The challenge we faced was to develop, by 2013, overarching, integrated, comprehensive Student Learning Outcomes (SLO) assessment practices at the university level for both undergraduate and graduate programs that complemented, but did not duplicate, the existing assessment programs in each college’s academic programs. Part of the challenge was the development of assessment practices for all university programs that did not have assessment plans, programs, or results in place. These new efforts needed to be acceptable to faculty, complement the existing assessment programs, and support the new university-level assessment program. In this paper, the focus of program-level SLO assessment is the graduate engineering programs, which did not have any assessment plans or practices in place in 2010, when the discussion of the nature of the university-level assessment began. Meeting this challenge would also satisfy the expectations of the Middle States Commission on Higher Education (MSCHE), the regional accrediting agency for Lehigh University, as stated in their 2008 recommendation: “at the time of the [2013 Periodic Review Report], the university provide a comprehensive description of the evolution of student learning outcomes assessment practices across the university since the last visit, with special attention to the evolution of practices in the College of Arts and Sciences.” Satisfying this objective required faculty and administration buy-in to add a new layer of assessment at the university level, and another layer for graduate programs without an assessment program.

2. Expectations from the regional accreditation commission on assessment practices

The main driver to establish university-level student learning outcomes assessment was MSCHE’s 2008 recommendation quoted above, to be addressed in our 2013 Periodic Review Report. Although other accreditation agencies set expectations for other colleges, the focus of this paper is MSCHE’s expectations for all graduate programs, with an emphasis on engineering programs. Because not all faculty and staff were familiar with MSCHE expectations, the first author made a presentation on that topic to several groups, including the Graduate Research Committee (GRC) in 2010 and 2011, as summarized below. The expectations on student learning assessment for undergraduate and graduate programs are covered in MSCHE’s Standard 14, at multiple levels (university, college, program, and course). Their recommendations typically ask for progress or evolution (rather than completeness), and often identify specific areas that require special attention and follow-up reporting. MSCHE evaluators do not necessarily ask that all gaps be closed, and institutions typically have two years to submit progress or monitoring reports on the gaps that are singled out. Their philosophy, then, is to expect institutions to continually strive to meet their 14 standards; MSCHE could therefore “recommend” that a university address a specific gap that existed in past evaluations but was not specifically mentioned, in order to retain accreditation. Conveying this information to constituents was an important step in obtaining buy-in, and explained why gaps could be singled out in the future even though they were not mentioned in the past.
Although MSCHE cannot dictate what a university must do, if its recommendations are not addressed, MSCHE accreditation can be put in jeopardy, with severe consequences: accreditation is a requirement for eligibility for Title IV Federal Student Financial Aid funds, and it would be disastrous for an institution to lose it. MSCHE “recommendations” therefore must be addressed to avoid the potential (eventual) loss of MSCHE accreditation. Also, when institutions do not evolve their practices adequately, they risk receiving a “blanket recommendation” that includes requirements not specifically part of the standards. For example, although there is no specific MSCHE requirement for external program review, American University received a recommendation that included implementation of a required external review for all programs, graduate and undergraduate. In summary, it is important to convey key information to administrators, faculty, and staff so they understand the expectations and the ramifications of not maintaining regional accreditation.

3. Overview of established college-level student learning assessment practices

Programs with established SLO assessment practices are summarized in this section. Section 4 describes the university-level undergraduate practices, Section 5 describes the university-level graduate practices, and Section 6 describes how we are closing the remaining gap in college-level practices in the P.C. Rossin College of Engineering and Applied Science. Accredited programs with established SLO assessment considered to be “mature” include:
• accredited programs in the Business & Economics College (undergraduate and graduate)
• College of Education (graduate)
• P.C. Rossin College of Engineering and Applied Science (undergraduate)

SLO assessment is part of the accreditation process of numerous accreditation agencies (MSCHE, AACSB, ABET, PA State Department of Education). Each agency has different criteria and standards, each with a different scope, focus, reporting system, and timing. Effectively satisfying all demands can be challenging for many reasons, including leadership and coordination at many levels, avoidance of duplicative effort, faculty buy-in, and availability of resources.

A Student Learning Outcomes (SLO) assessment gap was identified by MSCHE in 2008, resulting in a recommendation that we report on progress toward assessment of Student Learning Outcomes in graduate and undergraduate programs in the College of Arts and Sciences (CAS) by April 2010. In response, CAS developed an in-house software application, used by each department each semester to document selected direct evidence of student learning, and now has established college-level assessment practices. Thus, while not yet “mature,” the College of Arts and Sciences now has established assessment practices.

The evolution of assessment practices in each college can be summarized by a table such as Table 1: Rubric for Evaluating Student Learning Assessment Processes by Colleges, to illustrate their progress. A similar practice for graduate programs is being developed. The third round of self-study engineering reports that describe the SLO assessment was submitted to ABET in 2013; previous ones were in
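The paper does not describe the internals of the CAS in-house documentation application, but a minimal sketch of the kind of per-department, per-semester record such a tool might capture can make the practice concrete. The classes, field names, and example values below are illustrative assumptions, not a description of the actual Lehigh system.

from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class EvidenceItem:
    """One piece of direct evidence of student learning (e.g., a graded rubric
    or exam question) tied to a stated outcome. All fields are hypothetical."""
    outcome: str              # the SLO being assessed, in the program's own wording
    instrument: str           # e.g., "capstone design rubric", "final exam question 3"
    criterion: str            # target for acceptable performance
    result_summary: str       # brief summary of how students performed
    planned_improvement: str  # action the department will take (closing the loop)

@dataclass
class SemesterReport:
    """What a department might submit each semester to document its assessment."""
    department: str
    program_level: str        # "undergraduate" or "graduate"
    semester: str             # e.g., "Fall 2012"
    submitted_on: date = field(default_factory=date.today)
    evidence: List[EvidenceItem] = field(default_factory=list)

# Illustrative usage only; department and values are invented for the example.
report = SemesterReport(department="Example Department",
                        program_level="undergraduate", semester="Fall 2012")
report.evidence.append(EvidenceItem(
    outcome="Students can interpret laboratory data quantitatively",
    instrument="Lab practicum rubric",
    criterion="80% of students score 3 of 4 or higher",
    result_summary="72% met the criterion",
    planned_improvement="Add a guided data-analysis workshop before the practicum"))

A structured record of this kind would let a college aggregate evidence across departments and semesters, which is the sort of documentation the rubric in Table 1 is meant to evaluate.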