Assessment – Evolutionary Not Revolutionary
Author(s) - Raymond Thompson
Publication year - 2001
Language(s) - English
Resource type - Conference proceedings (ASEE Annual Conference & Exposition)
DOI - 10.18260/1-2--8925
Subject(s) - accreditation, assessment, aviation, aeronautics, engineering education
The Aviation Technology (AT) department at Purdue University began the assessment process in 1996 in response to an upcoming visit by the Council for Aviation Accreditation (CAA), the accrediting body for aviation programs in North America. The information gathered satisfied the CAA but presented only the requested information. In anticipation of the North Central Association (NCA) regional accreditation of Purdue University as a whole in 1999, the university began an assessment initiative that would include all academic programs. The university established a series of student learning outcomes for each school. AT is part of the School of Technology (SOT). The SOT established what learning outcomes its graduates should have, and the SOT Assessment Committee created an eight-step assessment framework that all departments would use as a guide for their individual assessment plans. The assessment process in Aviation Technology began by examining where assessment information was already being gathered. Over eight sources were identified, but there was no central organization or structure in place to utilize the results. The first iteration of the AT Assessment Program gathered these sources under a central umbrella and began to address faculty issues and concerns. After the first year, the assessment process was reviewed; while considerable data had been gathered, it became apparent that the feedback mechanisms were minimal and ineffective. The second iteration produced a solid feedback system, this time with faculty input. The assessment process is now in its third year. Faculty are becoming more supportive of these activities, and the mechanisms are becoming more streamlined and efficient. The plan is simple and uncomplicated, designed to satisfy the requirements of the CAA, the NCA, and the Federal Aviation Administration (FAA). The AT assessment plan is a common-sense approach designed to evolve with time and experience.
Page 623.1 – Proceedings of the 2001 American Society for Engineering Education Annual Conference & Exposition. Copyright 2001, American Society for Engineering Education.

I. The Drive For Assessment

Every school or program engages in assessment. Often this is informal and unstructured. For many entities, a structured assessment program has developed as the result of an external force, such as accreditation, being applied. This was the case for the Aviation Technology (AT) department at Purdue University. The accrediting body for the university is the North Central Association. In preparation for the 1999 visit, a comprehensive effort was launched to develop documented and structured assessment plans for the university, its schools, and each department. In 1995 the university created a set of learning outcomes that every Purdue student would achieve as part of successful completion of a degree program. The School of Technology (SOT) then created a series of learning outcomes that a student in one of the eight SOT departments would need to achieve. In addition, the SOT formed an Assessment Committee to determine how the SOT would meet the assessment challenge. The SOT Assessment Committee formulated an eight-step assessment model [1] that each department would follow. Specific methods of assessment would be left to the discretion of each department, but each plan would need to meet the SOT model guidelines. The eight items required in each assessment plan were:

1. A brief, one- or two-page description of the department and its programs.
2. The departmental mission statement.
3. Learning outcomes for the degree and program options offered by the department. These should reflect the learning outcomes stated by the University and the School of Technology.
4. The current curricula and plans of study for the degrees and programs offered by the individual department.
5. Documentation of the methods and techniques used to assess degree learning outcomes.
These summary documents should indicate the methods, direct or indirect, used in assessment and how the results of the assessment were used or will be used to help facilitate continuous quality improvement (CQI). Assessment activities that are not directly linked to CQI should not be included.
6. Course descriptions and learning outcomes for all courses that make up the current curricula or programs.
7. Documentation of the methods and techniques used to assess course learning outcomes. These summary documents should indicate the methods, direct or indirect, used in assessment and how the results of the assessment were used or will be used to help facilitate continuous quality improvement. Assessment activities that are not directly linked to CQI should not be included. It is the responsibility of the individual course supervisors to ensure that course descriptions and learning outcomes are current and reflect the course as it is being taught. Assessment techniques should be appropriate to the course content and delivery mode. The summary document should be in narrative form and reflect both the results of assessment and how the assessment is or will be used to improve the quality of the course.
8. A summary of the overall efforts and results of the department's use of assessment to enable an ongoing and consistent continuous quality improvement program.

A more general model that provides an excellent framework for assessment may be found in "Stepping Ahead: An Assessment Plan Development Guide". A delegate from each department to the SOT Assessment Committee headed the individual departmental efforts. The developed plans were submitted to the SOT in September 1998. In January 1999, a review team consisting of a representative from the Dean's Office and two members of the SOT Assessment Committee visited each department. The review team examined each departmental assessment plan, its implementation, and current assessment progress.
Based on the results, departments were required to modify their plans and submit the revised versions for the 1999-2000 academic year. Although the NCA visit had taken place, a second round of visits occurred in January 2000, performing similar review activities. By that point, each department had developed a well-conceived plan that was working reasonably well. As a result, no program reviews were scheduled for the 2000-2001 academic year. However, each department is expected to continue and improve its assessment program.

II. Analyzing Current Practices

The Aviation Technology department was not unique in the way assessment was performed prior to the structure imposed by the university and school. Like many departments, AT was accredited by a body particular to its needs, in this case the Council for Aviation Accreditation (CAA). Since the AT department had been visited by the CAA in 1996, a considerable amount of background information had already been compiled. However, an assessment component was not required at the time of the CAA visit. To begin developing an assessment program that met the SOT requirements, the AT department first reviewed what data was being gathered. Surprisingly, it turned out that data was being gathered from a large number of sources (see Table 1). Course evaluations, discussions with students, and department faculty reviews are a few examples of the information gathered. The problem was that there was no central structure to control what information was gathered and how it was then used.
Table 1: Assessment Activities

Areas Being Assessed Prior to Formal Program:
Course Evaluations – Faculty input; Course Evaluations – Student input; Course Improvement Plan; Curriculum Chair Review; Department Head Review; Facility and Equipment Review; Faculty Goal Setting; Industrial Advisory Committee Input; Senior Exit Interviews

Areas Added After Formal Program:
Course Information Document; Employer Surveys*; Alumni Surveys*; Section Goal Setting; Student Services; Freshman Survey

*Being developed for 2001-2002

The first task in developing assessment in AT was to review the data sources already being collected and decide whether the information provided was useful. Next, areas that would be important to assess but where no information was being gathered were identified. These additional areas are noted in Table 1.

III. Developing An Assessment Plan

Using the SOT assessment model as a guide, assessment in AT takes place at three distinct levels. The first is the departmental level, which includes all department-wide areas such as faculty reviews, student services, and industrial advisory committee input. The second level is the program level; the AT department has three distinct majors, each with unique programs. Finally, the third level of assessment is the individual course level. Faculty and administration at each level developed assessment pertinent to that particular level. The major accomplishment at this point was identifying what was to be assessed and establishing a central structure through which this would occur. With the exception of Student Services, all of the areas to be assessed had been performing assessment in some fashion. The plan was submitted to the SOT in September 1998.
IV. Implementation and Revision

Implementation of a centralized assessment program has two major challenges. First and foremost is faculty participation. In many situations, faculty members view assessment as intrusive and threatening. Outstanding faculty perform assessment as a matter of course and sometimes resent the imposition of an external structure. On the other hand, faculty who would benefit from what assessment can tell them fear the process and are conc