Finding Appropriate Data for ABET Self Study Sections B2 and B3 for Engineering Programs
Author(s) - Kathryn Abel
Publication year - 2020
Publication title - Papers on Engineering Education Repository (American Society for Engineering Education)
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--1527
Subject(s) - accreditation , credibility , engineering management , curriculum , benchmark , engineering , engineering education , certification and accreditation
ABET accreditation is an established benchmark for undergraduate engineering programs in the United States and ensures the quality of education college engineering students receive. As such, ABET is the recognized U.S. accreditor of engineering college and university programs. ABET outlines the criteria for each engineering program and the key elements of what is required in each engineering program's Self Study. However, ABET leaves the details of how each program presents the findings of its (hopefully successful) defense of its own program up to that program's interpretation. This apparent paradox results from the many elements, characteristics and factors that can contribute to successful accreditation. This paper gives a summary of the types of data that can be used in the ABET accreditation process. Specifically, the data presented was used in the 2004 ABET accreditation of the Engineering Management program at Stevens Institute of Technology. Examples of the types of data leading toward accreditation and recommendations for how colleges and universities can implement similar data assessment processes are discussed.

Introduction

ABET accreditation provides endorsement of curricula, facilitates university and external funding and, in general, adds credibility to an engineering program. However, achieving ABET accreditation can be a daunting task. This paper provides guidance to engineering programs considering accreditation or undergoing re-accreditation by examining the experiences and data processes of the accredited Engineering Management Program at Stevens Institute of Technology. The paper first provides background on the EM programs at Stevens. This is followed by a description of ABET and the accreditation process. The experiences of data discovery and assessment by the Stevens Engineering Management program are discussed next. The paper concludes with suggestions for successfully accrediting an engineering program.

Population

Stevens Institute of Technology is a private university located across the Hudson River from Manhattan in Hoboken, New Jersey. Stevens has a relatively large, well-established Engineering Management (EM) Program. Its success is evidenced by the size of its faculty (15 full-time faculty) and by external recognition (four awards from the American Society for Engineering Management, ASEM, over the past 15 years). The EM Program at Stevens was first ABET accredited in 1992, and successfully re-accredited in 1998 and 2004. There are currently 15 faculty in the EM department, of whom 10 teach in the undergraduate program. Several of the faculty have been members of ASEM for 5 to 15 years. Stevens has approximately 1,400 undergraduate students, of whom about 100 designated Engineering Management as their preferred discipline in the 2005-2006 academic year. Approximately 50% of Engineering Management students choose to participate in the five-year Cooperative Education program. Stevens graduates between 20 and 30 Engineering Management students a year with a Bachelor of Engineering degree. Approximately 75% of these EM graduates have a job prior to graduation, with an average starting salary of $47,700.

ABET: History and Role

ABET was first formed in 1932 as the Engineers' Council for Professional Development (ECPD).
ABET's original task was to fill the recognized need for a "joint program for upbuilding engineering as a profession" (ABET 2004), and by 1936 ECPD had evaluated its first engineering programs. In early 1980, ECPD was renamed the Accreditation Board for Engineering and Technology (ABET), and in 1997 ABET adopted the Engineering Criteria 2000 (EC2000). This new format of accreditation was an evaluation based on a continuous improvement process focusing on engineering program outcomes. Thus, for over 50 years and in cooperation with both the engineering academic and practitioner communities, ECPD/ABET has been the recognized accreditation body for undergraduate engineering programs in the United States. Its accreditation criteria have molded engineering education and guided engineering educators toward ABET's vision of providing "world leadership in assuring quality and in stimulating innovation in applied science, computing, engineering, and technology education" (ABET 2004).

The U.S. Council for Higher Education Accreditation recognizes ABET as the agency responsible for evaluating and certifying the quality of engineering education in the United States (ABET 2004). This recognition by the Council adds to the credentials of ABET accredited college programs. However, a significant reason for achieving accreditation is that many state licensing authorities recognize ABET accredited programs as satisfying the educational requirements for P.E. licensure. All states, except Michigan and New Hampshire, require graduation from an ABET accredited institution as a prerequisite to the FE/PE examination. Of the states requiring an ABET accredited degree, a small minority (15 states as of 2003) allow graduates with non-ABET accredited degrees to take the FE/PE exam as long as they have a given number of years of post-graduation engineering experience prior to the exam date (NCEES, 2005). Without an ABET accredited undergraduate engineering program, states may refuse to issue professional engineering licenses to individuals. Thus, many colleges choose to accredit their undergraduate programs to satisfy licensing requirements for their graduates. In addition, accreditation inherently enhances the reputation of the engineering profession overall and adds credibility to each university's individual engineering program.

Data Discovery and Assessment

Data Examples

Ever since EC2000, engineering colleges have been striving to find a process of assessment, discover data, quantify assessments, and then display the data in a meaningful and easy-to-read format for ABET accreditation. There are many ways to do this, and many varied forms have been used since EC2000 began. This paper shows some examples of the ways assessment concepts and data were used and implemented by the Engineering Management Program at Stevens Institute of Technology. Stevens Institute of Technology adopted an online assessment method in the late 1990s in order to streamline the majority of its data collection and display that data in one easily accessible location. However, how each department chose to mold this data into the requirements of ABET's Self Study differed from program to program. In the 2003 accreditation cycle, many of Stevens' programs liked what Engineering Management was doing and chose to have their Self Studies reflect much of the format and data used by the Engineering Management Program.
Although similarities between programs can be noted for 2003, it should be stated that several of Stevens' programs also had individual data displays and analyses of their own. As mentioned above, much of the data was collected through a university-wide online assessment system. This system consisted of surveys for students to assess their classes [outcomes], for alumni to assess their satisfaction with the quality of their education, and for employers to assess their satisfaction with the quality of their employees (the Program's alumni) [objectives]. However, this assessment produced only indirect data. Thus, direct measures were also necessary, which required more of a manual collection process. The sections below outline the types of data collected and the methods used for the critical Self Study Sections B2 and B3.

ABET Self Study Section B2 - Objectives

The first part of ABET Self Study Section B2 asks for the educational objectives to be consistent with the mission as well as with ABET criteria. Thus, this section should list the Engineering Program's objectives and show how they are consistent with the mission of the program, the mission of the college or university, and the ABET accreditation criteria. Next, the Self Study must demonstrate that there is a process in place to determine and periodically evaluate these objectives based on the needs of the program's constituencies. Thus, this section should identify the constituents of the program. At a minimum, the constituents should be the students, the employers of the alumni, the alumni, and the faculty and staff of the program. It is then the job of the Self Study to demonstrate how these constituents contribute to and benefit from the Program.

This demonstration can be done in several ways. For example, Engineering Programs should have Visiting Committees, which critique as well as add value to Engineering Programs. By showing the varied and knowledgeable backgrounds of the members of the Visiting Committee, and how the committee is made up of members from all of the constituent groups above, a program can demonstrate feedback loops to the program or other assessment data. Another method to assess the Objectives of a program is through electronic or paper Alumni surveys. Similarly, Objectives can be assessed through paper or online Employer surveys. Although these methods of assessment are indirect for both alumni and employers, programs can obtain great volumes of quantifiable data on each of their objectives through this simple process. However, direct measures should also be taken. Some direct measures that can be used to demonstrate both alumni and employer satisfaction with the education of the program's graduates are average starting salary data and the comparison of this data to national averages. Similarly, job placement data for the individual program's graduates, and the comparison of this data to national averages, shows the quality of the program's graduates to employers as well as the satisfaction of the alumni.
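To make this kind of objective-level assessment concrete, the minimal Python sketch below shows one way a program might tabulate indirect survey scores per objective and compare a direct measure (average starting salary) against a national benchmark. The objective names, survey scores, satisfaction threshold, and national salary figure are all illustrative assumptions, not Stevens' actual data or assessment system; the only figure taken from this paper is the $47,700 average starting salary.

```python
"""Hypothetical sketch of objective-level assessment summaries.

All survey data, thresholds, and benchmarks are illustrative
placeholders, not actual Engineering Management Program data.
"""
from statistics import mean

# Indirect measure: Likert-scale (1-5) alumni/employer survey
# responses, keyed by program objective (hypothetical data).
survey_responses = {
    "Objective 1: apply EM principles in practice": [4, 5, 4, 3, 5],
    "Objective 2: communicate effectively": [5, 4, 4, 4],
    "Objective 3: pursue continued professional growth": [3, 4, 5, 4, 4],
}

TARGET = 3.5  # example satisfaction threshold a program might set

for objective, scores in survey_responses.items():
    avg = mean(scores)
    status = "met" if avg >= TARGET else "needs review"
    print(f"{objective}: mean {avg:.2f} ({status})")

# Direct measure: compare the program's average starting salary with
# a national benchmark (the benchmark figure here is a placeholder).
program_avg_salary = 47_700    # average starting salary from the paper
national_avg_salary = 45_000   # hypothetical national average
pct_of_national = 100 * program_avg_salary / national_avg_salary
print(f"Starting salary is {pct_of_national:.1f}% of the national average")
```

In practice, a program would pull these numbers from its online assessment system and placement-office records rather than hard-coding them; the point is simply that each objective receives a quantified indirect score and each direct measure is paired with a benchmark comparison for the Self Study.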